WorldWideScience

Sample records for graphical-data-processing research study

  1. Graphical Language for Data Processing

    Science.gov (United States)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of such complex data without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single processing step. In addition to the user-interface components, the system includes a set of .NET classes that represent the graph internally and provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads this internal representation and executes the graph. Execution follows an interpreted model: each node is traversed and executed directly from the internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, making the system readily extensible.
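
    As an aside, a minimal Python sketch of the interpreted execution model described above (the actual system is built from .NET classes; the node names and toy algorithms here are invented for illustration):

      # Minimal dataflow-graph interpreter: nodes wrap algorithms, virtual
      # wires are the `inputs` references. Illustrative sketch only.
      class Node:
          def __init__(self, name, func, inputs=()):
              self.name = name        # label of the processing step
              self.func = func        # algorithm applied by this node
              self.inputs = inputs    # upstream nodes wired into this one

          def execute(self, cache):
              # Interpreted execution: traverse upstream nodes, then run self.
              if self.name not in cache:
                  args = [n.execute(cache) for n in self.inputs]
                  cache[self.name] = self.func(*args)
              return cache[self.name]

      # Toy graph: load -> filter -> grid (stand-ins for lidar algorithms).
      load = Node("load", lambda: [4.0, 5.0, 100.0, 5.5])
      filt = Node("filter", lambda pts: [p for p in pts if p < 50], [load])
      grid = Node("grid", lambda pts: sorted(pts), [filt])
      print(grid.execute({}))   # -> [4.0, 5.0, 5.5]

      # A nested graph is simply a Node whose func calls another graph's execute.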

  2. Comparison between research data processing capabilities of AMD and NVIDIA architecture-based graphic processors

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Us, S.A.; Shestakov, M.V.

    2015-01-01

    A comparative analysis has been made of the hardware and software capabilities of the two most widely used modern graphics processor architectures (AMD and NVIDIA). Special features and differences of the GPU architectures are exemplified by fragments of GPGPU programs, and the time required for program development has been estimated. Advice is given on the optimum choice of GPU type for speeding up the processing of scientific research results, and recommendations are formulated for the use of software tools that reduce the GPGPU application programming time for the given types of graphic processors.

  3. Data Sorting Using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    M. J. Mišić

    2012-06-01

    Graphics processing units (GPUs) have been increasingly used for general-purpose computation in recent years. GPU-accelerated applications are found in both scientific and commercial domains. Sorting is one of the most important operations in many applications, so its efficient implementation is essential for overall application performance. This paper represents an effort to analyze and evaluate implementations of representative sorting algorithms on graphics processing units. Three sorting algorithms (Quicksort, Merge sort, and Radix sort) were evaluated on the Compute Unified Device Architecture (CUDA) platform that is used to execute applications on NVIDIA graphics processing units. Algorithms were tested and evaluated using an automated test environment with input datasets of different characteristics. Finally, the results of this analysis are briefly discussed.
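
    A minimal sketch of the CPU-versus-GPU comparison the paper performs, using the CuPy library as a stand-in for the paper's hand-written CUDA sorting kernels (assumes a CUDA-capable GPU with CuPy installed):

      import time
      import numpy as np
      import cupy as cp   # assumption: CuPy is available

      data_cpu = np.random.random(10_000_000).astype(np.float32)
      data_gpu = cp.asarray(data_cpu)        # host -> device transfer

      t0 = time.perf_counter()
      np.sort(data_cpu)                      # CPU reference sort
      t1 = time.perf_counter()
      cp.sort(data_gpu)                      # GPU sort (typically radix-based
      cp.cuda.Device().synchronize()         # for numeric keys); wait for kernel
      t2 = time.perf_counter()
      print(f"CPU: {t1 - t0:.3f} s   GPU: {t2 - t1:.3f} s")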

  4. SraTailor: graphical user interface software for processing and visualizing ChIP-seq data.

    Science.gov (United States)

    Oki, Shinya; Maehara, Kazumitsu; Ohkawa, Yasuyuki; Meno, Chikara

    2014-12-01

    Raw data from ChIP-seq (chromatin immunoprecipitation combined with massively parallel DNA sequencing) experiments are deposited in public databases as SRAs (Sequence Read Archives) that are publicly available to all researchers. However, to graphically visualize ChIP-seq data of interest, the corresponding SRAs must be downloaded and converted into BigWig format, a process that involves complicated command-line processing. This task requires users to possess skill with scripting languages and sequence data processing, a requirement that prevents a wide range of biologists from exploiting SRAs. To address these challenges, we developed SraTailor, a GUI (Graphical User Interface) software package that automatically converts an SRA into a BigWig-formatted file. Simplicity of use is one of the most notable features of SraTailor: entering an accession number of an SRA and clicking the mouse are the only steps required to obtain BigWig-formatted files and to graphically visualize the extents of reads at given loci. SraTailor is also able to make peak calls, generate files of other formats, process users' own data, and accept various command-line-like options. Therefore, this software makes ChIP-seq data fully exploitable by a wide range of biologists. SraTailor is freely available at http://www.devbio.med.kyushu-u.ac.jp/sra_tailor/, and runs on both Mac and Windows machines. © 2014 The Authors Genes to Cells © 2014 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.

  5. Hierarchical data structures for graphics program languages

    International Nuclear Information System (INIS)

    Gonauser, M.; Schinner, P.; Weiss, J.

    1978-01-01

    Graphic data processing with a computer makes exacting demands on the interactive capability of the program language and the management of the graphic data. A description of the structure of a graphics program language which has been shown by initial practical experiments to possess a particularly favorable interactive capability is followed by the evaluation of various data structures (list, tree, ring) with respect to their interactive capability in processing graphics. A practical structure is proposed. (orig.) [de]

  6. Data structures, computer graphics, and pattern recognition

    CERN Document Server

    Klinger, A; Kunii, T L

    1977-01-01

    Data Structures, Computer Graphics, and Pattern Recognition focuses on the computer graphics and pattern recognition applications of data structures methodology. This book presents design-related principles and research aspects of the computer graphics, system design, data management, and pattern recognition tasks. The topics include the data structure design, concise structuring of geometric data for computer aided design, and data structures for pattern recognition algorithms. The survey of data structures for computer graphics systems, application of relational data structures in computer gr

  7. A Relational Reasoning Approach to Text-Graphic Processing

    Science.gov (United States)

    Danielson, Robert W.; Sinatra, Gale M.

    2017-01-01

    We propose that research on text-graphic processing could be strengthened by the inclusion of relational reasoning perspectives. We briefly outline four aspects of relational reasoning: "analogies", "anomalies", "antinomies", and "antitheses". Next, we illustrate how text-graphic researchers have been…

  8. Q-Technique and Graphics Research.

    Science.gov (United States)

    Kahle, Roger R.

    Because Q-technique is as appropriate for use with visual and design items as for use with words, it is not stymied by the topics one is likely to encounter in graphics research. In particular Q-technique is suitable for studying the so-called "congeniality" of typography, for various copytesting usages, and for multivariate graphics research. The…

  9. Graphics on demand: the automatic data visualization on the WEB

    Directory of Open Access Journals (Sweden)

    Ramzi Guetari

    2017-06-01

    Data visualization is an effective tool for communicating the results of opinion surveys, epidemiological studies, statistics on consumer habits, etc. The graphical representation of data usually assists human information processing by reducing demands on attention, working memory, and long-term memory. It allows, among other things, faster reading of the information (by acting on forms, directions, and colors), independence from language (or culture), and better capture of the audience's attention. Data that could be graphically represented may be structured or unstructured. Unstructured data, whose volume grows exponentially, often hide important and even vital information for society and companies. It therefore takes a lot of work to extract valuable information from unstructured data. If it is easier to understand a message through structured data, such as a table, than through a long narrative text, it is even easier to convey a message through a graphic than through a table. In our opinion, it is often very useful to synthesize unstructured data in the form of graphical representations. In this paper, we present an approach for processing unstructured data containing statistics in order to represent them graphically. This approach transforms the unstructured data into structured data that conveys the same countable information; the graphical representation of such structured data is then straightforward. The approach deals with both quantitative and qualitative data, and is based on Natural Language Processing techniques and Text Mining. An application that implements this process is also presented in this paper.
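
    A minimal sketch of the core transformation described here, turning countable facts in free text into a structured table; a single regular expression stands in for the paper's full NLP and text-mining pipeline, and the survey sentence is invented:

      import re

      text = ("In the survey, 45% of respondents preferred trains, "
              "30% preferred cars and 25% preferred bicycles.")

      # Capture (value, category) pairs such as "45% ... trains".
      pairs = re.findall(r"(\d+)%\s+(?:of respondents\s+)?preferred\s+(\w+)", text)
      table = {category: int(percent) for percent, category in pairs}
      print(table)   # {'trains': 45, 'cars': 30, 'bicycles': 25}
      # From here, any plotting library can render the dict as a bar chart.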

  10. Engineering graphics data entry for space station data base

    Science.gov (United States)

    Lacovara, R. C.

    1986-01-01

    The entry of graphical engineering data into the Space Station Data Base was examined. Discussed were: representation of graphics objects; representation of connectivity data; graphics capture hardware; graphics display hardware; site-wide distribution of graphics; and consolidation of tools and hardware. A fundamental assumption was that existing equipment such as IBM-based graphics capture software and VAX networked facilities would be exploited. Defensible conclusions reached after study and simulations of use of these systems at the engineering level are: (1) existing IBM-based graphics capture software is an adequate and economical means of entry of schematic and block diagram data for present and anticipated electronic systems for Space Station; (2) connectivity data from the aforementioned system may be incorporated into the envisioned Space Station Data Base with modest effort; (3) graphics and connectivity data captured on the IBM-based system may be exported to the VAX network in a simple and direct fashion; (4) graphics data may be displayed site-wide on VT-125 terminals and lookalikes; (5) graphics hard-copy may be produced site-wide on various dot-matrix printers; and (6) the system may provide integrated engineering services at both the engineering and engineering management level.

  11. Identification of Learning Processes by Means of Computer Graphics.

    Science.gov (United States)

    Sorensen, Birgitte Holm

    1993-01-01

    Describes a development project for the use of computer graphics and video in connection with an inservice training course for primary education teachers in Denmark. Topics addressed include research approaches to computers; computer graphics in learning processes; activities relating to computer graphics; the role of the teacher; and student…

  12. High-Throughput Tabular Data Processor - Platform independent graphical tool for processing large data sets.

    Science.gov (United States)

    Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz

    2018-01-01

    High-throughput technologies generate considerable amounts of data which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
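
    For readers unfamiliar with this class of task, a small pandas sketch of the kind of filter-and-merge operation HTDP automates through its GUI (the file names and column names are hypothetical, and this is not HTDP's own code):

      import pandas as pd

      variants = pd.read_csv("variants.vcf.tsv", sep="\t")   # hypothetical file
      coverage = pd.read_csv("coverage.bed.tsv", sep="\t")   # hypothetical file

      # Filter: keep variants passing an external quality criterion.
      passing = variants[variants["QUAL"] >= 30]

      # Merge: attach read-depth information on shared chromosome/position keys.
      merged = passing.merge(coverage, on=["CHROM", "POS"], how="inner")
      merged.to_csv("filtered_merged.tsv", sep="\t", index=False)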

  13. FamSeq: a variant calling program for family-based sequencing data using graphics processing units.

    Directory of Open Access Journals (Sweden)

    Gang Peng

    2014-10-01

    Various algorithms have been developed for variant calling using next-generation sequencing data, and various methods have been applied to reduce the associated false positive and false negative rates. Few variant calling programs, however, utilize the pedigree information when family-based sequencing data are available. Here, we present a program, FamSeq, which reduces both false positive and false negative rates by incorporating the pedigree information from the Mendelian genetic model into variant calling. To accommodate variations in data complexity, FamSeq consists of four distinct implementations of the Mendelian genetic model: the Bayesian network algorithm, a graphics processing unit version of the Bayesian network algorithm, the Elston-Stewart algorithm and the Markov chain Monte Carlo algorithm. To make the software efficient and applicable to large families, we parallelized the Bayesian network algorithm, which copes with pedigrees with inbreeding loops without losing calculation precision, on an NVIDIA graphics processing unit. To compare the four methods, we applied FamSeq to pedigree sequencing data with family sizes that varied from 7 to 12. When there is no inbreeding loop in the pedigree, the Elston-Stewart algorithm gives analytical results in a short time. If there are inbreeding loops in the pedigree, we recommend the Bayesian network method, which provides exact answers. To improve the computing speed of the Bayesian network method, we parallelized the computation on a graphics processing unit. This allowed the Bayesian network method to process the whole genome sequencing data of a family of 12 individuals within two days, a 10-fold time reduction compared to the same computation on a central processing unit.
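
    A minimal sketch of the pedigree information such a Mendelian model exploits: the conditional probability of a child's genotype given the parents', which FamSeq-style callers combine with per-individual read likelihoods (illustrative only, not FamSeq code):

      def transmission(g1, g2):
          """P(child genotype | parents), genotype = count of alternate alleles."""
          # Each parent transmits the alternate allele with probability g/2.
          p1, p2 = g1 / 2.0, g2 / 2.0
          return {
              0: (1 - p1) * (1 - p2),
              1: p1 * (1 - p2) + (1 - p1) * p2,
              2: p1 * p2,
          }

      # A het x het cross recovers the familiar 1:2:1 ratio:
      print(transmission(1, 1))   # {0: 0.25, 1: 0.5, 2: 0.25}
      # A Bayesian-network caller uses such tables as priors linking family
      # members, then re-scores each individual's genotype likelihoods.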

  14. ElectroEncephaloGraphics: Making waves in computer graphics research.

    Science.gov (United States)

    Mustafa, Maryam; Magnor, Marcus

    2014-01-01

    Electroencephalography (EEG) is a novel modality for investigating perceptual graphics problems. Until recently, EEG has predominantly been used for clinical diagnosis, in psychology, and by the brain-computer-interface community. Researchers are extending it to help understand the perception of visual output from graphics applications and to create approaches based on direct neural feedback. Researchers have applied EEG to graphics to determine perceived image and video quality by detecting typical rendering artifacts, to evaluate visualization effectiveness by calculating the cognitive load, and to automatically optimize rendering parameters for images and videos on the basis of implicit neural feedback.

  15. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    Science.gov (United States)

    Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C

    2014-12-01

    To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was extended with workflow symbols and custom symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, and privacy enhancing techniques (PET), and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases and data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy-compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  16. Web-based (HTML5) interactive graphics for fusion research and collaboration

    International Nuclear Information System (INIS)

    Kim, E.N.; Schissel, D.P.; Abla, G.; Flanagan, S.; Lee, X.

    2012-01-01

    Highlights: ► Interactive data visualization is supported via the Web without a browser plugin and provides users easy, real-time access to data of different types from various locations. ► Crosshair, zoom, pan as well as toggling dimensionality and a slice bar for multi-dimensional data are available. ► Data sources with a PHP API can be used: MDSplus and SQL have been tested. ► Modular in design, this has been deployed to support both the experimental and the simulation research arenas. - Abstract: With the continuing development of web technologies, it is becoming feasible for websites to operate much like a scientific desktop application. This has opened up more possibilities for utilizing the web browser for interactive scientific research and providing new means of on-line communication and collaboration. This paper describes the research and deployment of these enhanced web graphics capabilities for fusion research tools, which has led to a general toolkit that can be deployed as required. It allows users to dynamically create, interact with, and share with others the large sets of data generated by fusion experiments and simulations. Hypertext Preprocessor (PHP), a general-purpose scripting language for the Web, is used to process a series of inputs and determine the data source types and locations from which to fetch and organize the data. Protovis, a JavaScript- and SVG-based web graphics package, then quickly draws the interactive graphs and makes them available to a worldwide audience. This toolkit has been deployed to both the simulation and experimental arenas. The deployed applications are presented as well as the architecture and technologies used in producing the general graphics toolkit.

  17. Diagrams and Relational Maps: The Use of Graphic Elicitation Techniques with Interviewing for Data Collection, Analysis, and Display

    Directory of Open Access Journals (Sweden)

    Andrea J. Copeland PhD

    2012-12-01

    Graphic elicitation techniques, which ask research participants to provide visual data representing personal understandings of concepts, experiences, beliefs, or behaviors, can be especially useful in helping participants to express complex or abstract ideas or opinions. The benefits and drawbacks of using graphic elicitation techniques for data collection, data analysis, and data display in qualitative research studies are analyzed using examples from a research study that employed data matrices and relational maps in conjunction with semi-structured interviews. Results from this analysis demonstrate that the use of these combined techniques for data collection facilitates triangulation and helps to establish internal consistency of data, thereby increasing the trustworthiness of the interpretation of that data and lending support to validity and reliability claims. Findings support the notion that graphic elicitation techniques can be highly useful in qualitative research studies at the data collection, the data analysis, and the data reporting stages. For example, this study found that graphic elicitation techniques are especially useful for eliciting data related to emotions and emotional experiences.

  18. Web-based (HTML5) interactive graphics for fusion research and collaboration

    Energy Technology Data Exchange (ETDEWEB)

    Kim, E.N., E-mail: kimny@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA (United States); Schissel, D.P.; Abla, G.; Flanagan, S.; Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA (United States)

    2012-12-15

    Highlights: ► Interactive data visualization is supported via the Web without a browser plugin and provides users easy, real-time access to data of different types from various locations. ► Crosshair, zoom, pan as well as toggling dimensionality and a slice bar for multi-dimensional data are available. ► Data sources with a PHP API can be used: MDSplus and SQL have been tested. ► Modular in design, this has been deployed to support both the experimental and the simulation research arenas. - Abstract: With the continuing development of web technologies, it is becoming feasible for websites to operate much like a scientific desktop application. This has opened up more possibilities for utilizing the web browser for interactive scientific research and providing new means of on-line communication and collaboration. This paper describes the research and deployment of these enhanced web graphics capabilities for fusion research tools, which has led to a general toolkit that can be deployed as required. It allows users to dynamically create, interact with, and share with others the large sets of data generated by fusion experiments and simulations. Hypertext Preprocessor (PHP), a general-purpose scripting language for the Web, is used to process a series of inputs and determine the data source types and locations from which to fetch and organize the data. Protovis, a JavaScript- and SVG-based web graphics package, then quickly draws the interactive graphs and makes them available to a worldwide audience. This toolkit has been deployed to both the simulation and experimental arenas. The deployed applications are presented as well as the architecture and technologies used in producing the general graphics toolkit.

  19. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  20. The graphics calculator in mathematics education: A critical review of recent research

    Science.gov (United States)

    Penglase, Marina; Arnold, Stephen

    1996-04-01

    The graphics calculator, sometimes referred to as the "super calculator," has sparked great interest among mathematics educators. Considered by many to be a tool which has the potential to revolutionise mathematics education, a significant amount of research has been conducted into its effectiveness as a tool for instruction and learning within precalculus and calculus courses, specifically in the study of functions, graphing and modelling. Some results suggest that these devices (a) can facilitate the learning of functions and graphing concepts and the development of spatial visualisation skills; (b) promote mathematical investigation and exploration; and (c) encourage a shift in emphasis from algebraic manipulation and proof to graphical investigation and examination of the relationship between graphical, algebraic and geometric representations. Other studies, however, indicate that there is still a need for manipulative techniques in the learning of function and graphing concepts, that the use of graphics calculators may not facilitate the learning of particular precalculus topics, and that some "de-skilling" may occur, especially among males. It is the contention of this paper, however, that much of the research in this new and important field fails to provide clear guidance or even to inform debate in adequate ways regarding the role of graphics calculators in mathematics teaching and learning. By failing to distinguish the role of the tool from that of the instructional process, many studies reviewed could be more appropriately classified as "program evaluations" rather than as research on the graphics calculator per se. Further, claims regarding the effectiveness of the graphics calculator as a tool for learning frequently fail to recognise that judgments of effectiveness result directly from existing assumptions regarding both assessment practice and student "achievement."

  1. 3D data processing with advanced computer graphics tools

    Science.gov (United States)

    Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott

    2012-09-01

    Often, the 3-D raw data coming from an optical profilometer contain spiky noise and an irregular grid, which make them difficult to analyze and, because of their enormously large size, difficult to store. This paper addresses these two issues by substantially reducing the spiky noise of the 3-D raw data and by rapidly re-sampling the raw data into regular grids at any pixel size and any orientation with advanced computer graphics tools. Experimental results are presented to demonstrate the effectiveness of the proposed approach.
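
    A minimal sketch of the two operations described, with scipy standing in for the paper's computer-graphics-based approach (synthetic data; not the authors' code):

      import numpy as np
      from scipy.ndimage import median_filter
      from scipy.interpolate import griddata

      z = np.random.random((200, 200))
      z[50, 60] = 1e6                          # a spiky artifact
      z_clean = median_filter(z, size=3)       # spike suppressed by its neighbors

      # Re-sample the (here already gridded) points onto a new regular grid
      # at a different pixel size; the same call handles scattered input.
      ys, xs = np.mgrid[0:200, 0:200]
      pts = np.column_stack([xs.ravel(), ys.ravel()])
      gx, gy = np.mgrid[0:199:100j, 0:199:100j]
      z_grid = griddata(pts, z_clean.ravel(), (gx, gy), method="linear")
      print(z_grid.shape)   # (100, 100)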

  2. Seeing is believing: good graphic design principles for medical research.

    Science.gov (United States)

    Duke, Susan P; Bancken, Fabrice; Crowe, Brenda; Soukup, Mat; Botsis, Taxiarchis; Forshee, Richard

    2015-09-30

    Have you noticed when you browse a book, journal, study report, or product label how your eye is drawn to figures more than to words and tables? Statistical graphs are powerful ways to transparently and succinctly communicate the key points of medical research. Furthermore, the graphic design itself adds to the clarity of the messages in the data. The goal of this paper is to provide a mechanism for selecting the appropriate graph to thoughtfully construct quality deliverables using good graphic design principles. Examples are motivated by the efforts of a Safety Graphics Working Group that consisted of scientists from the pharmaceutical industry, Food and Drug Administration, and academic institutions. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    Science.gov (United States)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for a general phased array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked by computation time for various input data cube sizes, is compared across GPUs and CPUs. Through the analysis, it is demonstrated that GPGPU (general-purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
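
    A minimal sketch of such a chain using CuPy, which wraps the same cuBLAS/cuFFT libraries named above (the data cube is synthetic and the trivial beamforming weights are assumptions, not the paper's processing chain):

      import cupy as cp   # assumption: CUDA-capable GPU with CuPy installed

      pulses, channels, gates = 64, 16, 512
      cube = (cp.random.randn(pulses, channels, gates)
              + 1j * cp.random.randn(pulses, channels, gates)).astype(cp.complex64)

      # Beamforming as a matrix contraction (the kind of op cuBLAS handles):
      weights = cp.ones((1, channels), dtype=cp.complex64) / channels
      beams = cp.einsum("bc,pcg->pbg", weights, cube)

      # Doppler processing as an FFT across pulses (handled by cuFFT):
      doppler = cp.fft.fft(beams, axis=0)
      print(doppler.shape)   # (64, 1, 512)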

  4. A Local Poisson Graphical Model for inferring networks from sequencing data.

    Science.gov (United States)

    Allen, Genevera I; Liu, Zhandong

    2013-09-01

    Gaussian graphical models, a class of undirected graphs or Markov Networks, are often used to infer gene networks based on microarray expression data. Many scientists, however, have begun using high-throughput sequencing technologies such as RNA-sequencing or next generation sequencing to measure gene expression. As the resulting data consists of counts of sequencing reads for each gene, Gaussian graphical models are not optimal for this discrete data. In this paper, we propose a novel method for inferring gene networks from sequencing data: the Local Poisson Graphical Model. Our model assumes a Local Markov property where each variable conditional on all other variables is Poisson distributed. We develop a neighborhood selection algorithm to fit our model locally by performing a series of l1 penalized Poisson, or log-linear, regressions. This yields a fast parallel algorithm for estimating networks from next generation sequencing data. In simulations, we illustrate the effectiveness of our methods for recovering network structure from count data. A case study on breast cancer microRNAs (miRNAs), a novel application of graphical models, finds known regulators of breast cancer genes and discovers novel miRNA clusters and hubs that are targets for future research.
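
    A minimal sketch of the neighborhood selection idea: regress each gene's counts on all other genes with an l1-penalized Poisson (log-linear) regression and connect the nonzero coefficients (uses statsmodels, assumed available; the data are random, so the recovered graph is purely illustrative):

      import numpy as np
      import statsmodels.api as sm

      counts = np.random.poisson(5, size=(100, 6))   # samples x genes

      edges = set()
      for j in range(counts.shape[1]):
          y = counts[:, j]
          X = sm.add_constant(np.delete(counts, j, axis=1))
          # l1-penalized Poisson regression for gene j's neighborhood
          fit = sm.GLM(y, X, family=sm.families.Poisson()).fit_regularized(
              method="elastic_net", alpha=0.1, L1_wt=1.0)
          for k in np.nonzero(fit.params[1:])[0]:    # nonzero coefficients
              k_full = k if k < j else k + 1         # undo the column deletion
              edges.add(tuple(sorted((j, int(k_full)))))
      print(edges)   # estimated undirected network over the 6 genes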

  5. High-throughput sequence alignment using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Trapnell Cole

    2007-12-01

    Background: The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. Results: This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from NVIDIA to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. Conclusion: MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
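
    A minimal sketch of the computational pattern: many independent exact-match queries against one indexed reference. A suffix array with binary search stands in for MUMmerGPU's GPU-resident suffix tree; each query is independent, which is what makes the workload GPU-friendly:

      import bisect

      reference = "ACGTACGGTACGTT"
      # suffix array: start positions sorted by the suffix they begin
      suffixes = sorted(range(len(reference)), key=lambda i: reference[i:])

      def locate(query):
          # binary search for the block of suffixes starting with `query`
          keys = [reference[i:i + len(query)] for i in suffixes]
          lo = bisect.bisect_left(keys, query)
          hi = bisect.bisect_right(keys, query)
          return sorted(suffixes[lo:hi])       # all match positions

      reads = ["ACG", "TAC", "GGT"]            # stand-ins for sequencing reads
      for r in reads:
          print(r, locate(r))   # e.g. ACG -> [0, 4, 9]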

  6. iMOSFLM: a new graphical interface for diffraction-image processing with MOSFLM

    International Nuclear Information System (INIS)

    Battye, T. Geoff G.; Kontogiannis, Luke; Johnson, Owen; Powell, Harold R.; Leslie, Andrew G. W.

    2011-01-01

    A new graphical user interface to the MOSFLM program has been developed to simplify the processing of macromolecular diffraction data. The interface, iMOSFLM, allows data processing via a series of clearly defined tasks and provides visual feedback on the progress of each stage. iMOSFLM is a graphical user interface to the diffraction data-integration program MOSFLM. It is designed to simplify data processing by dividing the process into a series of steps, which are normally carried out sequentially. Each step has its own display pane, allowing control over parameters that influence that step and providing graphical feedback to the user. Suitable values for integration parameters are set automatically, but additional menus provide a detailed level of control for experienced users. The image display and the interfaces to the different tasks (indexing, strategy calculation, cell refinement, integration and history) are described. The most important parameters for each step and the best way of assessing success or failure are discussed

  7. Graphic Design in Libraries: A Conceptual Process

    Science.gov (United States)

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  8. The End of the Rainbow? Color Schemes for Improved Data Graphics

    Science.gov (United States)

    Light, Adam; Bartlein, Patrick J.

    2004-10-01

    Modern computer displays and printers enable the widespread use of color in scientific communication, but the expertise for designing effective graphics has not kept pace with the technology for producing them. Historically, even the most prestigious publications have tolerated high defect rates in figures and illustrations, and technological advances that make creating and reproducing graphics easier do not appear to have decreased the frequency of errors. Flawed graphics consequently beget more flawed graphics as authors emulate published examples. Color has the potential to enhance communication, but design mistakes can result in color figures that are less effective than gray scale displays of the same data. Empirical research on human subjects can build a fundamental understanding of visual perception and scientific methods can be used to evaluate existing designs, but creating effective data graphics is a design task and not fundamentally a scientific pursuit. Like writing well, creating good data graphics requires a combination of formal knowledge and artistic sensibility tempered by experience: a combination of ``substance, statistics, and design''.

  9. More than words: Using visual graphics for community-based health research.

    Science.gov (United States)

    Morton Ninomiya, Melody E

    2017-04-20

    With increased attention to knowledge translation and community engagement in the applied health research field, many researchers aim to find effective ways of engaging health policy and decision makers and community stakeholders. While visual graphics such as graphs, charts, figures and photographs are common in scientific research dissemination, they are less common as a communication tool in research. In this commentary, I illustrate how and why visual graphics were created and used to facilitate dialogue and communication throughout all phases of a community-based health research study with a rural Indigenous community, advancing community engagement and knowledge utilization of a research study. I suggest that it is essential that researchers consider the use of visual graphics to accurately communicate and translate important health research concepts and content in accessible forms for diverse research stakeholders and target audiences.

  10. Analog-to-digital clinical data collection on networked workstations with graphic user interface.

    Science.gov (United States)

    Lunt, D

    1991-02-01

    An innovative respiratory examination system has been developed that combines physiological response measurement, real-time graphic displays, user-driven operating sequences, and networked file archiving and review into a scientific research and clinical diagnosis tool. This newly constructed computer network is being used to enhance the research center's ability to perform patient pulmonary function examinations. Respiratory data are simultaneously acquired and graphically presented during patient breathing maneuvers and rapidly transformed into graphic and numeric reports, suitable for statistical analysis or database access. The environment consists of the hardware (Macintosh computer, MacADIOS converters, analog amplifiers), the software (HyperCard v2.0, HyperTalk, XCMDs), and the network (AppleTalk, fileservers, printers) as building blocks for data acquisition, analysis, editing, and storage. System operation modules include: Calibration, Examination, Reports, On-line Help Library, Graphic/Data Editing, and Network Storage.

  11. Optimization Solutions for Improving the Performance of the Parallel Reduction Algorithm Using Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Ion LUNGU

    2012-01-01

    In this paper, we research, analyze and develop optimization solutions for the parallel reduction function using graphics processing units (GPUs) that implement the Compute Unified Device Architecture (CUDA), a modern and novel approach for improving the software performance of data processing applications and algorithms. Many of these applications and algorithms make use of the reduction function in their computational steps. After having designed the function and its algorithmic steps in CUDA, we progressively developed and implemented optimization solutions for the reduction function. To confirm, test and evaluate the solutions' efficiency, we developed a custom-tailored benchmark suite. We analyzed the obtained experimental results regarding: the comparison of execution time and bandwidth when using graphics processing units covering the main CUDA architectures (Tesla GT200, Fermi GF100, Kepler GK104) and a central processing unit; the influence of the data type; and the influence of the binary operator.
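
    A minimal sketch of the tree-shaped parallel reduction being optimized, written in NumPy rather than CUDA: each pass halves the data, so n values reduce in log2(n) data-parallel steps:

      import numpy as np

      def tree_reduce(values):
          x = values.copy()
          n = len(x)                  # assumes n is a power of two
          while n > 1:
              half = n // 2
              # on a GPU, these `half` additions run as one parallel step
              x[:half] += x[half:n]
              n = half
          return x[0]

      data = np.arange(1024, dtype=np.float64)
      print(tree_reduce(data), data.sum())   # both print 523776.0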

  12. Spectra processing with computer graphics

    International Nuclear Information System (INIS)

    Kruse, H.

    1979-01-01

    A program for processing gamma-ray spectra in rock analysis is described. The peak search is performed by applying a cross-correlation function. The experimental data are approximated by an analytical function represented by the sum of a polynomial and a multiple-peak function. The latter is a Gaussian joined on its low-energy side to an exponential. A modified Gauss-Newton algorithm is applied to fit the data to this function. The processing of values derived from a lunar sample demonstrates the effect of different choices of polynomial order for approximating the background over various fitting intervals. Observations on applications of interactive graphics are presented. 3 figures, 1 table
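
    A minimal sketch of this kind of peak fitting, with scipy's least-squares fitter standing in for the modified Gauss-Newton algorithm and a pure Gaussian standing in for the Gaussian-plus-exponential peak shape (data are synthetic):

      import numpy as np
      from scipy.optimize import curve_fit

      def model(x, a0, a1, amp, mu, sigma):
          background = a0 + a1 * x                       # low-order polynomial
          peak = amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
          return background + peak

      x = np.linspace(0, 100, 200)
      y = model(x, 5.0, 0.02, 40.0, 60.0, 3.0)
      y += np.random.normal(0, 0.5, x.size)              # measurement noise

      popt, pcov = curve_fit(model, x, y, p0=[4, 0, 30, 58, 2])
      print(popt)   # recovered background and peak parameters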

  13. Integrating user studies into computer graphics-related courses.

    Science.gov (United States)

    Santos, B S; Dias, P; Silva, S; Ferreira, C; Madeira, J

    2011-01-01

    This paper discusses the role of user studies in computer graphics education. Computer graphics and visualization are essentially about producing images for a target audience, be it the millions watching a new CG-animated movie or the small group of researchers trying to gain insight into the large amount of numerical data resulting from a scientific experiment. To ascertain the final images' effectiveness for their intended audience or the designed visualizations' accuracy and expressiveness, formal user studies are often essential. In human-computer interaction (HCI), such user studies play a similar fundamental role in evaluating the usability and applicability of interaction methods and metaphors for the various devices and software systems we use.

  14. Concurrent use of data base and graphics computer workstations to provide graphic access to large, complex data bases for robotics control of nuclear surveillance and maintenance

    International Nuclear Information System (INIS)

    Dalton, G.R.; Tulenko, J.S.; Zhou, X.

    1990-01-01

    The University of Florida is part of a multiuniversity research effort, sponsored by the US Department of Energy, that is under way to develop and deploy an advanced semi-autonomous robotic system for use in nuclear power stations. This paper reports on the development of the computer tools necessary to gain convenient graphic access to the intelligence implicit in a large, complex data base such as that in a nuclear reactor plant. This program is integrated as a man/machine interface within the larger context of the total computerized robotic planning and control system. The portion of the project described here addresses the connection between the three-dimensional displays on an interactive graphic workstation and a data-base computer running a large data-base server program. Programming the two computers to work together to accept graphic queries and return answers on the graphic workstation is a key part of the interactive capability developed.

  15. An Evaluative and Prescriptive Look at Graphics Research.

    Science.gov (United States)

    Soderston, Candace

    Graphics research history shows that some topics have been studied heavily while others have been almost entirely neglected. Furthermore, researchers have used many different methods of defining and measuring effects such as legibility and comprehension, and this, together with vagueness in reporting, makes it difficult to compare studies and draw…

  16. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (a PC-486 DX2 with a built-in 10-BaseT Ethernet card), a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room was converted to a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  17. Enhancing graphical literacy skills in the high school science classroom via authentic, intensive data collection and graphical representation exposure

    Science.gov (United States)

    Palmeri, Anthony

    This research project was developed to provide extensive practice with, and exposure to, data collection and data representation in a high school science classroom. The student population engaged in this study included 40 high school sophomores enrolled in two microbiology classes. Laboratory investigations and activities were deliberately designed to include quantitative data collection that necessitated organization and graphical representation. These activities were embedded into the curriculum and conducted in conjunction with the normal and expected course content, rather than as a separate entity. It was expected that routine practice with graph construction and interpretation would result in improved competency when graphing data and proficiency in analyzing graphs. To objectively test the effectiveness of this approach, a pre-test and post-test that included graph construction, interpretation, interpolation, extrapolation, and analysis were administered. Based on the results of a paired t-test, graphical literacy was significantly enhanced by extensive practice and exposure to data representation.

  18. Visual Invention and the Composition of Scientific Research Graphics: A Topological Approach

    Science.gov (United States)

    Walsh, Lynda

    2018-01-01

    This report details the second phase of an ongoing research project investigating the visual invention and composition processes of scientific researchers. In this phase, four academic researchers completed think-aloud protocols as they composed graphics for research presentations; they also answered follow-up questions about their visual…

  19. GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.

    Science.gov (United States)

    Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui

    2012-01-01

    Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/
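
    A minimal sketch of the position-by-position matching computation being accelerated: score every window of a sequence against a position weight matrix (the PWM and sequence are toy values; each window's score is independent, which is what maps well onto a GPU):

      import numpy as np

      base_index = {"A": 0, "C": 1, "G": 2, "T": 3}
      # toy 4 x 3 PWM: probability of each base at each of 3 motif positions
      pwm = np.array([[0.7, 0.1, 0.1],
                      [0.1, 0.7, 0.1],
                      [0.1, 0.1, 0.7],
                      [0.1, 0.1, 0.1]])

      seq = "TTACGTT"
      width = pwm.shape[1]
      scores = []
      for i in range(len(seq) - width + 1):
          window = seq[i:i + width]
          p = np.prod([pwm[base_index[b], j] for j, b in enumerate(window)])
          scores.append(p)
      print(np.argmax(scores), max(scores))   # offset 2 ("ACG"), score 0.343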

  20. GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.

    Directory of Open Access Journals (Sweden)

    Pooya Zandevakili

    Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/

  1. Development of the data logging and graphical presentation for gamma scanning, trouble shooting and process evaluation in the petroleum refinery column

    International Nuclear Information System (INIS)

    Saengchantr, Dhanaj; Chueinta Siripone

    2009-07-01

    Software for data logging and graphical presentation of gamma scanning for troubleshooting and process evaluation of petroleum refinery columns was developed. With the gamma source and gamma detector positioned on opposite sides of the column and the transmitted radiation recorded at several elevations, a profile of relative density (transmitted gamma intensity) versus vertical elevation can be obtained in graphical form. By comparison with the engineering drawing, physical and process abnormalities can be clearly evaluated during field investigation. The program can also store up to 8 data sets of 1,000 points each, conveniently allowing comparison of different operational parameter adjustments during remediation of a problem and/or process optimization. Together with other factors, this development also enhanced the technological capability of the TINT Service Center serving the petroleum refinery.
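
    A minimal sketch of the logging-and-plotting idea (synthetic scan data; not the developed software): transmitted intensity is recorded at each elevation, several scans are stored, and intensity is plotted against elevation so column internals show up as steps in the profile:

      import matplotlib.pyplot as plt

      scans = {}                                # several stored data sets
      scans["baseline"] = [(h, 1000 - 40 * (h > 3.0)) for h in range(0, 10)]
      scans["today"]    = [(h, 1000 - 40 * (h > 5.0)) for h in range(0, 10)]

      for label, points in scans.items():
          heights = [h for h, _ in points]
          counts = [c for _, c in points]
          plt.plot(counts, heights, label=label)   # elevation on the y-axis

      plt.xlabel("transmitted gamma intensity (counts)")
      plt.ylabel("elevation along column (m)")
      plt.legend()
      plt.savefig("scan_profiles.png")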

  2. BirdsEyeView (BEV): graphical overviews of experimental data

    Directory of Open Access Journals (Sweden)

    Zhang Lifeng

    2012-09-01

    Background: Analyzing global experimental data can be tedious and time-consuming. Thus, helping biologists see results as quickly and easily as possible can facilitate biological research, and is the purpose of the software we describe. Results: We present BirdsEyeView, a software system for visualizing experimental transcriptomic data using different views that users can switch among and compare. BirdsEyeView graphically maps data to three views: Cellular Map (currently a plant cell), Pathway Tree with dynamic mapping, and Gene Ontology (http://www.geneontology.org) Biological Processes and Molecular Functions. By displaying color-coded values for transcript levels across different views, BirdsEyeView can assist users in developing hypotheses about their experiment results. Conclusions: BirdsEyeView is a software system available as a Java Webstart package for visualizing transcriptomic data in the context of different biological views to assist biologists in investigating experimental results. BirdsEyeView can be obtained from http://metnetdb.org/MetNet_BirdsEyeView.htm.

  3. Massively Parallel Signal Processing using the Graphics Processing Unit for Real-Time Brain-Computer Interface Feature Extraction.

    Science.gov (United States)

    Wilson, J Adam; Williams, Justin C

    2009-01-01

    The clock speeds of modern computer processors have nearly plateaued in the past 5 years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card [graphics processing unit (GPU)] was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an autoregressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a central processing unit-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 250 ms of data from 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
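
    A minimal CPU-side sketch of the two stages that were moved to the GPU, with Welch's method standing in for the paper's autoregressive PSD estimator (the sampling rate and the common-average-reference filter are assumptions, not the paper's parameters):

      import numpy as np
      from scipy.signal import welch

      fs = 1200                                   # sampling rate, Hz (assumed)
      samples = int(0.25 * fs)                    # a 250 ms buffer
      raw = np.random.randn(1000, samples)        # 1000 channels, as in the paper

      # Stage 1: spatial filter expressed as a matrix-matrix multiplication
      # (here a common-average-reference matrix, an assumption).
      n = raw.shape[0]
      car = np.eye(n) - np.ones((n, n)) / n
      filtered = car @ raw

      # Stage 2: power spectral density on every channel.
      freqs, psd = welch(filtered, fs=fs, nperseg=128, axis=-1)
      features = psd[:, (freqs >= 8) & (freqs <= 12)].mean(axis=1)  # mu band
      print(features.shape)   # (1000,) -> one feature per channel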

  4. Basic Graphical Data Management System (BAGDAMS version 1.0)

    International Nuclear Information System (INIS)

    Weindorf, J.W.

    1979-06-01

    BAGDAMS (BAsic Graphical DAta Management System) is a set of FORTRAN-callable subroutines which provides mass storage data structuring and manipulation capabilities. Although primarily designed to facilitate the handling of graphical data files (files containing information to be displayed on graphical hardware devices), BAGDAMS can also be used in many other applications requiring complex mass-storage-resident data structures. (author)

  5. Data processing system for spectroscopy at Novillo Tokamak

    International Nuclear Information System (INIS)

    Ortega C, G.; Gaytan G, E.

    1998-01-01

    Taking as a basis methodologies proposed by software engineering, a system was designed and developed to process data from the spectroscopy diagnostic equipment used to study plasma impurities during cleaning discharges. Data acquisition is performed through an electronic interface that connects the computer to the spectroscopy system of the Novillo Tokamak. The data are read from text files and processed for subsequent graphical presentation. The system, named PRODATN (Processing of Data for Spectroscopy in Novillo Tokamak), was developed in the LabVIEW graphical programming language. (Author)

  6. Upside to downsizing: Acceleware's graphic processor technology propels seismic data processing revolution

    Energy Technology Data Exchange (ETDEWEB)

    Smith, M.

    2009-11-15

    Acceleware has developed graphics processing unit (GPU) technology that is transforming the petroleum industry. The benefits of the technology are its small footprint, low wattage, and high speed. The software brings supercomputing speed to the desktop by leveraging the massive parallel processing capacity of the very latest in GPU technology. This article discussed the GPU technology and its emergence as a powerful supercomputing tool. Acceleware's partnering with California-based NVIDIA was also outlined. The advantages of the technology were also discussed, including its smaller footprint. Acceleware's hardware takes up a fraction of the space and uses up to 70 per cent less power than a traditional central processing unit. By combining Acceleware's core knowledge in making complex algorithms run in parallel with an in-house team of seismic industry experts, the company provides software solutions for seismic data processors that access the massively parallel processing capabilities of GPUs. 1 fig.

  7. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data

    Science.gov (United States)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.

    1981-01-01

    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.
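    As an illustration of the color-coding idea, the hedged Python sketch below maps a synthetic scalar field (standing in for a measured flow quantity) onto a continuous color scale; the field and all names are invented for the example:

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic scalar field standing in for a measured flow quantity
# (e.g. total pressure or temperature) on a coarse measurement grid.
y, x = np.mgrid[0:1:64j, 0:1:64j]
field = np.sin(4 * np.pi * x) * np.cos(3 * np.pi * y)

# Color coding: scalar values become colors, so sparse numerical data
# reads as a continuous, physically suggestive picture.
plt.imshow(field, origin="lower", extent=(0, 1, 0, 1), cmap="viridis")
plt.colorbar(label="scalar value (arbitrary units)")
plt.title("Color-coded scalar field")
plt.show()
```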

  8. Development of the spent fuel disassembling process by utilizing the 3D graphic design technology

    International Nuclear Information System (INIS)

    Song, T. K.; Lee, J. Y.; Kim, S. H.; Yun, J. S.

    2001-01-01

    For developing the spent fuel disassembling process, a 3D graphic simulation has been established by utilizing the 3D graphic design technology which is widely used in industry. The spent fuel disassembling process consists of a downender, a rod extraction device, a rod cutting device, a pellet extracting device and a skeleton compaction device. In this study, the 3D graphical design model of these devices was implemented by conceptual design, and a virtual workcell was established with kinematics to simulate the motion of each device. By implementing this graphic simulation, all the unit processes involved in the spent fuel disassembling process are analyzed and optimized. The 3D graphical model and the 3D graphic simulation can be effectively used for designing the process equipment, as well as the optimized process and maintenance process.

  9. Enriching Students’ Vocabulary Mastery Using Graphic Organizers

    Directory of Open Access Journals (Sweden)

    Syaifudin Latif Darmawan

    2017-04-01

    Full Text Available This action research is carried out to (1) identify whether graphic organizers enrich students' vocabulary mastery; and (2) describe the classroom situation when graphic organizers are employed in the instructional process of vocabulary. The research was conducted in two cycles from March to May of the 2016/2017 academic year in the eighth grade of SMP Muhammadiyah Sekampung, East Lampung. The procedure of the research consists of identifying the problem, planning the action, implementing the action, observing the action, and reflecting on the result of the research. Qualitative data were collected through interview, observation, questionnaire, and research diary. Quantitative data were collected through tests. To analyze qualitative data, the researcher used the constant comparative method. It consists of four steps: (1) comparing incidents applicable to each category; (2) integrating categories and their properties; (3) delimiting the theory; (4) writing the theory. Meanwhile, to analyze quantitative data, the researcher employed descriptive statistics. The result of the research shows that using graphic organizers can enrich students' vocabulary mastery and the classroom situation. The improvement in students' vocabulary included: (a) the students are able to speak English; (b) the students are able to understand the meaning of the text as they have a larger vocabulary. The improvement of the classroom situation: (a) students come on time to the class; (b) students are more motivated to join the class; (c) students pay more attention during the instructional process; (d) students' participation in responding to questions is high.

  10. Iterative Methods for MPC on Graphical Processing Units

    DEFF Research Database (Denmark)

    Gade-Nielsen, Nicolai Fog; Jørgensen, John Bagterp; Dammann, Bernd

    2012-01-01

    The high floating point performance and memory bandwidth of Graphical Processing Units (GPUs) make them ideal for a large number of computations which often arise in scientific computing, such as matrix operations. GPUs achieve this performance by utilizing massive parallelism, which requires re... so as to avoid the use of dense matrices, which may be too large for the limited memory capacity of current graphics cards.
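    The dense-matrix concern above is usually addressed with matrix-free iterative solvers: the matrix is never stored, only its action on a vector is needed. A hedged NumPy sketch of conjugate gradient in this style (our illustration, not the paper's GPU implementation):

```python
import numpy as np

def cg_matrix_free(apply_A, b, tol=1e-8, max_iter=10_000):
    """Conjugate gradient for SPD systems using only matrix-vector
    products, so the matrix never has to be stored densely."""
    x = np.zeros_like(b)
    r = b - apply_A(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = apply_A(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Example operator: a 1D Laplacian applied on the fly, never materialized.
n = 256
def apply_A(v):
    out = 2.0 * v
    out[:-1] -= v[1:]
    out[1:] -= v[:-1]
    return out

x = cg_matrix_free(apply_A, np.ones(n))
print(np.linalg.norm(apply_A(x) - np.ones(n)))  # residual near zero
```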

  11. Collection Of Software For Computer Graphics

    Science.gov (United States)

    Hibbard, Eric A.; Makatura, George

    1990-01-01

    Ames Research Graphics System (ARCGRAPH) collection of software libraries and software utilities assisting researchers in generating, manipulating, and visualizing graphical data. Defines metafile format containing device-independent graphical data. File format used with various computer-graphics-manipulation and -animation software packages at Ames, including SURF (COSMIC Program ARC-12381) and GAS (COSMIC Program ARC-12379). Consists of two-stage "pipeline" used to put out graphical primitives. ARCGRAPH libraries developed on VAX computer running VMS.

  12. Applications of computer-graphics animation for motion-perception research

    Science.gov (United States)

    Proffitt, D. R.; Kaiser, M. K.

    1986-01-01

    The advantages and limitations of using computer animated stimuli in studying motion perception are presented and discussed. Most current programs of motion perception research could not be pursued without the use of computer graphics animation. Computer generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, the differences are assumed to be irrelevant to the questions under study, and findings with computer generated stimuli are assumed to generalize to natural events.

  13. Data analysis using a data base driven graphics animation system

    International Nuclear Information System (INIS)

    Schwieder, D.H.; Stewart, H.D.; Curtis, J.N.

    1985-01-01

    A graphics animation system has been developed at the Idaho National Engineering Laboratory (INEL) to assist engineers in the analysis of large amounts of time series data. Most prior attempts at computer animation of data involve the development of large and expensive problem-specific systems. This paper discusses a generalized interactive computer animation system designed to be used in a wide variety of data analysis applications. By using relational data base storage of graphics and control information, considerable flexibility in design and development of animated displays is achieved

  14. Achieving graphical excellence: suggestions and methods for creating high-quality visual displays of experimental data.

    Science.gov (United States)

    Schriger, D L; Cooper, R J

    2001-01-01

    Graphics are an important means of communicating experimental data and results. There is evidence, however, that many of the graphics printed in scientific journals contain errors, redundancies, and lack clarity. Perhaps more important, many graphics fail to portray data at an appropriate level of detail, presenting summary statistics rather than underlying distributions. We seek to aid investigators in the production of high-quality graphics that do their investigations justice by providing the reader with optimum access to the relevant aspects of the data. The depiction of by-subject data, the signification of pairing when present, and the use of symbolic dimensionality (graphing different symbols to identify relevant subgroups) and small multiples (the presentation of an array of similar graphics each depicting one group of subjects) to portray stratification are stressed. Step-by-step instructions for the construction of high-quality graphics are offered. We hope that authors will incorporate these suggestions when developing graphics to accompany their manuscripts and that this process will lead to improvements in the graphical literacy of scientific journals. We also hope that journal editors will keep these principles in mind when refereeing manuscripts submitted for peer review.
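    The recommendations above (show every subject, overlay rather than replace the summary, use small multiples for strata) translate directly into plotting code; a minimal matplotlib sketch with invented data:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
groups = {"control": rng.normal(10, 2, 20), "treated": rng.normal(12, 2, 20)}

# Small multiples: one panel per group, shared axes, raw by-subject
# points shown instead of bars of summary statistics.
fig, axes = plt.subplots(1, len(groups), sharey=True, figsize=(6, 3))
for ax, (name, values) in zip(axes, groups.items()):
    jitter = rng.uniform(-0.05, 0.05, len(values))
    ax.plot(0.5 + jitter, values, "o", alpha=0.6)      # every subject visible
    ax.hlines(np.median(values), 0.3, 0.7, color="k")  # summary overlaid on data
    ax.set_title(name)
    ax.set_xticks([])
axes[0].set_ylabel("outcome")
plt.show()
```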

  15. Impact of memory bottleneck on the performance of graphics processing units

    Science.gov (United States)

    Son, Dong Oh; Choi, Hong Jun; Kim, Jong Myon; Kim, Cheol Hong

    2015-12-01

    Recent graphics processing units (GPUs) can process general-purpose applications as well as graphics applications with the help of various user-friendly application programming interfaces (APIs) supported by GPU vendors. Unfortunately, utilizing the hardware resources in the GPU efficiently is a challenging problem, since the GPU architecture is totally different from the traditional CPU architecture. To solve this problem, many studies have focused on techniques for improving system performance using GPUs. In this work, we analyze GPU performance while varying GPU parameters such as the number of cores and clock frequency. According to our simulations, GPU performance can be improved by 125.8% and 16.2% on average as the number of cores and clock frequency increase, respectively. However, the performance saturates when memory bottleneck problems occur due to huge data requests to the memory. The performance of GPUs can be improved further as the memory bottleneck is reduced by changing GPU parameters dynamically.

  16. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics compared to earlier experiments of up to two orders of magnitude is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs) originally developed for 3D computer games have an architecture of massively parallel single instruction multiple data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.

  17. Functional graphical languages for process control

    International Nuclear Information System (INIS)

    1996-01-01

    A wide variety of safety systems are in use today in the process industries. Most of these systems rely on control software using procedural programming languages. This study investigates the use of functional graphical languages for controls in the process industry. Different vendor proprietary software and languages are investigated and evaluation criteria are outlined based on ability to meet regulatory requirements, reference sites involving applications with similar safety concerns, QA/QC procedures, community of users, type and user-friendliness of the man-machine interface, performance of operational code, and degree of flexibility. (author) 16 refs., 4 tabs

  18. The PC graphics handbook

    CERN Document Server

    Sanchez, Julio

    2003-01-01

    Part I - Graphics Fundamentals PC GRAPHICS OVERVIEW History and Evolution Short History of PC Video PS/2 Video Systems SuperVGA Graphics Coprocessors and Accelerators Graphics Applications State-of-the-Art in PC Graphics 3D Application Programming Interfaces POLYGONAL MODELING Vector and Raster Data Coordinate Systems Modeling with Polygons IMAGE TRANSFORMATIONS Matrix-based Representations Matrix Arithmetic 3D Transformations PROGRAMMING MATRIX TRANSFORMATIONS Numeric Data in Matrix Form Array Processing PROJECTIONS AND RENDERING Perspective The Rendering Pipeline LIGHTING AND SHADING Lightin

  19. Data visualization, bar naked: A free tool for creating interactive graphics.

    Science.gov (United States)

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  20. HMI Data Processing and Electronics Department. Scientific report 1984

    International Nuclear Information System (INIS)

    1985-01-01

    The Data Processing and Electronics Department carries out application-centered R+D work in the fields of general and process-related data processing, digital and analog measuring systems, and electronic elements. As part of the HMI infrastructure, the Department carries out central data processing and electronics functions. The R+D activities of the Department and its infrastructural tasks were carried out in seven Working Groups and one Project Group: Computer systems; Mathematics and graphical data processing; Software developments; Process computer systems, hardware; Nuclear electronics, measuring and control systems; Research on structural elements and irradiation testing; Computer center and cooperation in the 'Central Project Leader Group of the German Research Network' (DFN). (orig./RB) [de

  1. Interactive graphics for data analysis principles and examples

    CERN Document Server

    Theus, Martin

    2008-01-01

    Introduction PRINCIPLESInteractivity Queries Selection and Linked Highlighting Linking AnalysesInteracting with Graphics Examining a Single Variable Categorical DataContinuous DataTransforming Data Weighted Plots Interactions between Two VariablesTwo Categorical VariablesOne Categorical Variable and One Continuous VariableTwo Continuous VariablesMultidimensional Plots Mosaic PlotsParallel Coordinate Plots Trellis Displays Plot Ensembles and Statistical ModelsResponse ModelsANOVALoglinear ModelsGeographical DataMore Interactivity Sorting and Ordering Zooming Multiple ViewsInteractive Graphics ?

  2. Interactive and Animated Scalable Vector Graphics and R Data Displays

    Directory of Open Access Journals (Sweden)

    Deborah Nolan

    2012-01-01

    Full Text Available We describe an approach to creating interactive and animated graphical displays using R's graphics engine and Scalable Vector Graphics, an XML vocabulary for describing two-dimensional graphical displays. We use the svg() graphics device in R and then post-process the resulting XML documents. The post-processing identifies the elements in the SVG that correspond to the different components of the graphical display, e.g., points, axes, labels, lines. One can then annotate these elements to add interactivity and animation effects. One can also use JavaScript to provide dynamic interactive effects to the plot, enabling rich user interactions and compelling visualizations. The resulting SVG documents can be embedded within HTML documents and can involve JavaScript code that integrates the SVG and HTML objects. The functionality is provided via the SVGAnnotation package and makes static plots generated via R graphics functions available as stand-alone, interactive and animated plots for the Web and other venues.
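    The identify-then-annotate workflow is specific to R's SVGAnnotation package, but the underlying idea, post-processing the XML of an exported SVG, can be sketched in any language. A hedged Python example (file names and annotations are invented; this is not the package's API):

```python
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"
ET.register_namespace("", SVG_NS)

tree = ET.parse("plot.svg")  # hypothetical SVG exported by a plotting device

# Annotate each circle (plotted point) with a tooltip and a hover effect.
for i, circle in enumerate(tree.iter(f"{{{SVG_NS}}}circle")):
    title = ET.SubElement(circle, f"{{{SVG_NS}}}title")
    title.text = f"point {i}"  # browsers render <title> as a tooltip
    circle.set("onmouseover", "this.setAttribute('fill','red')")
    circle.set("onmouseout", "this.removeAttribute('fill')")

tree.write("plot_interactive.svg")
```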

  3. Fast analytical scatter estimation using graphics processing units.

    Science.gov (United States)

    Ingleby, Harry; Lippuner, Jonas; Rickey, Daniel W; Li, Yue; Elbakri, Idris

    2015-01-01

    To develop a fast patient-specific analytical estimator of first-order Compton and Rayleigh scatter in cone-beam computed tomography, implemented using graphics processing units. The authors developed an analytical estimator for first-order Compton and Rayleigh scatter in a cone-beam computed tomography geometry. The estimator was coded using NVIDIA's CUDA environment for execution on an NVIDIA graphics processing unit. Performance of the analytical estimator was validated by comparison with high-count Monte Carlo simulations for two different numerical phantoms. Monoenergetic analytical simulations were compared with monoenergetic and polyenergetic Monte Carlo simulations. Analytical and Monte Carlo scatter estimates were compared both qualitatively, from visual inspection of images and profiles, and quantitatively, using a scaled root-mean-square difference metric. Reconstruction of simulated cone-beam projection data of an anthropomorphic breast phantom illustrated the potential of this method as a component of a scatter correction algorithm. The monoenergetic analytical and Monte Carlo scatter estimates showed very good agreement. The monoenergetic analytical estimates showed good agreement for Compton single scatter and reasonable agreement for Rayleigh single scatter when compared with polyenergetic Monte Carlo estimates. For a voxelized phantom with dimensions 128 × 128 × 128 voxels and a detector with 256 × 256 pixels, the analytical estimator required 669 seconds for a single projection, using a single NVIDIA 9800 GX2 video card. Accounting for first order scatter in cone-beam image reconstruction improves the contrast to noise ratio of the reconstructed images. The analytical scatter estimator, implemented using graphics processing units, provides rapid and accurate estimates of single scatter and with further acceleration and a method to account for multiple scatter may be useful for practical scatter correction schemes.

  4. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    Science.gov (United States)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must include a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) comprised of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  5. Graphical symbol recognition

    OpenAIRE

    K.C. , Santosh; Wendling , Laurent

    2015-01-01

    The chapter focuses on one of the key issues in document image processing, i.e., graphical symbol recognition. Graphical symbol recognition is a sub-field of a larger research domain: pattern recognition. The chapter covers several approaches (i.e., statistical, structural and syntactic) and specially designed symbol recognition techniques inspired by real-world industrial problems. It, in general, contains research problems, state-of-the-art methods that convey basic s...

  6. Data processing and optimization system to study prospective interstate power interconnections

    Science.gov (United States)

    Podkovalnikov, Sergei; Trofimov, Ivan; Trofimov, Leonid

    2018-01-01

    The paper presents a data processing and optimization system for studying and making rational decisions on the formation of interstate electric power interconnections, with the aim of increasing the effectiveness of their operation and expansion. The technologies for building and integrating the system, including an object-oriented database and the predictive mathematical model ORIRES for optimizing the expansion of electric power systems, are described. The technology for collecting and pre-processing unstructured data from various sources and loading it into the object-oriented database, as well as for processing and presenting the information in a GIS system, is described. One approach to the graphical visualization of the optimization model's results is illustrated by the example of calculating an expansion option for the South Korean electric power grid.

  7. Bayesian graphical models for genomewide association studies.

    Science.gov (United States)

    Verzilli, Claudio J; Stallard, Nigel; Whittaker, John C

    2006-07-01

    As the extent of human genetic variation becomes more fully characterized, the research community is faced with the challenging task of using this information to dissect the heritable components of complex traits. Genomewide association studies offer great promise in this respect, but their analysis poses formidable difficulties. In this article, we describe a computationally efficient approach to mining genotype-phenotype associations that scales to the size of the data sets currently being collected in such studies. We use discrete graphical models as a data-mining tool, searching for single- or multilocus patterns of association around a causative site. The approach is fully Bayesian, allowing us to incorporate prior knowledge on the spatial dependencies around each marker due to linkage disequilibrium, which reduces considerably the number of possible graphical structures. A Markov chain-Monte Carlo scheme is developed that yields samples from the posterior distribution of graphs conditional on the data from which probabilistic statements about the strength of any genotype-phenotype association can be made. Using data simulated under scenarios that vary in marker density, genotype relative risk of a causative allele, and mode of inheritance, we show that the proposed approach has better localization properties and leads to lower false-positive rates than do single-locus analyses. Finally, we present an application of our method to a quasi-synthetic data set in which data from the CYP2D6 region are embedded within simulated data on 100K single-nucleotide polymorphisms. Analysis is quick (<5 min), and we are able to localize the causative site to a very short interval.

  8. Television equipment for graphic data input into a computer

    International Nuclear Information System (INIS)

    Dement'ev, V.G.; Dudin, Yu.Yu.; Pendyur, S.A.

    1988-01-01

    Television equipment built to the CAMAC standard for graphic data input into a computer, in particular of oscillograms from the screen of a storage oscilloscope, is described. Determination of point position is based on processing of the video signal from a television camera viewing the oscilloscope screen, oriented so that the line scanning is perpendicular to the orientation of the oscilloscope sweep. The time to read one point is approximately 20 ms.

  9. Living Color Frame System: PC graphics tool for data visualization

    Science.gov (United States)

    Truong, Long V.

    1993-01-01

    Living Color Frame System (LCFS) is a personal computer software tool for generating real-time graphics applications. It is highly applicable for a wide range of data visualization in virtual environment applications. Engineers often use computer graphics to enhance the interpretation of data under observation. These graphics become more complicated when 'run time' animations are required, such as found in many typical modern artificial intelligence and expert systems. Living Color Frame System solves many of these real-time graphics problems.

  10. Measuring Cognitive Load in Test Items: Static Graphics versus Animated Graphics

    Science.gov (United States)

    Dindar, M.; Kabakçi Yurdakul, I.; Inan Dönmez, F.

    2015-01-01

    The majority of multimedia learning studies focus on the use of graphics in learning process but very few of them examine the role of graphics in testing students' knowledge. This study investigates the use of static graphics versus animated graphics in a computer-based English achievement test from a cognitive load theory perspective. Three…

  11. A graphics subsystem retrofit design for the bladed-disk data acquisition system. M.S. Thesis

    Science.gov (United States)

    Carney, R. R.

    1983-01-01

    A graphics subsystem retrofit design for the turbojet blade vibration data acquisition system is presented. The graphics subsystem will operate in two modes, permitting the system operator to view blade vibrations on an oscilloscope type of display. The first mode is a real-time mode that displays only gross blade characteristics, such as maximum deflections and standing waves. This mode is used to aid the operator in determining when to collect detailed blade vibration data. The second mode of operation is a post-processing mode that will animate the actual blade vibrations using the detailed data collected on an earlier data collection run. The operator can vary the rate of playback to view differing characteristics of blade vibrations. The heart of the graphics subsystem is a modified version of AMD's "super sixteen" computer, called the graphics preprocessor computer (GPC). This computer is based on AMD's 2900 series of bit-slice components.

  12. Mathematics of shape description a morphological approach to image processing and computer graphics

    CERN Document Server

    Ghosh, Pijush K

    2009-01-01

    Image processing problems are often not well defined because real images are contaminated with noise and other uncertain factors. In Mathematics of Shape Description, the authors take a mathematical approach to address these problems using the morphological and set-theoretic approach to image processing and computer graphics by presenting a simple shape model using two basic shape operators called Minkowski addition and decomposition. This book is ideal for professional researchers and engineers in Information Processing, Image Measurement, Shape Description, Shape Representation and Computer Graphics. Post-graduate and advanced undergraduate students in pure and applied mathematics, computer sciences, robotics and engineering will also benefit from this book. Key Features: Explains the fundamental and advanced relationships between algebraic systems and shape description through the set-theoretic approach; Promotes interaction of image processing geochronology and mathematics in the field of algebraic geometry; P...
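    Minkowski addition, the first of the two shape operators named above, has a one-line set-theoretic definition, A + B = { a + b : a in A, b in B }, which a short NumPy sketch can make concrete for finite point sets (the book works with general sets; this discretization is ours):

```python
import numpy as np

def minkowski_sum(A, B):
    """Minkowski addition of two finite point sets:
    A + B = { a + b : a in A, b in B }."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    sums = (A[:, None, :] + B[None, :, :]).reshape(-1, A.shape[1])
    return np.unique(sums, axis=0)

# The corners of a unit square "grown" by a small diamond-shaped
# structuring element -- the dilation used in morphological shape models.
square = [(0, 0), (1, 0), (0, 1), (1, 1)]
diamond = [(0.1, 0), (-0.1, 0), (0, 0.1), (0, -0.1)]
print(minkowski_sum(square, diamond))
```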

  13. THREE-DIMENSIONAL MODELING TOOLS IN THE PROCESS OF FORMATION OF GRAPHIC COMPETENCE OF THE FUTURE BACHELOR OF COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Kateryna P. Osadcha

    2017-12-01

    Full Text Available The article is devoted to some aspects of the formation of the future bachelor's graphic competence in computer sciences while teaching the fundamentals of working with three-dimensional modelling tools. An analysis, classification and systematization of three-dimensional modelling tools are given. The aim of the research is to investigate the set of instruments, to classify three-dimensional modelling tools, and to correlate the skills being formed with those required in the labour market, in order to use them further in the process of forming graphic competence during the training of future bachelors in computer sciences. The peculiarities of forming the future bachelor's graphic competence in computer sciences are outlined by revealing, analyzing and systematizing three-dimensional modelling tools and types of three-dimensional graphics at the present stage of the development of information technologies. The result of the research is a choice of three-dimensional modelling software for the process of training future bachelors in computer sciences.

  14. Graphics Processing Units for HEP trigger systems

    International Nuclear Information System (INIS)

    Ammendola, R.; Bauce, M.; Biagioni, A.; Chiozzi, S.; Cotta Ramusino, A.; Fantechi, R.; Fiorini, M.; Giagu, S.; Gianoli, A.; Lamanna, G.; Lonardo, A.; Messina, A.

    2016-01-01

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPUs for synchronous low level triggers, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher level trigger systems is also briefly considered.

  15. Graphics Processing Units for HEP trigger systems

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R. [INFN Sezione di Roma “Tor Vergata”, Via della Ricerca Scientifica 1, 00133 Roma (Italy); Bauce, M. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.lee A.Moro 2, 00185 Roma (Italy); Biagioni, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Chiozzi, S.; Cotta Ramusino, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Fantechi, R. [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); CERN, Geneve (Switzerland); Fiorini, M. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Giagu, S. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.lee A.Moro 2, 00185 Roma (Italy); Gianoli, A. [INFN Sezione di Ferrara, Via Saragat 1, 44122 Ferrara (Italy); University of Ferrara, Via Saragat 1, 44122 Ferrara (Italy); Lamanna, G., E-mail: gianluca.lamanna@cern.ch [INFN Sezione di Pisa, Largo B. Pontecorvo 3, 56127 Pisa (Italy); INFN Laboratori Nazionali di Frascati, Via Enrico Fermi 40, 00044 Frascati (Roma) (Italy); Lonardo, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); Messina, A. [INFN Sezione di Roma “La Sapienza”, P.le A. Moro 2, 00185 Roma (Italy); University of Rome “La Sapienza”, P.lee A.Moro 2, 00185 Roma (Italy); and others

    2016-07-11

    General-purpose computing on GPUs (Graphics Processing Units) is emerging as a new paradigm in several fields of science, although so far applications have been tailored to the specific strengths of such devices as accelerators in offline computation. With the steady reduction of GPU latencies, and the increase in link and memory throughput, the use of such devices for real-time applications in high-energy physics data acquisition and trigger systems is becoming ripe. We will discuss the use of online parallel computing on GPUs for synchronous low level triggers, focusing on the CERN NA62 experiment trigger system. The use of GPUs in higher level trigger systems is also briefly considered.

  16. Getting the most from your curves: Exploring and reporting data using informative graphical techniques

    Directory of Open Access Journals (Sweden)

    Masaki Matsunaga

    2009-09-01

    Full Text Available Most psychological research employs tables to report descriptive and inferential statistics. Unfortunately, those tables often misrepresent critical information on the shape and variability of the data’s distribution. In addition, certain information such as the modality and score probability density is hard to report succinctly in tables and, indeed, not reported typically in published research. This paper discusses the importance of using graphical techniques not only to explore data but also to report it effectively. In so doing, the role of exploratory data analysis in detecting Type I and Type II errors is considered. A small data set resembling a Type II error is simulated to demonstrate this procedure, using a conventional parametric test. A potential analysis routine to explore data is also presented. The paper proposes that essential summary statistics and information about the shape and variability of data should be reported via graphical techniques.

  17. Atomic data for controlled fusion research

    International Nuclear Information System (INIS)

    Barnett, C.F.; Ray, J.A.; Ricci, E.; Wilker, M.I.; McDaniel, E.W.; Thomas, E.W.; Gilbody, H.B.

    1977-02-01

    Presented is an evaluated graphical and tabular compilation of atomic and molecular cross sections of interest to controlled thermonuclear research. The cross sections are tabulated and graphed as a function of energy for collision processes involving heavy particles, electrons, and photons with atoms and ions. Also included are sections on data for particle penetration through macroscopic matter, particle transport properties, particle interactions with surfaces, and pertinent charged particle nuclear cross sections and reaction rates. In most cases estimates have been made of the data accuracy

  18. The Use of Graphics to Communicate Findings of Longitudinal Data in Design-Based Research

    Science.gov (United States)

    Francis, Krista; Jacobsen, Michele; Friesen, Sharon

    2014-01-01

    Visuals and graphics have been used for communicating complex ideas since 1786 when William Playfair first invented the line graph and bar chart. Graphs and charts are useful for interpretation and making sense of data. For instance, John Snow's scatter plot helped pinpoint the source of a cholera outbreak in London in 1854 and also changed…

  19. Dynamics Explorer science data processing system

    International Nuclear Information System (INIS)

    Smith, P.H.; Freeman, C.H.; Hoffman, R.A.

    1981-01-01

    The Dynamics Explorer project has acquired the ground data processing system from the Atmosphere Explorer project to provide a central computer facility for the data processing, data management and data analysis activities of the investigators. Access to this system is via remote terminals at the investigators' facilities, which provide ready access to the data sets derived from groups of instruments on both spacecraft. The original system has been upgraded with both new hardware and enhanced software systems. These new systems include color and grey scale graphics terminals, an augmentation computer, a micrographics facility, a versatile data base with a directory and data management system, and graphics display software packages. (orig.)

  20. A Correlational Study of Graphic Organizers and Science Achievement of English Language Learners

    Science.gov (United States)

    Clarke, William Gordon

    English language learners (ELLs) demonstrate lower academic performance and have lower graduation and higher dropout rates than their non-ELL peers. The primary purpose of this correlational quantitative study was to investigate the relationship between the use of graphic organizer-infused science instruction and science learning of high school ELLs. Another objective was to determine if the method of instruction, socioeconomic status (SES), gender, and English language proficiency (ELP) were predictors of academic achievement of high school ELLs. Data were gathered from a New York City (NYC) high school fall 2012-2013 archival records of 145 ninth-grade ELLs who had received biology instruction in freestanding English as a second language (ESL) classes, followed by a test of their learning of the material. Fifty-four (37.2%) of these records were of students who had learned science by the conventional textbook method, and 91 (62.8%) by using graphic organizers. Data analysis employed the Statistical Package for the Social Sciences (SPSS) software for multiple regression analysis, which found graphic organizer use to be a significant predictor of New York State Regents Living Environment (NYSRLE) test scores (p < .01). One significant regression model was returned whereby, when combined, the four predictor variables (method of instruction, SES, gender, and ELP) explained 36% of the variance of the NYSRLE score. Implications of the study findings noted graphic organizer use as advantageous for ELL science achievement. Recommendations made for practice were for (a) the adoption of graphic organizer infused-instruction, (b) establishment of a protocol for the implementation of graphic organizer-infused instruction, and (c) increased length of graphic organizer instructional time. Recommendations made for future research were (a) a replication quantitative correlational study in two or more high schools, (b) a quantitative quasi-experimental quantitative study to

  1. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applies many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It also takes an in-depth look at original hardware and the limitations of existing PC graphics adapters, including system performance, especially with graphics-intensive application programs and user interfaces.

  2. CORDSPW - Windows computer program package for graphical interpretation of CORD-2 data

    International Nuclear Information System (INIS)

    Slavic, S.; Kromar, M.

    2007-01-01

    The CORD-2 package, developed at the Jozef Stefan Institute, enables determination of the core power distribution and reactivity. Core distribution data generated during the calculation process are stored in CORlib files. The CORDSP code, which is part of the CORD-2 package, displays and compares data contained in CORlib files. Since it runs in the DOS environment, there are several limitations in the presentation of the desired data. The CORDSPW package runs in the Windows environment and offers better graphical interpretation of the CORlib data. Core distributions can be displayed, compared, written to new files and sent to the printer. The user can select the appropriate display of the presented data, such as core symmetry, colour and fonts. Core radial and axial distributions can be presented and compared. There are several options to store and print data. The user can choose between standard ASCII and graphical JPG formats. (author)

  3. Storyboard dalam Pembuatan Motion Graphic

    OpenAIRE

    Satrya Mahardhika; A.F. Choiril Anam Fathoni

    2013-01-01

    Motion graphics is one category of animation that builds animations from many design elements in each component. Motion graphics requires a long process, including preproduction, production, and postproduction. Preproduction has an important role, providing guidance and instructions for the production process, i.e. the animation process. Preproduction includes research, making the story, script, screenplay, character and environment design, and storyboards. The storyboard will ...

  4. Printing--Graphic Arts--Graphic Communications

    Science.gov (United States)

    Hauenstein, A. Dean

    1975-01-01

    Recently, "graphic arts" has shifted from printing skills to a conceptual approach of production processes. "Graphic communications" must embrace the total system of communication through graphic media, to serve broad career education purposes; students taught concepts and principles can be flexible and adaptive. The author…

  5. Exploiting graphics processing units for computational biology and bioinformatics.

    Science.gov (United States)

    Payne, Joshua L; Sinnott-Armstrong, Nicholas A; Moore, Jason H

    2010-09-01

    Advances in the video gaming industry have led to the production of low-cost, high-performance graphics processing units (GPUs) that possess more memory bandwidth and computational capability than central processing units (CPUs), the standard workhorses of scientific computing. With the recent release of general-purpose GPUs and NVIDIA's GPU programming language, CUDA, graphics engines are being adopted widely in scientific computing applications, particularly in the fields of computational biology and bioinformatics. The goal of this article is to concisely present an introduction to GPU hardware and programming, aimed at the computational biologist or bioinformaticist. To this end, we discuss the primary differences between GPU and CPU architecture, introduce the basics of the CUDA programming language, and discuss important CUDA programming practices, such as the proper use of coalesced reads, data types, and memory hierarchies. We highlight each of these topics in the context of computing the all-pairs distance between instances in a dataset, a common procedure in numerous disciplines of scientific computing. We conclude with a runtime analysis of the GPU and CPU implementations of the all-pairs distance calculation. We show our final GPU implementation to outperform the CPU implementation by a factor of 1700.
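    The all-pairs distance computation used as the running example above is easy to state in array form; a hedged NumPy version (the article's point is the CUDA implementation, for which this is only a CPU-side reference):

```python
import numpy as np

def all_pairs_distance(X):
    """Euclidean distance between every pair of rows (instances) in X.
    Broadcasting enumerates the same pair grid that GPU thread blocks
    would cover in a CUDA implementation."""
    diff = X[:, None, :] - X[None, :, :]      # (n, n, d) pairwise differences
    return np.sqrt((diff ** 2).sum(axis=-1))  # (n, n) distance matrix

X = np.random.default_rng(0).standard_normal((500, 16))  # 500 instances
D = all_pairs_distance(X)
print(D.shape)  # (500, 500), zeros on the diagonal
```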

  6. The graphics future in scientific applications-trends and developments in computer graphics

    CERN Document Server

    Enderle, G

    1982-01-01

    Computer graphics methods and tools are being used to a great extent in scientific research. The future development in this area will be influenced both by new hardware developments and by software advances. On the hardware sector, the development of the raster technology will lead to the increased use of colour workstations with more local processing power. Colour hardcopy devices for creating plots, slides, or movies will be available at a lower price than today. The first real 3D-workstations will appear on the marketplace. One of the main activities on the software sector is the standardization of computer graphics systems, graphical files, and device interfaces. This will lead to more portable graphical application programs and to a common base for computer graphics education.

  7. Data acquisition, processing and display of experimental data for the Tokamak de Varennes

    International Nuclear Information System (INIS)

    Robins, E.S.; Larsen, J.M.; Lee, A.; Somers, G.

    1985-01-01

    The Tokamak de Varennes is to be a national facility for research into magnetic nuclear fusion. A centralised computer system is currently under development to facilitate the remote control, acquisition, processing and display of experimental data. The software (GALE-V) consists of a set of tasks to build data structures which mirror the physical arrangement of each experiment and provide the basis for the interpretation and presentation of the data to each experimenter. Data retrieval is accomplished through the graphics subsystem, and an interface for user-written data processing programs allows for the varied data analysis needs of each experiment. Other facilities being developed provide the tools for a user to retrieve, process and view the data in a simple manner.

  8. Characterizing chemical systems with on-line computers and graphics

    International Nuclear Information System (INIS)

    Frazer, J.W.; Rigdon, L.P.; Brand, H.R.; Pomernacki, C.L.

    1979-01-01

    Incorporating computers and graphics on-line to chemical experiments and processes opens up new opportunities for the study and control of complex systems. Systems having many variables can be characterized even when the variable interactions are nonlinear, and the system cannot a priori be represented by numerical methods and models. That is, large sets of accurate data can be rapidly acquired, then modeling and graphic techniques can be used to obtain partial interpretation plus design of further experimentation. The experimenter can thus comparatively quickly iterate between experimentation and modeling to obtain a final solution. We have designed and characterized a versatile computer-controlled apparatus for chemical research, which incorporates on-line instrumentation and graphics. It can be used to determine the mechanism of enzyme-induced reactions or to optimize analytical methods. The apparatus can also be operated as a pilot plant to design control strategies. On-line graphics were used to display conventional plots used by biochemists and three-dimensional response-surface plots
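    A response-surface display of the kind mentioned above can be produced with a few lines of plotting code; the surface below is synthetic and purely illustrative of the graphic, not of any chemical system from the study:

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented two-variable response surface, e.g. yield versus
# temperature and reagent concentration.
temp, conc = np.meshgrid(np.linspace(20, 80, 50), np.linspace(0.1, 1.0, 50))
response = np.exp(-((temp - 55) / 20) ** 2 - ((conc - 0.6) / 0.3) ** 2)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(temp, conc, response, cmap="viridis")
ax.set_xlabel("temperature")
ax.set_ylabel("concentration")
ax.set_zlabel("response")
plt.show()
```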

  9. Efficient particle-in-cell simulation of auroral plasma phenomena using a CUDA enabled graphics processing unit

    Science.gov (United States)

    Sewell, Stephen

    This thesis introduces a software framework that effectively utilizes low-cost commercially available Graphic Processing Units (GPUs) to simulate complex scientific plasma phenomena that are modeled using the Particle-In-Cell (PIC) paradigm. The software framework that was developed conforms to the Compute Unified Device Architecture (CUDA), a standard for general purpose graphic processing that was introduced by NVIDIA Corporation. This framework has been verified for correctness and applied to advance the state of understanding of the electromagnetic aspects of the development of the Aurora Borealis and Aurora Australis. For each phase of the PIC methodology, this research has identified one or more methods to exploit the problem's natural parallelism and effectively map it for execution on the graphic processing unit and its host processor. The sources of overhead that can reduce the effectiveness of parallelization for each of these methods have also been identified. One of the novel aspects of this research was the utilization of particle sorting during the grid interpolation phase. The final representation resulted in simulations that executed about 38 times faster than simulations that were run on a single-core general-purpose processing system. The scalability of this framework to larger problem sizes and future generation systems has also been investigated.
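    The particle-sorting idea highlighted above can be illustrated on the CPU: sorting particles by cell index makes each cell's contributions contiguous, which is what turns scattered memory writes into coalesced ones on a GPU. A hedged 1D NumPy sketch of the deposition (grid interpolation) phase:

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles, n_cells = 100_000, 256
positions = rng.uniform(0.0, 1.0, n_particles)   # 1D domain [0, 1)
charge = np.full(n_particles, 1.0 / n_particles)

# Sort particles by cell index so same-cell particles are contiguous.
cells = np.minimum((positions * n_cells).astype(np.int64), n_cells - 1)
order = np.argsort(cells)
cells, charge = cells[order], charge[order]

# Nearest-grid-point deposition (real PIC codes use higher-order weights).
density = np.zeros(n_cells)
np.add.at(density, cells, charge)
print(density.sum())  # total charge conserved: 1.0
```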

  10. Heterogeneous Multicore Parallel Programming for Graphics Processing Units

    Directory of Open Access Journals (Sweden)

    Francois Bodin

    2009-01-01

    Full Text Available Hybrid parallel multicore architectures based on graphics processing units (GPUs) can provide tremendous computing power. Current NVIDIA and AMD Graphics Product Group hardware displays a peak performance of hundreds of gigaflops. However, exploiting GPUs from existing applications is a difficult task that requires non-portable rewriting of the code. In this paper, we present HMPP, a Heterogeneous Multicore Parallel Programming workbench with compilers, developed by CAPS entreprise, that allows the integration of heterogeneous hardware accelerators in an unintrusive manner while preserving the legacy code.

  11. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    Science.gov (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or supposed to be performed with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  12. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Full Text Available Abstract Background Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or supposed to be performed with other command line tools, e.g. gnuplot. Results Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  13. Quantitative data systemization and visualisation in marketing research: groceries selection determinants

    Directory of Open Access Journals (Sweden)

    Jitka Janová

    2010-01-01

    Full Text Available The paper aims to fill in the gap in the effective interpretation of results obtained when systemizing marketing data by cluster analysis. A graphic visualization of the cluster analysis results is developed so that the marketing information can be more easily read and interpreted. Using primary research data concerning the decision making process of consumers when purchasing groceries, a systemization of consumers using hierarchical cluster analysis is performed for several sets of consumer characteristics, and for each case a graphic visualization is developed. The graphical information is interpreted, and the marketing impacts of the results obtained by the cluster analysis and presented by the visualization are discussed. The range of possible applications of the constructed procedure encompasses other spheres of primary and secondary marketing research, and it is generally useful for the effective analysis of various statistical surveys.
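    A dendrogram is the standard graphic for hierarchical clustering results of the kind described above; a hedged SciPy sketch with invented respondent data (the paper's own visualization is custom, so this only shows the generic building block):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram

# Invented respondents described by three selection criteria
# (e.g. price sensitivity, quality focus, brand loyalty), scaled 0-1.
rng = np.random.default_rng(2)
respondents = np.vstack([rng.normal(m, 0.1, (20, 3)) for m in (0.2, 0.5, 0.8)])

# Ward-linkage hierarchical clustering and its dendrogram.
Z = linkage(respondents, method="ward")
dendrogram(Z, no_labels=True)
plt.ylabel("merge distance")
plt.title("Consumer segments by selection criteria")
plt.show()
```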

  14. Gamma camera image processing and graphical analysis mutual software system

    International Nuclear Information System (INIS)

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    The GCCS gamma camera image processing and graphical analysis system is a special interactive software system. It is mainly used to analyse various patient data acquired from a gamma camera. The system runs on an IBM PC, PC/XT or PC/AT. It consists of several parts: system management, data management, device management, a program package and user programs. The system provides two kinds of user interfaces: command menus and command characters. It is easy to modify and extend this system because it is highly modularized. The user programs include almost all the clinical protocols in use today.

  15. Graphic Presentation: An Empirical Examination of the Graphic Novel Approach to Communicate Business Concepts

    Science.gov (United States)

    Short, Jeremy C.; Randolph-Seng, Brandon; McKenny, Aaron F.

    2013-01-01

    Graphic novels have been increasingly incorporated into business communication forums. Despite potential benefits, little research has examined the merits of the graphic novel approach. In response, we engage in a two-study approach. Study 1 explores the potential of graphic novels to affect learning outcomes and finds that the graphic novel was…

  16. Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.

    2006-01-01

    Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.
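    For contrast with time-stepped execution, a discrete-event simulator advances directly from one event timestamp to the next via a priority queue; a minimal CPU sketch (our illustration, not the paper's GPGPU formulation):

```python
import heapq

events = [(0.0, "emit")]  # priority queue ordered by event time

def handle(t, kind):
    print(f"t={t:.2f}: {kind}")
    if kind == "emit" and t < 3.0:
        heapq.heappush(events, (t + 0.7, "emit"))  # schedule a future event

# The loop jumps between event times instead of sweeping fixed time steps.
while events:
    t, kind = heapq.heappop(events)
    handle(t, kind)
```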

  17. GPU applications for data processing

    Energy Technology Data Exchange (ETDEWEB)

    Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch [LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow (Russian Federation); Aleksandrov, Andrey [LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow (Russian Federation); INFN sezione di Napoli, I-80125 Napoli (Italy); Tioukov, Valeri [INFN sezione di Napoli, I-80125 Napoli (Italy)

    2015-12-31

    Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. New approaches in the development of scanning systems require real-time processing of large amounts of data. Methods that use Graphics Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us to raise the scanning speed by a factor of nine.

  18. Study of the Korean anthracite for utilization and the coal mine data management

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    This report consists of two articles. (1) Petrographic study of the Korean anthracite for utilization (5): This research was initiated for the development of filtering materials that can be used at waste water treatment sites. A small-scale filtration tester was built at the waste water treatment site of the Chungjoo Electric Co. to use waste water processed by the purifying system for the feasibility study. (2) Study of the closed coal mine data management: Underground maps of about 1700 adits of 100 coal mines, and related graphic data, have been collected in the database. All these data were entered into the database in vector form, with coordinates obtained from a digitizing tablet. Detailed work is described in the other report, including discussions of the graphic database and the handling of graphical mine data. Comments on GIS are also provided in the volume. (author). 25 refs., 45 figs., 50 tabs., 3 maps.

  19. Systems Biology Graphical Notation: Process Description language Level 1 Version 1.3.

    Science.gov (United States)

    Moodie, Stuart; Le Novère, Nicolas; Demir, Emek; Mi, Huaiyu; Villéger, Alice

    2015-09-04

    The Systems Biology Graphical Notation (SBGN) is an international community effort for standardized graphical representations of biological pathways and networks. The goal of SBGN is to provide unambiguous pathway and network maps for readers with different scientific backgrounds as well as to support efficient and accurate exchange of biological knowledge between different research communities, industry, and other players in systems biology. Three SBGN languages, Process Description (PD), Entity Relationship (ER) and Activity Flow (AF), allow for the representation of different aspects of biological and biochemical systems at different levels of detail. The SBGN Process Description language represents biological entities and processes between these entities within a network. SBGN PD focuses on the mechanistic description and temporal dependencies of biological interactions and transformations. The nodes (elements) are split into entity nodes describing, e.g., metabolites, proteins, genes and complexes, and process nodes describing, e.g., reactions and associations. The edges (connections) provide descriptions of relationships (or influences) between the nodes, such as consumption, production, stimulation and inhibition. Among all three languages of SBGN, PD is the closest to metabolic and regulatory pathways in biological literature and textbooks, but its well-defined semantics offer a superior precision in expressing biological knowledge.
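
    To make the node/edge structure concrete, here is a minimal sketch of the PD idea using a generic graph library rather than an SBGN implementation. The reaction shown is invented, and catalysis is simplified to a generic stimulation arc.

```python
# Sketch of the SBGN PD structure: entity nodes and process nodes connected
# by typed edges (consumption/production/stimulation). This uses a generic
# graph library, not an SBGN tool; the example reaction is invented.
import networkx as nx

pd_map = nx.DiGraph()
# entity nodes (e.g., metabolites, proteins) and one process node (a reaction)
pd_map.add_node("glucose", kind="entity")
pd_map.add_node("G6P", kind="entity")
pd_map.add_node("hexokinase", kind="entity")
pd_map.add_node("rxn1", kind="process")

pd_map.add_edge("glucose", "rxn1", role="consumption")
pd_map.add_edge("rxn1", "G6P", role="production")
pd_map.add_edge("hexokinase", "rxn1", role="stimulation")  # catalysis, simplified

for u, v, d in pd_map.edges(data=True):
    print(f"{u} -[{d['role']}]-> {v}")
```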

  20. HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and an acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotation of surfaces. The program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.
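
    HLYWD itself targeted the TV80 library on a CDC 7600; as a modern stand-in for the same idea, the sketch below draws a 3D surface of a simulated time sequence with a choice of linear or semi-log scale and adjustable viewing perspective. The function and all parameters are invented.

```python
# Modern stand-in for the HLYWD idea: a 3D surface of simulated diagnostic
# data with a linear or semi-log z-scale and a chosen viewing perspective.
# (Illustrative only; HLYWD used the TV80 library, not matplotlib.)
import numpy as np
import matplotlib.pyplot as plt

def plot_surface(f, semilog=False, elev=30, azim=-60):
    x = np.linspace(0, 1, 50)
    t = np.linspace(0, 1, 50)
    X, T = np.meshgrid(x, t)
    Z = f(X, T)
    if semilog:                            # semi-log display of function values
        Z = np.log10(np.abs(Z) + 1e-12)
    ax = plt.figure().add_subplot(projection="3d")
    ax.plot_surface(X, T, Z, cmap="viridis")
    ax.view_init(elev=elev, azim=azim)     # viewing perspective
    plt.show()

# e.g. a decaying plasma-like profile (invented for illustration)
plot_surface(lambda x, t: np.exp(-5 * t) * np.sin(np.pi * x), semilog=True)
```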

  1. Interplay of Computer and Paper-Based Sketching in Graphic Design

    Science.gov (United States)

    Pan, Rui; Kuo, Shih-Ping; Strobel, Johannes

    2013-01-01

    The purpose of this study is to investigate student designers' attitude and choices towards the use of computers and paper sketches when involved in a graphic design process. 65 computer graphic technology undergraduates participated in this research. A mixed method study with survey and in-depth interviews was applied to answer the research…

  2. Semantic processing of EHR data for clinical research.

    Science.gov (United States)

    Sun, Hong; Depraetere, Kristof; De Roo, Jos; Mels, Giovanni; De Vloed, Boris; Twagirumukiza, Marc; Colaert, Dirk

    2015-12-01

    There is a growing need to semantically process and integrate clinical data from different sources for clinical research. This paper presents an approach to integrate EHRs from heterogeneous resources and generate integrated data in different data formats or semantics to support various clinical research applications. The proposed approach builds semantic data virtualization layers on top of data sources, which generate data in the requested semantics or formats on demand. This approach avoids upfront dumping to and synchronizing of the data with various representations. Data from different EHR systems are first mapped to RDF data with source semantics, and then converted to representations with harmonized domain semantics where domain ontologies and terminologies are used to improve reusability. It is also possible to further convert data to application semantics and store the converted results in clinical research databases, e.g. i2b2, OMOP, to support different clinical research settings. Semantic conversions between different representations are explicitly expressed using N3 rules and executed by an N3 Reasoner (EYE), which can also generate proofs of the conversion processes. The solution presented in this paper has been applied to real-world applications that process large scale EHR data. Copyright © 2015 Elsevier Inc. All rights reserved.
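
    The paper's actual mappings and N3 rules are not reproduced here; as a minimal sketch of the first step the abstract describes, the snippet below lifts a toy source EHR record into RDF with source-specific terms. The namespace and field names are hypothetical.

```python
# Minimal sketch of the first step described above: mapping a source EHR
# record to RDF with source semantics. The namespace and field names are
# hypothetical; the paper's actual mappings and N3 rules are not shown.
from rdflib import Graph, Literal, Namespace, RDF

SRC = Namespace("http://example.org/ehr-source#")  # hypothetical source vocabulary
g = Graph()

record = {"patient_id": "p001", "sys_bp": 142, "unit": "mmHg"}  # toy source row

subj = SRC[f"observation/{record['patient_id']}/bp1"]
g.add((subj, RDF.type, SRC.BloodPressureObservation))
g.add((subj, SRC.patient, SRC[f"patient/{record['patient_id']}"]))
g.add((subj, SRC.systolicValue, Literal(record["sys_bp"])))
g.add((subj, SRC.unit, Literal(record["unit"])))

print(g.serialize(format="turtle"))
```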

  3. Enhancement of graphic user interface data acquisition of small angle neutron scattering

    International Nuclear Information System (INIS)

    Abd Aziz Muhammad; Abd Jalil Abd Hamid

    2004-01-01

    This paper discusses the development of PC data acquisition software, running in DOS mode, capable of controlling the instrument via IEEE-488 and of graphic visualization for small angle neutron scattering (SANS). With the help of an outstanding freeware graphics library for DOS, this software has enhanced the efficiency of graphic visualization for SANSLab data acquisition. Featuring an easy-to-use graphical user interface (GUI) and several built-in convenience tools, the software can be operated with the mouse or the keyboard. It can be turned into an inexpensive data acquisition system for SANS. (Author)

  4. Reflector antenna analysis using physical optics on Graphics Processing Units

    DEFF Research Database (Denmark)

    Borries, Oscar Peter; Sørensen, Hans Henrik Brandenborg; Dammann, Bernd

    2014-01-01

    The Physical Optics approximation is a widely used asymptotic method for calculating the scattering from electrically large bodies. It requires significant computational work and little memory, and is thus well suited for application on a Graphics Processing Unit. Here, we investigate the performance…

  5. Commercial Off-The-Shelf (COTS) Graphics Processing Board (GPB) Radiation Test Evaluation Report

    Science.gov (United States)

    Salazar, George A.; Steele, Glen F.

    2013-01-01

    Large round trip communications latency for deep space missions will require more onboard computational capabilities to enable the space vehicle to undertake many tasks that have traditionally been ground-based, mission control responsibilities. As a result, visual display graphics will be required to provide simpler vehicle situational awareness through graphical representations, as well as provide capabilities never before done in a space mission, such as augmented reality for in-flight maintenance or Telepresence activities. These capabilities will require graphics processors and associated support electronic components for high computational graphics processing. In an effort to understand the performance of commercial graphics card electronics operating in the expected radiation environment, a preliminary test was performed on five commercial off-the-shelf (COTS) graphics cards. This paper discusses the preliminary evaluation test results of five COTS graphics processing cards tested to the International Space Station (ISS) low earth orbit radiation environment. Three of the five graphics cards were tested to a total dose of 6000 rads (Si). The test articles, test configuration, preliminary results, and recommendations are discussed.

  6. Reproducible Data Processing Research for the CABRI R.I.A. experiments Acoustic Emission signal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pantera, Laurent [CEA, DEN, CAD/DER/SRES/LPRE, Cadarache, F-13108 Saint-Paul-lez-Durance (France); Issiaka Traore, Oumar [Laboratory of Machanics and Acoustics (LMA) CNRS, 13402 Marseille (France)

    2015-07-01

    The CABRI facility is an experimental nuclear reactor of the French Atomic Energy Commission (CEA) designed to study the behaviour of fuel rods at high burnup under Reactivity Initiated Accident (R.I.A.) conditions, such as the scenario of a control rod ejection. During the experimental phase, the behaviour of the fuel element generates acoustic waves which can be detected by two microphones placed upstream and downstream from the test device. Studies carried out on the last fourteen tests showed the value of carrying out temporal and spectral analyses on these signals, demonstrating the existence of signatures that can be correlated with physical phenomena. We now want to return to this rich data set in order to gain a new point of view by applying modern signal processing methods. Resuming such earlier work raises some difficulties. Although all the raw data are accessible in the form of text files, the analyses and graphical representations of the former studies were not easy to reproduce, since the people in charge of the original work have left the laboratory, and it is not easy as time passes, even with our own work, to remember the steps of the data manipulations and the exact setup. We therefore decided to consolidate the availability of the data and its manipulation in order to provide the experimentalists with a robust data processing workflow before doing any further investigations. To tackle this issue of strong links between data, treatments and the generation of documents, we adopted a Reproducible Research paradigm. We shall first present the tools chosen in our laboratory to implement this workflow, and then we shall describe the overall approach adopted to continue the study of the Acoustic Emission signals recorded by the two microphones during the last fourteen CABRI R.I.A. tests. (authors)
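
    As a sketch of the kind of reproducible processing step described above, the snippet below performs a spectral analysis of a microphone signal with every parameter explicit and recorded with its output, so the same input file always yields the same figure. The file name and sampling rate are hypothetical, not the facility's actual values.

```python
# Sketch of one reproducible processing step: a spectral analysis of an
# acoustic emission signal with all parameters explicit, so identical inputs
# always yield identical figures. File name and sampling rate are assumed.
import numpy as np
from scipy.signal import spectrogram
import matplotlib.pyplot as plt

FS = 1_000_000          # sampling rate in Hz (assumed)
NPERSEG = 4096          # explicit analysis parameter, recorded in the output

def analyze(path):
    signal = np.loadtxt(path)                      # raw data kept as text files
    f, t, Sxx = spectrogram(signal, fs=FS, nperseg=NPERSEG)
    plt.pcolormesh(t, f, 10 * np.log10(Sxx + 1e-20))
    plt.xlabel("time [s]"); plt.ylabel("frequency [Hz]")
    plt.title(f"AE spectrogram of {path} (nperseg={NPERSEG})")
    plt.savefig(path + ".spectrogram.png", dpi=150)  # figure tied to its input

# analyze("microphone_upstream_test14.txt")   # hypothetical input file
```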

  7. PROMOTING STUDENTS’ EXPLICIT INFORMATION SKILL IN READING COMPREHENSION THROUGH GRAPHIC ORGANIZERS

    Directory of Open Access Journals (Sweden)

    Syaifudin Latif Darmawan

    2013-10-01

    This research was carried out to (1) identify whether graphic organizers can improve students' reading comprehension; and (2) describe the classroom situation when graphic organizers are employed in the instructional process of reading comprehension. The research was administered in two cycles in 2014 in the second grade of SMP Muhamadiyah Sekampung, Lampung Timur. The procedure of the research consists of identifying the problem, planning the action, implementing the action, observing the action, and reflecting on the result of the research. Qualitative data were collected through interview, observation, questionnaire, and a research diary. Quantitative data were collected through tests. To analyze the qualitative data, the researcher used the constant comparative method, which consists of four steps: (1) comparing incidents applicable to each category; (2) integrating categories and their properties; (3) delimiting the theory; (4) writing the theory. Meanwhile, to analyze the quantitative data, the researcher employed descriptive statistics. The result of the research shows that using graphic organizers can improve students' reading comprehension and the classroom situation. The improvement in students' reading comprehension is that students are able to find explicit information in a text. The improvements in the classroom situation are: (a) students come on time to class; (b) students are more motivated to join the class; (c) students pay more attention during the instructional process. In addition, the improvement is also reflected in the scores: the mean score increased from 57.56 in the pre-test to 63.34 in the formative test of cycle 1 and 69.56 in the post-test of cycle 2.

  8. permGPU: Using graphics processing units in RNA microarray association studies

    Directory of Open Access Journals (Sweden)

    George Stephen L

    2010-06-01

    Background: Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. Results: We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. Conclusions: permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
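
    To make concrete what permGPU parallelizes, here is a plain-CPU NumPy sketch of permutation resampling for a two-group statistic. The expression matrix, group sizes and statistic are synthetic stand-ins, not permGPU's data or API; each permutation is independent, which is what makes the problem embarrassingly parallel on a GPU.

```python
# CPU sketch of the permutation resampling that permGPU runs on the GPU:
# recompute a per-gene two-group statistic under many random relabelings
# to obtain permutation p-values. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 1000, 40
expr = rng.normal(size=(n_genes, n_samples))      # expression matrix
group = np.array([0] * 20 + [1] * 20)             # binary trait

def mean_diff(x, g):
    return x[:, g == 1].mean(axis=1) - x[:, g == 0].mean(axis=1)

observed = mean_diff(expr, group)
n_perm = 2000
exceed = np.zeros(n_genes)
for _ in range(n_perm):                           # embarrassingly parallel loop
    perm = rng.permutation(group)
    exceed += np.abs(mean_diff(expr, perm)) >= np.abs(observed)

p_values = (exceed + 1) / (n_perm + 1)            # add-one permutation p-value
print(p_values[:5])
```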

  9. Brain Activities Associated with Graphic Emoticons: An fMRI Study

    Science.gov (United States)

    Yuasa, Masahide; Saito, Keiichi; Mukawa, Naoki

    In this paper, we describe the brain activities associated with graphic emoticons by using functional MRI (fMRI). Various types of faces, from abstract to photorealistic, are used in computer network applications. A graphic emoticon is an abstract face used in communication over computer networks. In this research, we created various graphic emoticons for the fMRI study, and the graphic emoticons were classified according to friendliness and level of arousal. We investigated the brain activities of participants who were required to evaluate the emotional valence of the graphic emoticons (happy or sad). The experimental results showed that not only the right inferior frontal gyrus and the cingulate gyrus, but also the inferior and middle temporal gyri and the fusiform gyrus, were activated during the experiment. Furthermore, it is possible that the activation of the right inferior frontal gyrus and the cingulate gyrus is related to this type of abstract face. Since the inferior and middle temporal gyri were activated even though the graphic emoticons are static, we may perceive graphic emoticons as dynamic, living agents. Moreover, it is believed that text and graphic emoticons play an important role in enriching communication among users.

  10. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  11. Graphic display of spatially distributed binary-state experimental data

    International Nuclear Information System (INIS)

    Watson, B.L.

    1981-01-01

    Experimental data collected from a large number of transducers spatially distributed throughout a three-dimensional volume have typically posed a difficult interpretation task for the analyst. This paper describes one approach to alleviating this problem by presenting color graphic displays of experimental data; specifically, data representing the dynamic three-dimensional distribution of cooling fluid collected during the reflood and refill of simulated nuclear reactor vessels. Color-coded binary data (wet/dry) are integrated with a graphic representation of the reactor vessel and displayed on a high-resolution color CRT. The display is updated with successive data sets and made into 16-mm movies for distribution and analysis. Specific display formats are presented and extensions to other applications discussed.
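
    A modern sketch of the same display idea: binary (wet/dry) sensor states rendered as a color-coded 3D scatter for one time step. The sensor coordinates, states and time stamp are invented.

```python
# Modern sketch of the display described above: binary (wet/dry) states of
# spatially distributed transducers shown as a color-coded 3D scatter for
# one time step. Sensor positions and states are invented.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
xyz = rng.uniform(0, 1, size=(200, 3))   # transducer positions in the vessel
wet = rng.random(200) < 0.4              # binary state at this time step

ax = plt.figure().add_subplot(projection="3d")
ax.scatter(*xyz[wet].T, c="tab:blue", label="wet")
ax.scatter(*xyz[~wet].T, c="tab:red", label="dry")
ax.set_title("Wet/dry sensor states, t = 1.0 s (illustrative)")
ax.legend()
plt.show()
```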

  12. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation-funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool in GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
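
    A minimal sketch of the widget/workflow pattern the abstract describes: each widget performs one operation and a workflow is just a user-chosen ordering of widgets. The widget names and missing-value flag are illustrative, not DIT's actual API.

```python
# Minimal sketch of the widget idea: each widget wraps one operation on the
# data, and a workflow is an ordered list of widgets, as in the DIT GUI.
# Widget names are illustrative, not DIT's actual API.
from typing import Callable, List

class Widget:
    def __init__(self, name: str, op: Callable):
        self.name, self.op = name, op
    def run(self, data):
        print(f"running widget: {self.name}")
        return self.op(data)

def run_workflow(widgets: List[Widget], data=None):
    for w in widgets:            # widgets execute in the user-chosen order
        data = w.run(data)
    return data

workflow = [
    Widget("read", lambda _: [3.0, -999.0, 2.5, 4.1]),   # -999 = missing flag
    Widget("drop_missing", lambda xs: [x for x in xs if x != -999.0]),
    Widget("multiply_by_constant", lambda xs: [10 * x for x in xs]),
    Widget("sort", sorted),
]
print(run_workflow(workflow))
```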

  13. How Random Noise and a Graphical Convention Subverted Behavioral Scientists' Explanations of Self-Assessment Data: Numeracy Underlies Better Alternatives

    Directory of Open Access Journals (Sweden)

    Edward Nuhfer

    2017-01-01

    Despite nearly two decades of research, researchers have not resolved whether people generally perceive their skills accurately or inaccurately. In this paper, we trace this lack of resolution to numeracy, specifically to the frequently overlooked complications that arise from the noisy data produced by the paired measures that researchers employ to determine self-assessment accuracy. To illustrate the complications and ways to resolve them, we employ a large dataset (N = 1154) obtained from paired measures of documented reliability to study self-assessed proficiency in science literacy. We collected demographic information that allowed both criterion-referenced and normative-based analyses of self-assessment data. We used these analyses to propose a quantitatively based classification scale and show how its use informs the nature of self-assessment. Much of the current consensus about people's inability to self-assess accurately comes from interpreting normative data presented in the Kruger-Dunning type graphical format or the closely related convention of plotting (y - x) against x. Our data show that people's self-assessments of competence, in general, reflect a genuine competence that they can demonstrate. That finding contradicts the current consensus about the nature of self-assessment. Our results further confirm that experts are more proficient in self-assessing their abilities than novices and that women, in general, self-assess more accurately than men. The validity of interpretations of data depends strongly upon how carefully the researchers consider the numeracy that underlies graphical presentations and conclusions. Our results indicate that carefully measured self-assessments provide valid, measurable and valuable information about proficiency.
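
    The numeracy point the abstract makes can be shown with a few lines of simulation: when two paired measures are pure independent noise, plotting (y - x) against x still produces the familiar negative trend of Kruger-Dunning style graphics. All data below are random.

```python
# Simulation of the graphical artifact discussed above: even pure random
# noise, plotted as (self-assessed minus demonstrated) vs. demonstrated
# score, shows the negative trend of Kruger-Dunning style plots.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)
demonstrated = rng.uniform(0, 100, 1154)    # x: competence measure (random)
self_assessed = rng.uniform(0, 100, 1154)   # y: self-assessment (independent noise)

error = self_assessed - demonstrated        # the (y - x) quantity
plt.scatter(demonstrated, error, s=5, alpha=0.4)
b, a = np.polyfit(demonstrated, error, 1)   # slope is forced negative by noise
xs = np.linspace(0, 100, 2)
plt.plot(xs, a + b * xs, color="red")
plt.xlabel("demonstrated score (x)")
plt.ylabel("self-assessment error (y - x)")
plt.show()
```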

  14. The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.

    Science.gov (United States)

    Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny

    2018-04-16

    We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe its utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the two (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
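
    The paper's own tooling is the R packages graphicalVAR and mlVAR; as a language-neutral illustration of the GGM itself, the sketch below estimates a sparse precision matrix with scikit-learn on synthetic cross-sectional data and converts it to the partial correlations that form the network's edges.

```python
# Sketch of a GGM on cross-sectional data: estimate a sparse inverse
# covariance (precision) matrix and convert it to partial correlations.
# Illustrative stand-in for the R packages named above; data are synthetic.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
n, p = 500, 6
X = rng.normal(size=(n, p))
X[:, 1] += 0.6 * X[:, 0]          # induce a few conditional dependencies
X[:, 2] += 0.5 * X[:, 1]

model = GraphicalLasso(alpha=0.05).fit(X)
K = model.precision_              # estimated inverse covariance

d = np.sqrt(np.diag(K))
partial_corr = -K / np.outer(d, d)    # standard precision-to-partial-corr map
np.fill_diagonal(partial_corr, 1.0)
print(np.round(partial_corr, 2))      # nonzero off-diagonals = network edges
```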

  15. Risk Management Collaboration through Sharing Interactive Graphics

    Science.gov (United States)

    Slingsby, Aidan; Dykes, Jason; Wood, Jo; Foote, Matthew

    2010-05-01

    Risk management involves the cooperation of scientists, underwriters and actuaries, all of whom analyse data to support decision-making. Results are often disseminated through static documents with graphics that convey the message the analyst wishes to communicate. Interactive graphics are an increasingly popular means of communicating the results of data analyses because they enable other parties to explore and visually analyse some of the data themselves prior to and during discussion. Discussion around interactive graphics can occur synchronously in face-to-face meetings or with video-conferencing and screen sharing, or asynchronously through web sites such as ManyEyes, web-based fora, blogs, wikis and email. A limitation of approaches that do not involve screen sharing is the difficulty of sharing the insights gained from interacting with the graphic. Static images can be shared, but these cannot themselves be interacted with, producing a discussion bottleneck (Baker, 2008). We address this limitation by allowing the state and configuration of graphics to be shared (rather than static images) so that a user can reproduce someone else's graphic, interact with it and then share the results, accompanied by some commentary. HiVE (Slingsby et al., 2009) is a compact and intuitive text-based language designed for this purpose. We will describe the vizTweets project (a 9-month project funded by JISC) in which we are applying these principles to insurance risk management in the context of the Willis Research Network, the world's largest collaboration between the insurance industry and academia. The project aims to extend HiVE to meet the needs of the sector, to design and implement freely available web services and tools, and to provide case studies. We will present a case study that demonstrates the potential of this approach for collaboration within the Willis Research Network. Baker, D. Towards Transparency in Visualisation Based

  16. Mastering probabilistic graphical models using Python

    CERN Document Server

    Ankan, Ankur

    2015-01-01

    If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.

  17. SU-E-P-59: A Graphical Interface for XCAT Phantom Configuration, Generation and Processing

    International Nuclear Information System (INIS)

    Myronakis, M; Cai, W; Dhou, S; Cifter, F; Lewis, J; Hurwitz, M

    2015-01-01

    Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, or to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.

  18. SU-E-P-59: A Graphical Interface for XCAT Phantom Configuration, Generation and Processing

    Energy Technology Data Exchange (ETDEWEB)

    Myronakis, M; Cai, W; Dhou, S; Cifter, F; Lewis, J [Brigham and Women’s Hospital, Boston, MA (United States); Hurwitz, M [Newton, MA (United States)

    2015-06-15

    Purpose: To design a comprehensive open-source, publicly available, graphical user interface (GUI) to facilitate the configuration, generation, processing and use of the 4D Extended Cardiac-Torso (XCAT) phantom. Methods: The XCAT phantom includes over 9000 anatomical objects as well as respiratory, cardiac and tumor motion. It is widely used for research studies in medical imaging and radiotherapy. The phantom generation process involves the configuration of a text script to parameterize the geometry, motion, and composition of the whole body and objects within it, and to generate simulated PET or CT images. To avoid the need for manual editing or script writing, our MATLAB-based GUI uses slider controls, drop-down lists, buttons and graphical text input to parameterize and process the phantom. Results: Our GUI can be used to: a) generate parameter files; b) generate the voxelized phantom; c) combine the phantom with a lesion; d) display the phantom; e) produce average and maximum intensity images from the phantom output files; f) incorporate irregular patient breathing patterns; and g) generate DICOM files containing phantom images. The GUI provides local help information using tool-tip strings on the currently selected phantom, minimizing the need for external documentation. The DICOM generation feature is intended to simplify the process of importing the phantom images into radiotherapy treatment planning systems or other clinical software. Conclusion: The GUI simplifies and automates the use of the XCAT phantom for imaging-based research projects in medical imaging or radiotherapy. This has the potential to accelerate research conducted with the XCAT phantom, or to ease the learning curve for new users. This tool does not include the XCAT phantom software itself. We would like to acknowledge funding from MRA, Varian Medical Systems Inc.

  19. Graphics in DAQSIM

    International Nuclear Information System (INIS)

    Wang, C.C.; Booth, A.W.; Chen, Y.M.; Botlo, M.

    1993-06-01

    At the Superconducting Super Collider Laboratory (SSCL) a tool called DAQSIM has been developed to study the behavior of Data Acquisition (DAQ) systems. This paper reports and discusses the graphics used in DAQSIM. DAQSIM graphics includes a graphical user interface (GUI), animation, debugging, and control facilities. DAQSIM graphics not only provides a convenient DAQ simulation environment, it also serves as an efficient manager in simulation development and verification.

  20. Graphic design and scientific research: the experience of the INGV Laboratorio Grafica e Immagini

    Science.gov (United States)

    Riposati, Daniela; D'Addezio, Giuliana; Chesi, Angela; Di Laura, Francesca; Palone, Sabrina

    2016-04-01

    The Laboratorio Grafica e Immagini is the INGV reference structure for graphic and visual communication supporting institutional and research activities. Part of the activity is focused on the production of different materials for the INGV Educational and Outreach projects on the main themes of geophysics and natural hazards. The forefront results of research activity, in fact, are periodically transferred to the public through an intense and comprehensive plan of scientific dissemination. In 10 years of activity, the Laboratorio has become an essential point of reference for this production, widely known within the scientific community. These positive experiences are the result of a close relationship between graphic design and scientific research, in particular the collaborative work between designers and researchers. In projects such as the realization of museum exhibitions or the production of illustrative brochures, generally designed for a broad-spectrum public, the goal is to ease understanding and to support the scientific message, making concepts enjoyable and fruitful through the emotional involvement that visual images can arouse. Our graphic and editorial products compose signs and images using different tools on different media (colours, lettering, graphic design, visual design, web design, etc.) to create a strong "INGV style" identity, making them easily recognizable in Educational and Outreach projects: in one word, "branding". For example, a project product package might include a logo or other artwork, organized text and pure design elements such as shapes and colour, which unify the piece. Colour is used not only to help the "brand" stand out internationally, but in our case to achieve a unifying outcome across all the INGV sections. We also analysed the restyling project of different materials, one of the most important features of graphic design

  1. Storyboard dalam Pembuatan Motion Graphic

    Directory of Open Access Journals (Sweden)

    Satrya Mahardhika

    2013-10-01

    Motion graphics is a category of animation that builds animations from many design elements in each component. Motion graphics requires a long process, including preproduction, production, and postproduction. Preproduction has an important role, since it provides guidance or instructions for the production process or the animation process. Preproduction includes research, making the story, script, screenplay, characters, environment design and storyboards. Through the storyboard, camera angles, blocking, sets, and the many supporting roles involved in a scene are determined. The storyboard is also useful as a production reference for recording or taping each scene in sequence or by priority. The example used is the creation of an advertisement using motion graphics, where the animation storyboard plays an important role as a blueprint for every scene and gives instructions for transition movements, layout, blocking, and camera movement, all of which should be carried out step by step in animation production. Planning before making the animation or motion graphic makes the job more organized, presentable, and more efficient.

  2. SAS and R data management, statistical analysis, and graphics

    CERN Document Server

    Kleinman, Ken

    2009-01-01

    An all-in-one resource for using SAS and R to carry out common tasks, this book provides a path between the languages that is easier than reading complete documentation. SAS and R: Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in both SAS and R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation. The book covers many common tasks, such as data management, descriptive summaries, inferential procedures, regression analysis, and the creation of graphics, along with more complex applications.

  3. Using R for Data Management, Statistical Analysis, and Graphics

    CERN Document Server

    Horton, Nicholas J

    2010-01-01

    This title offers quick and easy access to key elements of documentation. It includes worked examples across a wide variety of applications, tasks, and graphics. Using R for Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation and vast number of add-on packages. Organized by short, clear descriptive entries, the book covers many common tasks, such as data management, descriptive summaries, and inferential procedures.

  4. Graphic organizers and their effects on the reading comprehension of students with LD: a synthesis of research.

    Science.gov (United States)

    Kim, Ae-Hwa; Vaughn, Sharon; Wanzek, Jeanne; Wei, Shangjin

    2004-01-01

    Previous research studies examining the effects of graphic organizers on reading comprehension for students with learning disabilities (LD) are reviewed. An extensive search of the professional literature between 1963 and June 2001 yielded a total of 21 group design intervention studies that met the criteria for inclusion in the synthesis. Using graphic organizers (i.e., semantic organizers, framed outlines, cognitive maps with and without a mnemonic) was associated with improved reading comprehension overall for students with LD. Compared to standardized reading measures, researcher-developed comprehension measures were associated with higher effect sizes. Initial gains demonstrated when using graphic organizers were not revealed during later comprehension tasks or on new comprehension tasks.

  5. Graphics and Statistics for Cardiology: Data visualisation for meta-analysis.

    Science.gov (United States)

    Kiran, Amit; Crespillo, Abel Pérez; Rahimi, Kazem

    2017-01-01

    Graphical displays play a pivotal role in understanding data sets and disseminating results. For meta-analysis, they are instrumental in presenting findings from multiple studies. This report presents guidance to authors wishing to submit graphical displays as part of their meta-analysis to a clinical cardiology journal, such as Heart. When using graphical displays for meta-analysis, we recommend the following: use a flow diagram to describe the number of studies returned from the initial search, the inclusion/exclusion criteria applied and the final number of studies used in the meta-analysis; present results from the meta-analysis using a figure that incorporates a forest plot and the underlying (tabulated) statistics, including a test for heterogeneity; use displays such as the funnel plot (minimum 10 studies) and Galbraith plot to visually present the distribution of effect sizes or associations, in order to evaluate small-study effects and publication bias; for meta-regression, the bubble plot is a useful display for assessing associations by study-level factors; final checks on graphs, such as appropriate use of axis scale, line pattern, text size and graph resolution, should always be performed. Published by the BMJ Publishing Group Limited.
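
    As a sketch of the forest-plot recommendation above, the snippet below draws per-study effect estimates with 95% confidence intervals and a pooled estimate. The study names, effect sizes and intervals are invented for illustration.

```python
# Minimal forest plot of the kind recommended above: per-study effect
# estimates with 95% CIs and a pooled estimate. All numbers are invented.
import matplotlib.pyplot as plt

studies = ["Study A", "Study B", "Study C", "Study D"]
effect  = [0.80, 0.92, 0.70, 0.85]        # e.g. risk ratios
ci_low  = [0.65, 0.75, 0.50, 0.72]
ci_high = [0.99, 1.13, 0.98, 1.01]
pooled, pooled_lo, pooled_hi = 0.83, 0.74, 0.93

ys = range(len(studies), 0, -1)
for y, e, lo, hi in zip(ys, effect, ci_low, ci_high):
    plt.plot([lo, hi], [y, y], color="black")     # CI whisker
    plt.plot(e, y, "s", color="black")            # point estimate
plt.plot([pooled_lo, pooled_hi], [0, 0], color="blue")
plt.plot(pooled, 0, "D", color="blue")            # pooled estimate
plt.axvline(1.0, linestyle="--", color="grey")    # line of no effect
plt.yticks(list(ys) + [0], studies + ["Pooled"])
plt.xlabel("risk ratio (95% CI)")
plt.show()
```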

  6. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  7. Stochastic Analysis of a Queue Length Model Using a Graphics Processing Unit

    Czech Academy of Sciences Publication Activity Database

    Přikryl, Jan; Kocijan, J.

    2012-01-01

    Vol. 5, No. 2 (2012), pp. 55-62. ISSN 1802-971X. R&D Projects: GA MŠk(CZ) MEB091015. Institutional support: RVO:67985556. Keywords: graphics processing unit * GPU * Monte Carlo simulation * computer simulation * modeling. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2012/AS/prikryl-stochastic analysis of a queue length model using a graphics processing unit.pdf

  8. Micromagnetic simulations using Graphics Processing Units

    International Nuclear Information System (INIS)

    Lopez-Diaz, L; Aurelio, D; Torres, L; Martinez, E; Hernandez-Lopez, M A; Gomez, J; Alejos, O; Carpentieri, M; Finocchio, G; Consolo, G

    2012-01-01

    The methodology for adapting a standard micromagnetic code to run on graphics processing units (GPUs) and exploit the potential for parallel calculations of this platform is discussed. GPMagnet, a general purpose finite-difference GPU-based micromagnetic tool, is used as an example. Speed-up factors of two orders of magnitude can be achieved with GPMagnet with respect to a serial code. This allows for running extensive simulations, nearly inaccessible with a standard micromagnetic solver, at reasonable computational times. (topical review)

  9. Graphical Environment Tools for Application to Gamma-Ray Energy Tracking Arrays

    Energy Technology Data Exchange (ETDEWEB)

    Todd, Richard A. [RIS Corp.; Radford, David C. [ORNL Physics Div.

    2013-12-30

    Highly segmented, position-sensitive germanium detector systems are being developed for nuclear physics research where traditional electronic signal processing with mixed analog and digital function blocks would be enormously complex and costly. Future systems will be constructed using pipelined processing of high-speed digitized signals as is done in the telecommunications industry. Techniques which provide rapid algorithm and system development for future systems are desirable. This project has used digital signal processing concepts and existing graphical system design tools to develop a set of re-usable modular functions and libraries targeted for the nuclear physics community. Researchers working with complex nuclear detector arrays such as the Gamma-Ray Energy Tracking Array (GRETA) have been able to construct advanced data processing algorithms for implementation in field programmable gate arrays (FPGAs) through application of these library functions using intuitive graphical interfaces.

  10. Optimized Laplacian image sharpening algorithm based on graphic processing unit

    Science.gov (United States)

    Ma, Tinghuai; Li, Lu; Ji, Sai; Wang, Xin; Tian, Yuan; Al-Dhelaan, Abdullah; Al-Rodhaan, Mznah

    2014-12-01

    In classical Laplacian image sharpening, all pixels are processed one by one, which leads to a large amount of computation. Traditional Laplacian sharpening on the CPU is considerably time-consuming, especially for large pictures. In this paper, we propose a parallel implementation of Laplacian sharpening based on the Compute Unified Device Architecture (CUDA), a computing platform for Graphics Processing Units (GPUs), and analyze the impact of picture size on performance as well as the relationship between data transfer time and parallel computing time. Further, according to the features of the different kinds of GPU memory, an improved scheme is developed which exploits shared memory instead of global memory and further increases efficiency. Experimental results prove that the two novel algorithms outperform the traditional sequential method based on OpenCV in computing speed.
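
    For reference, here is a CPU sketch of the operation being parallelized: Laplacian sharpening as a convolution whose result is subtracted from the image. The CUDA kernel itself is not reproduced; note that each output pixel is independent, which is what makes the per-pixel GPU mapping effective.

```python
# CPU reference for Laplacian sharpening: convolve with a Laplacian kernel
# and subtract from the image. The paper's CUDA kernels are not reproduced.
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def sharpen(image: np.ndarray, strength: float = 1.0) -> np.ndarray:
    lap = convolve(image.astype(float), LAPLACIAN, mode="nearest")
    out = image - strength * lap          # subtract the Laplacian to sharpen
    return np.clip(out, 0, 255).astype(np.uint8)

img = (np.random.default_rng(0).uniform(0, 255, (256, 256))).astype(np.uint8)
print(sharpen(img).shape)                 # each output pixel is independent
```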

  11. Graphics Processing Unit Accelerated Hirsch-Fye Quantum Monte Carlo

    Science.gov (United States)

    Moore, Conrad; Abu Asal, Sameer; Rajagoplan, Kaushik; Poliakoff, David; Caprino, Joseph; Tomko, Karen; Thakur, Bhupender; Yang, Shuxiang; Moreno, Juana; Jarrell, Mark

    2012-02-01

    In Dynamical Mean Field Theory and its cluster extensions, such as the Dynamic Cluster Algorithm, the bottleneck of the algorithm is solving the self-consistency equations with an impurity solver. Hirsch-Fye Quantum Monte Carlo is one of the most commonly used impurity and cluster solvers. This work implements optimizations of the algorithm, such as enabling large data re-use, suitable for the Graphics Processing Unit (GPU) architecture. The GPU's sheer number of concurrent parallel computations and large bandwidth to many shared memories take advantage of the inherent parallelism in the Green function update and measurement routines, and can substantially improve the efficiency of the Hirsch-Fye impurity solver.

  12. Investigating Creativity in Graphic Design Education from Psychological Perspectives

    Directory of Open Access Journals (Sweden)

    Salman Amur Alhajri

    2017-01-01

    Creativity has long been a central aspect of graphic design education, but the psychological component of creativity and its role in that education has not been given much importance. The present research attempts to study creativity in graphic design education from psychological perspectives. A thorough review of the literature was conducted on graphic design education, creativity and its psychological aspects. Creativity is commonly defined as a "problem solving" feature in design education. Students of graphic design have to involve themselves in the identification of cultural and social elements. Instruction in the field of graphic design must be aimed at enhancing the creative abilities of the student. The notion that creativity is a cultural production is strengthened by the problem solving methods employed in all cultures. Most cultures regard creativity as a process which leads to the creation of something new. Based on this idea, a cross-cultural study was conducted to explore the concept of creativity from Arabic and Western perspectives. From a psychological viewpoint, the student's cognition, thinking patterns and habits also have a role in knowledge acquisition. The field of graphic design lacks a sound framework prescribing modes of instruction appropriate to the discipline. The results of the study revealed that the psychological aspect of creativity needs to be adequately understood in order to enhance creativity in graphic design education.

  13. Visualisation for Stochastic Process Algebras: The Graphic Truth

    DEFF Research Database (Denmark)

    Smith, Michael James Andrew; Gilmore, Stephen

    2011-01-01

    and stochastic activity networks provide an automaton-based view of the model, which may be easier to visualise, at the expense of portability. In this paper, we argue that we can achieve the benefits of both approaches by generating a graphical view of a stochastic process algebra model, which is synchronised...

  14. Demonstrating Patterns in the Views Of Stakeholders Regarding Ethically-Salient Issues in Clinical Research: A Novel Use of Graphical Models in Empirical Ethics Inquiry.

    Science.gov (United States)

    Kim, Jane Paik; Roberts, Laura Weiss

    Empirical ethics inquiry works from the notion that stakeholder perspectives are necessary for gauging the ethical acceptability of human studies and assuring that research aligns with societal expectations. Although common, studies involving different populations often entail comparisons of trends that problematize the interpretation of results. Using graphical model selection - a technique aimed at transcending limitations of conventional methods - this report presents data on the ethics of clinical research with two objectives: (1) to display the patterns of views held by ill and healthy individuals in clinical research as a test of the study's original hypothesis and (2) to introduce graphical model selection as a key analytic tool for ethics research. In this IRB-approved, NIH-funded project, data were collected from 60 mentally ill and 43 physically ill clinical research protocol volunteers, 47 healthy protocol-consented participants, and 29 healthy individuals without research protocol experience. Respondents were queried on the ethical acceptability of research involving people with mental and physical illness (i.e., cancer, HIV, depression, schizophrenia, and post-traumatic stress disorder) and non-illness related sources of vulnerability (e.g., age, class, gender, ethnicity). Using a statistical algorithm, we selected graphical models to display interrelationships among responses to questions. Both mentally and physically ill protocol volunteers revealed a high degree of connectivity among ethically-salient perspectives. Healthy participants, irrespective of research protocol experience, revealed patterns of views that were not highly connected. Between ill and healthy protocol participants, the pattern of views is vastly different. Experience with illness was tied to dense connectivity, whereas healthy individuals expressed views with sparse connections. In offering a nuanced perspective on the interrelation of ethically relevant responses, graphical

  15. Massively parallel signal processing using the graphics processing unit for real-time brain-computer interface feature extraction

    Directory of Open Access Journals (Sweden)

    J. Adam Wilson

    2009-07-01

    The clock speeds of modern computer processors have nearly plateaued in the past five years. Consequently, neural prosthetic systems that rely on processing large quantities of data in a short period of time face a bottleneck, in that it may not be possible to process all of the data recorded from an electrode array with high channel counts and bandwidth, such as electrocorticographic grids or other implantable systems. Therefore, in this study a method of using the processing capabilities of a graphics card (GPU) was developed for real-time neural signal processing of a brain-computer interface (BCI). The NVIDIA CUDA system was used to offload processing to the GPU, which is capable of running many operations in parallel, potentially greatly increasing the speed of existing algorithms. The BCI system records many channels of data, which are processed and translated into a control signal, such as the movement of a computer cursor. This signal processing chain involves computing a matrix-matrix multiplication (i.e., a spatial filter), followed by calculating the power spectral density on every channel using an auto-regressive method, and finally classifying appropriate features for control. In this study, the first two computationally intensive steps were implemented on the GPU, and the speed was compared to both the current implementation and a CPU-based implementation that uses multi-threading. Significant performance gains were obtained with GPU processing: the current implementation processed 1000 channels in 933 ms, while the new GPU method took only 27 ms, an improvement of nearly 35 times.
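
    A minimal sketch of the two steps the study moved to the GPU: a spatial filter applied as a matrix-matrix multiplication, then a per-channel power spectrum. The study used an autoregressive PSD estimator; Welch's method is used here as a simpler stand-in, and the channel count, sampling rate and filter are assumed.

```python
# Sketch of the two accelerated steps: a spatial filter as a matrix-matrix
# multiply, then per-channel power spectra. Welch's method stands in for the
# study's autoregressive PSD; shapes and rates are assumed.
import numpy as np
from scipy.signal import welch

fs = 1200                                   # sampling rate (assumed)
n_ch, n_samp = 1000, 1200
eeg = np.random.default_rng(0).normal(size=(n_ch, n_samp))

# common average reference as an example spatial filter matrix
W = np.eye(n_ch) - np.full((n_ch, n_ch), 1.0 / n_ch)
filtered = W @ eeg                          # the matrix-matrix multiply step

freqs, psd = welch(filtered, fs=fs, nperseg=256, axis=-1)  # per-channel PSD
band = (freqs >= 8) & (freqs <= 12)
features = psd[:, band].mean(axis=1)        # e.g. mu-band power per channel
print(features.shape)
```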

  16. Expert Graphics System Research in the Department of the Navy.

    Science.gov (United States)

    Duff, Jon M.

    1987-01-01

    Presents current trends in the development of expert systems within the Department of the Navy, particularly research into expert graphics systems intended to support the Authoring Instructional Methods (AIM) research project. Defines artificial intelligence and expert systems. Discusses the operations and functions of the Navy's intelligent…

  17. Class Evolution Tree: A Graphical Tool to Support Decisions on the Number of Classes in Exploratory Categorical Latent Variable Modeling for Rehabilitation Research

    Science.gov (United States)

    Kriston, Levente; Melchior, Hanne; Hergert, Anika; Bergelt, Corinna; Watzke, Birgit; Schulz, Holger; von Wolff, Alessa

    2011-01-01

    The aim of our study was to develop a graphical tool that can be used in addition to standard statistical criteria to support decisions on the number of classes in explorative categorical latent variable modeling for rehabilitation research. Data from two rehabilitation research projects were used. In the first study, a latent profile analysis was…

  18. Code REX to fit experimental data to exponential functions and graphics plotting

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

    The REX code, written in Fortran IV, fits a set of experimental data to different kinds of functions: a straight line (Y = A + BX) and various exponential types (Y = A·B^X, Y = A·X^B, Y = A·exp(BX)), using the least-squares criterion. The fitting can be done directly for one selected function or for the four simultaneously, which allows choosing the function that best fits the data, since the statistics of all the fits are presented. Further, the code plots the fitted function in the appropriate coordinate axis system. An additional option also allows graphic plotting of the experimental data used for the fitting. All the data necessary to execute the code are requested from the operator at the terminal screen through an interactive screen-operator dialogue, with values entered through the keyboard. The code can be run on any computer provided with a graphics screen and keyboard terminal, with an X-Y plotter serially connected to the graphics terminal. (Author) 5 refs.
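
    The four fits can be sketched in a few lines by linearizing the exponential forms with logarithms, a standard approach for such codes; REX's exact Fortran numerics are not reproduced, and the test data below are synthetic.

```python
# Sketch of the four REX fits: Y = A + B*X, Y = A*B**X, Y = A*X**B and
# Y = A*exp(B*X), each reduced to linear least squares via a log transform
# (a standard linearization; REX's own numerics are not reproduced).
import numpy as np

def fit_all(x, y):
    fits = {}
    b, a = np.polyfit(x, y, 1)                     # Y = A + B*X
    fits["Y = A + B*X"] = (a, b)
    lnB, lnA = np.polyfit(x, np.log(y), 1)         # ln y = ln A + x ln B
    fits["Y = A*B**X"] = (np.exp(lnA), np.exp(lnB))
    B, lnA = np.polyfit(np.log(x), np.log(y), 1)   # ln y = ln A + B ln x
    fits["Y = A*X**B"] = (np.exp(lnA), B)
    B, lnA = np.polyfit(x, np.log(y), 1)           # ln y = ln A + B x
    fits["Y = A*exp(B*X)"] = (np.exp(lnA), B)
    return fits

x = np.linspace(1, 5, 20)
y = 2.0 * np.exp(0.7 * x) * np.exp(np.random.default_rng(0).normal(0, 0.02, 20))
for form, (A, B) in fit_all(x, y).items():
    print(f"{form}: A={A:.3f}, B={B:.3f}")
```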

  19. Using Interactive Graphics to Teach Multivariate Data Analysis to Psychology Students

    Science.gov (United States)

    Valero-Mora, Pedro M.; Ledesma, Ruben D.

    2011-01-01

    This paper discusses the use of interactive graphics to teach multivariate data analysis to Psychology students. Three techniques are explored through separate activities: parallel coordinates/boxplots; principal components/exploratory factor analysis; and cluster analysis. With interactive graphics, students may perform important parts of the…

  20. Accelerating VASP electronic structure calculations using graphic processing units

    KAUST Repository

    Hacene, Mohamed

    2012-08-20

    We present a way to improve the performance of the electronic structure Vienna Ab initio Simulation Package (VASP) program. We show that high-performance computers equipped with graphics processing units (GPUs) as accelerators may reduce drastically the computation time when offloading these sections to the graphic chips. The procedure consists of (i) profiling the performance of the code to isolate the time-consuming parts, (ii) rewriting these so that the algorithms become better-suited for the chosen graphic accelerator, and (iii) optimizing memory traffic between the host computer and the GPU accelerator. We chose to accelerate VASP with NVIDIA GPU using CUDA. We compare the GPU and original versions of VASP by evaluating the Davidson and RMM-DIIS algorithms on chemical systems of up to 1100 atoms. In these tests, the total time is reduced by a factor between 3 and 8 when running on n (CPU core + GPU) compared to n CPU cores only, without any accuracy loss. © 2012 Wiley Periodicals, Inc.

  1. Accelerating VASP electronic structure calculations using graphic processing units

    KAUST Repository

    Hacene, Mohamed; Anciaux-Sedrakian, Ani; Rozanska, Xavier; Klahr, Diego; Guignon, Thomas; Fleurat-Lessard, Paul

    2012-01-01

    We present a way to improve the performance of the electronic structure Vienna Ab initio Simulation Package (VASP) program. We show that high-performance computers equipped with graphics processing units (GPUs) as accelerators may reduce drastically the computation time when offloading these sections to the graphic chips. The procedure consists of (i) profiling the performance of the code to isolate the time-consuming parts, (ii) rewriting these so that the algorithms become better-suited for the chosen graphic accelerator, and (iii) optimizing memory traffic between the host computer and the GPU accelerator. We chose to accelerate VASP with NVIDIA GPU using CUDA. We compare the GPU and original versions of VASP by evaluating the Davidson and RMM-DIIS algorithms on chemical systems of up to 1100 atoms. In these tests, the total time is reduced by a factor between 3 and 8 when running on n (CPU core + GPU) compared to n CPU cores only, without any accuracy loss. © 2012 Wiley Periodicals, Inc.

  2. The graphics future in scientific applications

    International Nuclear Information System (INIS)

    Enderle, G.

    1982-01-01

    Computer graphics methods and tools are being used to a great extent in scientific research. The future development in this area will be influenced both by new hardware developments and by software advances. On the hardware sector, the development of the raster technology will lead to the increased use of colour workstations with more local processing power. Colour hardcopy devices for creating plots, slides, or movies will be available at a lower price than today. The first real 3D-workstations appear on the marketplace. One of the main activities on the software sector is the standardization of computer graphics systems, graphical files, and device interfaces. This will lead to more portable graphical application programs and to a common base for computer graphics education. (orig.)

  3. General Purpose Graphics Processing Unit Based High-Rate Rice Decompression and Reed-Solomon Decoding

    Energy Technology Data Exchange (ETDEWEB)

    Loughry, Thomas A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    As the volume of data acquired by space-based sensors increases, mission data compression/decompression and forward error correction code processing performance must likewise scale. This competency development effort was explored using the General Purpose Graphics Processing Unit (GPGPU) to accomplish high-rate Rice Decompression and high-rate Reed-Solomon (RS) decoding at the satellite mission ground station. Each algorithm was implemented and benchmarked on a single GPGPU. Distributed processing across one to four GPGPUs was also investigated. The results show that the GPGPU has considerable potential for performing satellite communication Data Signal Processing, with three times or better performance improvements and up to ten times reduction in cost over custom hardware, at least in the case of Rice Decompression and Reed-Solomon Decoding.
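    Rice decompression is a natural GPGPU target because many code blocks can be decoded independently, and the per-symbol logic is tiny. Below is a minimal single-threaded Golomb-Rice round trip in Python — a sketch of the basic scheme with a fixed parameter k, not Sandia's CCSDS implementation:

    ```python
    def rice_decode(bits, k, n_values):
        """Decode n_values Golomb-Rice codewords (unary quotient + k-bit remainder)."""
        out, i = [], 0
        for _ in range(n_values):
            q = 0
            while bits[i] == 1:           # unary part: count leading 1s
                q += 1
                i += 1
            i += 1                        # skip the terminating 0
            r = 0
            for _ in range(k):            # binary part: k-bit remainder
                r = (r << 1) | bits[i]
                i += 1
            out.append((q << k) | r)      # value = q * 2**k + r
        return out

    def rice_encode(values, k):
        bits = []
        for v in values:
            bits += [1] * (v >> k) + [0]                          # unary quotient
            bits += [(v >> b) & 1 for b in range(k - 1, -1, -1)]  # remainder, MSB first
        return bits

    data = [3, 17, 0, 9]
    assert rice_decode(rice_encode(data, k=2), k=2, n_values=4) == data
    ```

    The GPGPU gains reported above come from running many such decoders concurrently over independent blocks of the downlinked stream.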

  4. Graphics Processing Unit Enhanced Parallel Document Flocking Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL; ST Charles, Jesse Lee [ORNL

    2010-01-01

    Analyzing and clustering documents is a complex problem. One explored method of solving this problem borrows from nature, imitating the flocking behavior of birds. One limitation of this method of document clustering is its complexity O(n²). As the number of documents grows, it becomes increasingly difficult to generate results in a reasonable amount of time. In the last few years, the graphics processing unit (GPU) has received attention for its ability to solve highly-parallel and semi-parallel problems much faster than the traditional sequential processor. In this paper, we have conducted research to exploit this architecture and apply its strengths to the flocking based document clustering problem. Using the CUDA platform from NVIDIA, we developed a document flocking implementation to be run on the NVIDIA GEFORCE GPU. Performance gains ranged from thirty-six to nearly sixty times improvement of the GPU over the CPU implementation.
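    The O(n²) cost arises because every document "bird" reacts to every other, and that all-pairs structure is also what maps well onto GPU threads. A toy NumPy sketch of one flocking step with similarity-weighted cohesion and separation — an illustration of the model's shape under invented parameters, not the ORNL CUDA code:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_docs = 200
    pos = rng.uniform(0, 100, (n_docs, 2))     # each document is a "bird" on a 2D plane
    vel = rng.normal(0, 1, (n_docs, 2))
    sim = rng.uniform(0, 1, (n_docs, n_docs))  # stand-in for pairwise document similarity
    sim = (sim + sim.T) / 2

    def flock_step(pos, vel, sim, radius=10.0, dt=0.1):
        diff = pos[None, :, :] - pos[:, None, :]        # all-pairs offsets: the O(n^2) part
        dist = np.linalg.norm(diff, axis=-1) + np.eye(len(pos))
        near = (dist < radius) & ~np.eye(len(pos), dtype=bool)
        w = near * sim                                  # similar neighbours attract...
        cohesion = (w[:, :, None] * diff).sum(1)
        repel = -(near * (1 - sim))[:, :, None] * diff / dist[:, :, None] ** 2
        vel = vel + dt * (cohesion + repel.sum(1))      # ...dissimilar ones repel
        vel /= np.linalg.norm(vel, axis=1, keepdims=True) + 1e-9   # constant speed
        return pos + vel, vel

    for _ in range(50):
        pos, vel = flock_step(pos, vel, sim)   # similar documents drift into flocks
    ```

    On a GPU, one thread per document (or per pair) evaluates these interactions concurrently, which is where the reported 36-60x speed-up comes from.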

  5. High speed graphic program on a personal computer and its utilization in JIPP T-IIU online data-processing system

    International Nuclear Information System (INIS)

    Taniguchi, Yoshiyuki; Noda, Nobuaki; Sasao, Mamiko; Sato, Masahiro

    1986-01-01

    A high-speed graphic program was developed on a personal computer PC9801. Using this program, one can draw a waveform of successive 16-bit integer data, such as those obtained by an analog-to-digital converter. The program is written in machine language and takes the form of a subroutine that can be called from main programs under N88-BASIC. The time for drawing one waveform is 4 ms, which is two orders of magnitude faster than with the standard graphic routines of the BASIC interpreter. This program is very convenient for the real-time display of raw plasma-monitoring data, such as plasma current, loop voltage, rf power etc. in tokamak experiments. It has been utilized in JIPP T-IIU experiments and makes it possible to display the data of an 8-channel ADC within a few seconds, before the system transmits the data from CAMAC to the computer center of the institute. The program and its utilization are presented. (author)

  6. SCNS: a graphical tool for reconstructing executable regulatory networks from single-cell genomic data.

    Science.gov (United States)

    Woodhouse, Steven; Piterman, Nir; Wintersteiger, Christoph M; Göttgens, Berthold; Fisher, Jasmin

    2018-05-25

    Reconstruction of executable mechanistic models from single-cell gene expression data represents a powerful approach to understanding developmental and disease processes. New ambitious efforts like the Human Cell Atlas will soon lead to an explosion of data with potential for uncovering and understanding the regulatory networks which underlie the behaviour of all human cells. In order to take advantage of this data, however, there is a need for general-purpose, user-friendly and efficient computational tools that can be readily used by biologists who do not have specialist computer science knowledge. The Single Cell Network Synthesis toolkit (SCNS) is a general-purpose computational tool for the reconstruction and analysis of executable models from single-cell gene expression data. Through a graphical user interface, SCNS takes single-cell qPCR or RNA-sequencing data taken across a time course, and searches for logical rules that drive transitions from early cell states towards late cell states. Because the resulting reconstructed models are executable, they can be used to make predictions about the effect of specific gene perturbations on the generation of specific lineages. SCNS should be of broad interest to the growing number of researchers working in single-cell genomics and will help further facilitate the generation of valuable mechanistic insights into developmental, homeostatic and disease processes.

  7. A low-cost system for graphical process monitoring with colour video symbol display units

    International Nuclear Information System (INIS)

    Grauer, H.; Jarsch, V.; Mueller, W.

    1977-01-01

    A system for computer-controlled graphic process supervision, using colour symbol video displays, is described. It has the following characteristics: - compact unit: no external memory needed for image storage - problem-oriented, simple descriptive interface to the process program - no restriction on the graphical representation of process variables - computer and display independence, through the implementation of colours and parameterized code generation for the display. (WB) [de

  8. Accelerating Solution Proposal of AES Using a Graphic Processor

    Directory of Open Access Journals (Sweden)

    STRATULAT, M.

    2011-11-01

    Full Text Available The main goal of this work is to analyze the possibility of using a graphics processing unit in non-graphical calculations. Graphics processing units are being used nowadays not only for game engines and movie encoding/decoding, but also for a vast area of applications, like cryptography. We used the graphics processing unit as a cryptographic coprocessor in order to accelerate the AES algorithm. Our implementation of AES runs on a GPU using the CUDA architecture. The performance obtained shows that the CUDA implementation can offer throughput of up to 11.95 Gbps. The tests are conducted in two directions: running the tests on small data sizes that are located in memory, and on large data that are stored in files on hard drives.
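    AES maps well onto a GPU because in modes such as CTR every 16-byte block is processed independently, so thousands of blocks can be handled by parallel threads. A small sketch of that block independence, assuming the pycryptodome package; the key, nonce and data are made up, and the sequential loop shown here is exactly the work a CUDA kernel would distribute:

    ```python
    from Crypto.Cipher import AES   # pycryptodome

    key = bytes(range(16))                      # demo 128-bit key
    ecb = AES.new(key, AES.MODE_ECB)            # raw block cipher
    nonce = b"\x00" * 8

    def ctr_keystream_block(i):
        """Keystream block i = E_k(nonce || counter_i). Blocks are independent,
        which is the data-parallelism a GPU implementation exploits."""
        return ecb.encrypt(nonce + i.to_bytes(8, "big"))

    plaintext = b"sixteen byte msg" * 4         # 4 independent 16-byte blocks
    cipher = b"".join(
        bytes(p ^ s for p, s in zip(plaintext[16*i:16*(i+1)], ctr_keystream_block(i)))
        for i in range(4)                       # any order / any thread would do
    )
    print(cipher.hex())
    ```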

  9. Examples of data processing systems. Data processing system for JT-60

    International Nuclear Information System (INIS)

    Aoyagi, Tetsuo

    1996-01-01

    The JT-60 data processing system is a large computer complex including many microcomputers, several minicomputers, and a mainframe computer. As a general introduction to the original system configuration has been published previously, some improvements are described here: a transient mass data storage system, a network database server, a data acquisition system using engineering workstations, and a graphic terminal emulator for X-Window. These new features are realized by utilizing recent progress in computer and network technology and the carefully designed user interface specification of the original system. (author)

  10. Data acquisition and signal processing system for IPR R1 TRIGA-Mark I nuclear research reactor of CDTN

    International Nuclear Information System (INIS)

    Mesquita, A.Z.; Maretti, F. Jr.; Rezende, H.C.; Tambourgi, E.B.

    2004-01-01

    The TRIGA IPR-R1 Nuclear Research Reactor, located at the Nuclear Technology Development Center (CDTN/CNEN) in Belo Horizonte, Brazil, has been in operation for 44 years. The main operational parameters were monitored by analog recorders and counters located in the reactor control console, and the operators registered the most important parameters and data in the reactor logbook. This process is quite useful, but it can involve human errors, and it is impossible for the operators to take notes of all process variables, especially during fast power transients in some operations. A PC-based data acquisition system was therefore developed for the reactor that allows online monitoring, through graphic interfaces, and shows the evolution of operational parameters to the operators. Some parameters that were not measured before, like the power and the coolant flow rate in the primary loop, are now monitored on the computer video monitor. The developed system allows measuring all parameters at frequencies up to 1 kHz. These data are also recorded in text files available for consultation and analysis. (author)

  11. Evaluating virtual hosted desktops for graphics-intensive astronomy

    Science.gov (United States)

    Meade, B. F.; Fluke, C. J.

    2018-04-01

    Visualisation of data is critical to understanding astronomical phenomena. Today, many instruments produce datasets that are too big to be downloaded to a local computer, yet many of the visualisation tools used by astronomers are deployed only on desktop computers. Cloud computing is increasingly used to provide a computation and simulation platform in astronomy, but it also offers great potential as a visualisation platform. Virtual hosted desktops, with graphics processing unit (GPU) acceleration, allow interactive, graphics-intensive desktop applications to operate co-located with astronomy datasets stored in remote data centres. By combining benchmarking and user experience testing, with a cohort of 20 astronomers, we investigate the viability of replacing physical desktop computers with virtual hosted desktops. In our work, we compare two Apple MacBook computers (one old and one new, representing hardware and opposite ends of the useful lifetime) with two virtual hosted desktops: one commercial (Amazon Web Services) and one in a private research cloud (the Australian NeCTAR Research Cloud). For two-dimensional image-based tasks and graphics-intensive three-dimensional operations - typical of astronomy visualisation workflows - we found that benchmarks do not necessarily provide the best indication of performance. When compared to typical laptop computers, virtual hosted desktops can provide a better user experience, even with lower performing graphics cards. We also found that virtual hosted desktops are equally simple to use, provide greater flexibility in choice of configuration, and may actually be a more cost-effective option for typical usage profiles.

  12. Topographic Digital Raster Graphics - USGS DIGITAL RASTER GRAPHICS

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — USGS Topographic Digital Raster Graphics downloaded from LABINS (http://data.labins.org/2003/MappingData/drg/drg_stpl83.cfm). A digital raster graphic (DRG) is a...

  13. Code ''Repol'' to fit experimental data with a polynomial and its graphics plotting

    International Nuclear Information System (INIS)

    Travesi, A.; Romero, L.

    1983-01-01

    The ''Repol'' code performs the fitting of a set of experimental data with a polynomial of mth degree (max. 10), using the Least Squares criterion. Further, it presents the graphic plotting of the fitted polynomial, in the appropriate coordinate axes system, on a plotter. An additional option also allows the graphic plotting of the experimental data used for the fit. The data necessary to execute this code are requested from the operator on the screen, interactively through a screen-operator dialogue, and the values are introduced through the keyboard. This code is written in Fortran IV and, because it is structured in subroutine blocks, can be adapted to any computer with a graphic screen and keyboard terminal, with a serially connected plotter whose software includes the Hewlett-Packard ''Graphics 1000''. (author)
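    The core of such a code — an mth-degree least-squares fit plus a plot of the curve and the data — is a few lines in today's scientific Python. A sketch of the same functionality on synthetic data, not a port of the Fortran IV source:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.linspace(0, 5, 12)
    y = 1.0 + 2.5 * x - 0.4 * x**2 + np.random.default_rng(3).normal(0, 0.3, x.size)

    m = 2                                   # polynomial degree; REPOL allows up to 10
    coeffs = np.polyfit(x, y, m)            # Least Squares criterion
    fit = np.poly1d(coeffs)

    xs = np.linspace(x.min(), x.max(), 200)
    plt.plot(xs, fit(xs), label=f"degree-{m} fit")      # the fitted polynomial
    plt.plot(x, y, "o", label="experimental data")      # optional data overlay
    plt.xlabel("X"); plt.ylabel("Y"); plt.legend()
    plt.show()
    ```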

  14. Classification of hyperspectral imagery using MapReduce on a NVIDIA graphics processing unit (Conference Presentation)

    Science.gov (United States)

    Ramirez, Andres; Rahnemoonfar, Maryam

    2017-04-01

    A hyperspectral image provides a multidimensional picture rich in data, consisting of hundreds of spectral dimensions. Analyzing the spectral and spatial information of such an image with linear and non-linear algorithms results in long computation times. In order to overcome this problem, this research presents a system using a MapReduce-Graphics Processing Unit (GPU) model that can help analyze a hyperspectral image through the usage of parallel hardware and a parallel programming model, which is simpler to handle compared to other low-level parallel programming models. Additionally, Hadoop was used as an open-source implementation of the MapReduce parallel programming model. This research compared classification accuracy and timing results between the Hadoop and GPU systems, tested against the following cases: a combined CPU and GPU case, a CPU-only case, and a case where no dimensionality reduction was applied.
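    The MapReduce model underlying such a system reduces to two user-supplied functions: map emits key-value pairs per record, and reduce combines the values for each key. A pure-Python toy showing that shape for per-class pixel counts over a classified hyperspectral scene — the labels and spectra are invented, and a real job would run under Hadoop rather than in-process:

    ```python
    from collections import defaultdict
    from itertools import chain

    pixels = [("water", [0.1, 0.2]), ("crop", [0.7, 0.6]),
              ("crop", [0.8, 0.5]), ("water", [0.2, 0.1])]   # (predicted label, spectrum)

    def map_fn(record):
        label, _spectrum = record
        yield (label, 1)                      # emit one count per classified pixel

    def reduce_fn(key, values):
        return (key, sum(values))             # total pixels per class

    # The framework's shuffle phase, in miniature: group map output by key.
    groups = defaultdict(list)
    for k, v in chain.from_iterable(map_fn(p) for p in pixels):
        groups[k].append(v)

    print([reduce_fn(k, vs) for k, vs in groups.items()])   # [('water', 2), ('crop', 2)]
    ```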

  15. SDAR: a practical tool for graphical analysis of two-dimensional data

    Directory of Open Access Journals (Sweden)

    Weeratunga Saroja

    2012-08-01

    Full Text Available Abstract Background Two-dimensional data needs to be processed and analysed in almost any experimental laboratory. Some tasks in this context may be performed with generic software such as spreadsheet programs, which are available ubiquitously; others may require more specialised software that requires paid licences. Additionally, more complex software packages typically require more time from the individual user to understand and operate. Practical and convenient graphical data analysis software in Java with a user-friendly interface is rare. Results We have developed SDAR, a Java application to analyse two-dimensional data with an intuitive graphical user interface. A smart ASCII parser allows import of data into SDAR without particular format requirements. The centrepiece of SDAR is the Java class GraphPanel, which provides methods for generic tasks of data visualisation. Data can be manipulated and analysed with respect to the most common operations experienced in an experimental biochemical laboratory. Images of the data plots can be generated in SVG, TIFF or PNG format. Data exported by SDAR is annotated with commands compatible with the Grace software. Conclusion Since SDAR is implemented in Java, it is truly cross-platform compatible. The software is easy to install and very convenient to use, judging by experience in our own laboratories. It is freely available to academic users at http://www.structuralchemistry.org/pcsb/. To download SDAR, users will be asked for their name, institution and email address. A manual, as well as the source code of the GraphPanel class, can also be downloaded from this site.

  16. An adaptive structure data acquisition system using a graphical-based programming language

    Science.gov (United States)

    Baroth, Edmund C.; Clark, Douglas J.; Losey, Robert W.

    1992-01-01

    An example of the implementation of data fusion using a PC and a graphical programming language is discussed. A schematic of the data acquisition system and the user interface panel for an adaptive structure test are presented. The computer programs (a series of icons 'wired' together) are also discussed. It is shown how using graphical programming software to control a data acquisition system can simplify the analysis of data, promote multidisciplinary interaction, and provide users a more visual key to understanding their data.

  17. CT applications of medical computer graphics

    International Nuclear Information System (INIS)

    Rhodes, M.L.

    1985-01-01

    Few applications of computer graphics show as much promise and early success as that for CT. Unlike electron microscopy, ultrasound, business, military, and animation applications, CT image data are inherently digital. CT pictures can be processed directly by programs well established in the fields of computer graphics and digital image processing. Methods for reformatting digital pictures, enhancing structure shape, reducing image noise, and rendering three-dimensional (3D) scenes of anatomic structures have all become routine at many CT centers. In this chapter, the authors provide a brief introduction to computer graphics terms and techniques commonly applied to CT pictures and, when appropriate, to those showing promise for magnetic resonance images. Topics discussed here are image-processing options that are applied to digital images already constructed. In the final portion of this chapter techniques for ''slicing'' CT image data are presented, and geometric principles that describe the specification of oblique and curved images are outlined. Clinical examples are included

  18. Data taking and processing system for nuclear experimental physics study

    International Nuclear Information System (INIS)

    Nagashima, Y.; Kimura, H.; Katori, K.; Kuriyama, K.

    1979-01-01

    A multi-input, multi-mode, multi-user data taking and processing system was developed. This system has the following special features. 1) It is a multi-computer system comprising two special processors and two minicomputers. 2) Pseudo-devices are introduced to make operating procedures simple and easy; in particular, the selection or modification of the 1-8 coincidence modes can be done very easily and quickly. 3) A 16K-channel spectrum storage has 8 partitions; every partition has a floating size and is handled automatically by the data taking software SHINE. 4) On-line real-time data processing can be done. Using the FORTRAN language, users may prepare the processing software separately from the data taking software; under the RSX-11D system software, it runs concurrently with the data taking software in multiprogramming mode. 5) Data communication between arbitrary external devices and this system is possible. With these communication procedures, not only data transfer between computers but also control of the experimental devices is realized. Like the real-time processing software, this software can be prepared by users and run concurrently with other software. 6) For data monitoring, two different graphic displays are used complementarily: a refresh-type high-speed display for raw data, and a storage-type large-screen display for processed data or multi-parametric large-volume data. (author)

  19. PyMS: a Python toolkit for processing of gas chromatography-mass spectrometry (GC-MS) data. Application and comparative study of selected tools

    Directory of Open Access Journals (Sweden)

    O'Callaghan Sean

    2012-05-01

    Full Text Available Abstract Background Gas chromatography–mass spectrometry (GC-MS) is a technique frequently used in targeted and non-targeted measurements of metabolites. Most existing software tools for processing of raw instrument GC-MS data tightly integrate data processing methods with a graphical user interface facilitating interactive data processing. While interactive processing remains critically important in GC-MS applications, high-throughput studies increasingly dictate the need for command line tools, suitable for scripting of high-throughput, customized processing pipelines. Results PyMS comprises a library of functions for processing of instrument GC-MS data developed in Python. PyMS currently provides a complete set of GC-MS processing functions, including reading of standard data formats (ANDI-MS/NetCDF and JCAMP-DX), noise smoothing, baseline correction, peak detection, peak deconvolution, peak integration, and peak alignment by dynamic programming. A novel common ion single quantitation algorithm allows automated, accurate quantitation of GC-MS electron impact (EI) fragmentation spectra when a large number of experiments are being analyzed. PyMS implements parallel processing for by-row and by-column data processing tasks based on the Message Passing Interface (MPI), allowing processing to scale on multiple CPUs in distributed computing environments. A set of specifically designed experiments was performed in-house and used to comparatively evaluate the performance of PyMS and three widely used software packages for GC-MS data processing (AMDIS, AnalyzerPro, and XCMS). Conclusions PyMS is a novel software package for the processing of raw GC-MS data, particularly suitable for scripting of customized processing pipelines and for data processing in batch mode. PyMS provides limited graphical capabilities and can be used both for routine data processing and interactive/exploratory data analysis. In real-life GC-MS data processing scenarios PyMS performs
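    The pipeline stages listed — noise smoothing, baseline correction, peak detection — translate directly into a few SciPy calls. A generic sketch of those steps on a synthetic chromatogram; this illustrates the processing concepts, not the PyMS API itself:

    ```python
    import numpy as np
    from scipy.signal import savgol_filter, find_peaks

    rng = np.random.default_rng(7)
    t = np.linspace(0, 60, 3000)                          # retention time, minutes
    tic = (np.exp(-((t - 20) ** 2) / 0.1)                 # two synthetic analyte peaks
           + 0.6 * np.exp(-((t - 41) ** 2) / 0.2)
           + 0.05 * t / 60                                # drifting baseline
           + rng.normal(0, 0.01, t.size))                 # detector noise

    smooth = savgol_filter(tic, window_length=21, polyorder=3)   # noise smoothing

    # Crude rolling-minimum baseline estimate, then correction
    win = 150
    baseline = np.array([smooth[max(0, i - win):i + win].min() for i in range(len(smooth))])
    corrected = smooth - baseline

    peaks, props = find_peaks(corrected, height=0.1, distance=50)  # peak detection
    print("peaks at", t[peaks], "min; heights", props["peak_heights"])
    ```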

  20. Code REPOL to fit experimental data with a polynomial, and its graphics plotting

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

    The REPOL code performs the fitting of a set of experimental data with a polynomial of mth degree (max. 10), using the Least Squares criterion. Further, it presents the graphic plotting of the fitted polynomial, in the appropriate coordinate axes system, on a plotter. An additional option also allows the graphic plotting of the experimental data used for the fit. The data necessary to execute this code are requested from the operator on the screen, interactively through a screen-operator dialogue, and the values are introduced through the keyboard. This code is written in Fortran IV and, because it is structured in subroutine blocks, can be adapted to any computer with a graphic screen and keyboard terminal, with a serially connected plotter whose software includes the Hewlett-Packard Graphics 1000. (Author) 5 refs

  1. Graphical passwords: a qualitative study of password patterns

    CSIR Research Space (South Africa)

    Vorster, J

    2015-03-01

    Full Text Available Graphical password schemas are becoming more main-stream. There are many different approaches to graphical passwords, each with its own drawbacks and advantages. There have been many studies to suggest that graphical passwords should be stronger...

  2. Interactive voxel graphics in virtual reality

    Science.gov (United States)

    Brody, Bill; Chappell, Glenn G.; Hartman, Chris

    2002-06-01

    Interactive voxel graphics in virtual reality poses significant research challenges in terms of interface, file I/O, and real-time algorithms. Voxel graphics is not so new, as it is the focus of a good deal of scientific visualization. Interactive voxel creation and manipulation is a more innovative concept. Scientists are understandably reluctant to manipulate data. They collect or model data. A scientific analogy to interactive graphics is the generation of initial conditions for some model. It is used as a method to test those models. We, however, are in the business of creating new data in the form of graphical imagery. In our endeavor, science is a tool and not an end. Nevertheless, there is a whole class of interactions and associated data generation scenarios that are natural to our way of working and that are also appropriate to scientific inquiry. Annotation by sketching or painting to point to and distinguish interesting and important information is very significant for science as well as art. Annotation in 3D is difficult without a good 3D interface. Interactive graphics in virtual reality is an appropriate approach to this problem.

  3. Mechanical properties of bovine cortical bone based on the automated ball indentation technique and graphics processing method.

    Science.gov (United States)

    Zhang, Airong; Zhang, Song; Bian, Cuirong

    2018-02-01

    Cortical bone provides the main form of support in humans and other vertebrates against various forces. Thus, capturing its mechanical properties is important. In this study, the mechanical properties of cortical bone were investigated by using automated ball indentation and graphics processing at both the macroscopic and microstructural levels under dry conditions. First, all polished samples were photographed under a metallographic microscope, and the area ratio of the circumferential lamellae and osteons was calculated through the graphics processing method. Second, fully computer-controlled automated ball indentation (ABI) tests were performed to explore the micro-mechanical properties of the cortical bone at room temperature and a constant indenter speed. The indentation defects were examined with a scanning electron microscope. Finally, the macroscopic mechanical properties of the cortical bone were estimated with the graphics processing method and the mixture rule. Combining ABI and graphics processing proved to be an effective tool for obtaining the mechanical properties of the cortical bone, and the indenter size had a significant effect on the measurement. The methods presented in this paper provide an innovative approach to acquiring the macroscopic mechanical properties of cortical bone in a nondestructive manner. Copyright © 2017 Elsevier Ltd. All rights reserved.
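    The paper's two-step estimate — area fractions of the microstructural phases from images, then a rule of mixtures for the macroscopic property — is compact arithmetic. A schematic NumPy version with an invented grey-level cutoff and illustrative moduli, purely to show the calculation, not the study's actual values:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    img = rng.integers(0, 256, (512, 512))        # stand-in for a micrograph

    # Graphics-processing step: threshold into two phases and take area fractions
    osteon_mask = img < 128                        # hypothetical grey-level cutoff
    f_osteon = osteon_mask.mean()                  # area fraction of osteons
    f_lamellae = 1.0 - f_osteon                    # circumferential lamellae

    # Micro-moduli as would come from indentation (GPa, illustrative numbers)
    E_osteon, E_lamellae = 18.0, 22.0

    # Rule of mixtures for the macroscopic modulus: E = f1*E1 + f2*E2
    E_macro = f_osteon * E_osteon + f_lamellae * E_lamellae
    print(f"area fractions: {f_osteon:.2f}/{f_lamellae:.2f}, E_macro = {E_macro:.1f} GPa")
    ```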

  4. Graphical Model Theory for Wireless Sensor Networks

    International Nuclear Information System (INIS)

    Davis, William B.

    2002-01-01

    Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; and pattern classification and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
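    The sensor fusion use case can be shown at miniature scale: with conditionally independent sensors, the fused posterior is a product of local likelihood messages into the hidden node — the same message arithmetic the junction tree algorithm organizes network-wide. A toy sketch with invented likelihood tables, not a full junction tree implementation:

    ```python
    import numpy as np

    # Hidden binary state x (e.g. "event present"), prior P(x)
    prior = np.array([0.8, 0.2])

    # Each sensor's likelihood table P(reading | x); rows index x, columns readings
    sensor_a = np.array([[0.9, 0.1],    # x=0: mostly reads 0
                         [0.2, 0.8]])   # x=1: mostly reads 1
    sensor_b = np.array([[0.7, 0.3],
                         [0.1, 0.9]])

    def fuse(prior, likelihoods_and_readings):
        """Multiply each sensor's local message P(reading|x) into the prior, renormalize."""
        post = prior.copy()
        for lik, reading in likelihoods_and_readings:
            post *= lik[:, reading]          # message from one sensor node
        return post / post.sum()

    # Both sensors fire: belief in x=1 rises sharply
    print(fuse(prior, [(sensor_a, 1), (sensor_b, 1)]))   # -> approximately [0.14, 0.86]
    ```

    In a network, the junction tree distributes these products so each node only exchanges messages with its neighbours instead of shipping raw data to a central processor.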

  5. Graphic Arts: Book Three. The Press and Related Processes.

    Science.gov (United States)

    Farajollahi, Karim; And Others

    The third of a three-volume set of instructional materials for a graphic arts course, this manual consists of nine instructional units dealing with presses and related processes. Covered in the units are basic press fundamentals, offset press systems, offset press operating procedures, offset inks and dampening chemistry, preventive maintenance…

  6. The Use of Computer Graphics in the Design Process.

    Science.gov (United States)

    Palazzi, Maria

    This master's thesis examines applications of computer technology to the field of industrial design and ways in which technology can transform the traditional process. Following a statement of the problem, the history and applications of the fields of computer graphics and industrial design are reviewed. The traditional industrial design process…

  7. Regressive research: The pitfalls of post hoc data selection in the study of unconscious mental processes.

    Science.gov (United States)

    Shanks, David R

    2017-06-01

    Many studies of unconscious processing involve comparing a performance measure (e.g., some assessment of perception or memory) with an awareness measure (such as a verbal report or a forced-choice response) taken either concurrently or separately. Unconscious processing is inferred when above-chance performance is combined with null awareness. Often, however, aggregate awareness is better than chance, and data analysis therefore employs a form of extreme group analysis focusing post hoc on participants, trials, or items where awareness is absent or at chance. The pitfalls of this analytic approach are described with particular reference to recent research on implicit learning and subliminal perception. Because of regression to the mean, the approach can mislead researchers into erroneous conclusions concerning unconscious influences on behavior. Recommendations are made about future use of post hoc selection in research on unconscious cognition.
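    The regression-to-the-mean artifact is easy to reproduce in simulation: give every participant the same true (conscious) sensitivity, measure awareness with noise, and post hoc select those whose measured awareness is at or below chance — their performance stays above chance, inviting a spurious "unconscious processing" conclusion. A sketch with arbitrary simulation parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000
    true_sensitivity = 0.6                 # everyone is (weakly) consciously aware

    # Two noisy measures of the SAME underlying sensitivity (chance = 0.5)
    awareness = true_sensitivity + rng.normal(0, 0.15, n)
    performance = true_sensitivity + rng.normal(0, 0.15, n)

    # Post hoc selection: keep only participants whose measured awareness <= chance
    null_aware = awareness <= 0.5
    print(f"selected: {null_aware.mean():.1%} of sample")
    print(f"their mean measured awareness: {awareness[null_aware].mean():.3f}")
    print(f"their mean performance:        {performance[null_aware].mean():.3f}")
    # Awareness looks null (it was selected on), performance stays near 0.6:
    # above-chance 'unconscious' performance produced purely by selection noise.
    ```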

  8. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    Science.gov (United States)

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline

  9. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    Science.gov (United States)

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEPs). With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline

  10. A Theoretical Analysis of Learning with Graphics--Implications for Computer Graphics Design.

    Science.gov (United States)

    ChanLin, Lih-Juan

    This paper reviews the literature pertinent to learning with graphics. The dual coding theory provides an explanation of how graphics are stored and processed in semantic memory. The level of processing theory suggests how graphics can be employed in learning to encourage deeper processing. In addition to dual coding theory and level of processing…

  11. GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS

    Directory of Open Access Journals (Sweden)

    Stanislav Vladimirovich Daletskiy

    2017-01-01

    Full Text Available Aircraft maintenance is realized as a rapid sequence of organizational and technical maintenance states, whose research and analysis are carried out by statistical methods. The maintenance process comprises aircraft technical states, connected with the objective patterns of change in the technical qualities of the aircraft as a maintenance object, and organizational states, which determine the subjective organization and planning of aircraft use. The objective maintenance process is realized in the Maintenance and Repair System, which does not include maintenance organization and planning and is a set of related elements: aircraft, Maintenance and Repair measures, executors and documentation that sets the rules of their interaction for maintaining aircraft reliability and readiness for flight. The organizational and technical states of the aircraft are considered; their characteristics and heuristic estimates of the connections in the knots and arcs of the graphs, and of the aircraft organizational states during regular maintenance and at technical state failure, are given. It is shown that in real conditions of aircraft maintenance, planned control of the aircraft technical state, and maintenance control through it, is defined only by the Maintenance and Repair conditions for a given Maintenance and Repair type and form structure, and correspondingly by the principles of assigning Maintenance and Repair work types for execution, according to the maintenance and reconstruction strategies of the aircraft and all its units. The realization of the planned Maintenance and Repair process determines one of the constant maintenance components. The proposed graphical models make it possible to reveal quantitative correlations between graph knots so as to improve maintenance processes by statistical research methods, which reduces manning, timetables and expenses while providing safe civil aviation aircraft maintenance.

  12. The graphics system and the data saving for the SAPHIR experiment

    International Nuclear Information System (INIS)

    Albold, D.

    1990-08-01

    Important extensions have been made to the data acquisition system SOS for the SAPHIR experiment at the Bonn ELSA facility. As support for the various online programs controlling components of the detector, a graphics system for presenting data was developed, which enables any program in the system to use all graphic devices. Its main component is a program serving requests for presentation on a 19-inch colour monitor. Window techniques allow the presentation of several graphics on one screen. Equipped with a trackball and using menus, this is an easy-to-use and powerful tool for controlling the experiment. Other important extensions concern data storage. A huge amount of event data can be stored on 8 mm cassettes by the program Eventsaver; this program can be controlled by a component of the SAPHIR-Online SOL running on a VAX computer and using windows and menus. The smaller amount of data containing parameters and programs, which should be accessible within a short period of time, can be stored on a magnetic disk. A program supporting a file structure for access to this disk is described. (orig./HSI) [de

  13. Integrating post-Newtonian equations on graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, Frank; Tiglio, Manuel [Department of Physics, Center for Fundamental Physics, and Center for Scientific Computation and Mathematical Modeling, University of Maryland, College Park, MD 20742 (United States); Silberholz, John [Center for Scientific Computation and Mathematical Modeling, University of Maryland, College Park, MD 20742 (United States); Bellone, Matias [Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Cordoba 5000 (Argentina); Guerberoff, Gustavo, E-mail: tiglio@umd.ed [Facultad de Ingenieria, Instituto de Matematica y Estadistica 'Prof. Ing. Rafael Laguardia', Universidad de la Republica, Montevideo (Uruguay)

    2010-02-07

    We report on early results of a numerical and statistical study of binary black hole inspirals. The two black holes are evolved using post-Newtonian approximations starting with initially randomly distributed spin vectors. We characterize certain aspects of the distribution shortly before merger. In particular we note the uniform distribution of black hole spin vector dot products shortly before merger and a high correlation between the initial and final black hole spin vector dot products in the equal-mass, maximally spinning case. More than 300 million simulations were performed on graphics processing units, and we demonstrate a speed-up of a factor 50 over a more conventional CPU implementation. (fast track communication)

  14. A standardised graphic method for describing data privacy frameworks in primary care research using a flexible zone model.

    NARCIS (Netherlands)

    Kuchinke, W.; Ohmann, C.; Verheij, R.A.; Veen, E.B. van; Arvanitis, T.N.; Taweel, A.; Delaney, B.C.

    2014-01-01

    Purpose: To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data

  15. On line and on paper: Visual representations, visual culture, and computer graphics in design engineering

    Energy Technology Data Exchange (ETDEWEB)

    Henderson, K.

    1991-01-01

    The research presented examines the visual communication practices of engineers and the impact of the implementation of computer graphics on their visual culture. The study is based on participant observation of day-to-day practices in two contemporary industrial settings among engineers engaged in the actual process of designing new pieces of technology. In addition, over thirty interviews were conducted at other industrial sites to confirm that the findings were not an isolated phenomenon. The data show that there is no 'one best way' to use a computer graphics system, but rather that use is site specific and firms and individuals engage in mixed paper and electronic practices as well as differential use of electronic options to get the job done. This research illustrates that rigid models which assume a linear theory of innovation, projecting a straightforward process from idea, to drawing, to prototype, to production, are seriously misguided.

  16. PC Graphic file programming

    International Nuclear Information System (INIS)

    Yang, Jin Seok

    1993-04-01

    This book provides a description of basic graphics knowledge and of understanding and implementing graphic file formats. The first part deals with graphic data, the storage and compression of graphic data, and programming topics such as assembly language, the stack, compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as the MacPaint file, GEM/IMG file, PCX file, GIF file, and TIFF file, hardware considerations like monochrome and high-speed colour screen drivers, the basic concept of dithering, and conversion between formats.

  17. APEX_SCOPE: A graphical user interface for visualization of multi-modal data in inter-disciplinary studies.

    Science.gov (United States)

    Kanbar, Lara J; Shalish, Wissam; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E

    2017-07-01

    In multi-disciplinary studies, different forms of data are often collected for analysis. For example, APEX, a study on the automated prediction of extubation readiness in extremely preterm infants, collects clinical parameters and cardiorespiratory signals. A variety of cardiorespiratory metrics are computed from these signals and used to assign a cardiorespiratory pattern at each time point. In such a situation, exploratory analysis requires a visualization tool capable of displaying these different types of acquired and computed signals in an integrated environment. Thus, we developed APEX_SCOPE, a graphical tool for the visualization of multi-modal data comprising cardiorespiratory signals, automated cardiorespiratory metrics, automated respiratory patterns, manually classified respiratory patterns, and manual annotations by clinicians during data acquisition. This MATLAB-based application provides a means for collaborators to view combinations of signals to promote discussion, generate hypotheses and develop features.

  18. Three-dimensional range data compression using computer graphics rendering pipeline.

    Science.gov (United States)

    Zhang, Song

    2012-06-20

    This paper presents the idea of naturally encoding three-dimensional (3D) range data into regular two-dimensional (2D) images utilizing the computer graphics rendering pipeline. The computer graphics pipeline provides a means to sample 3D geometry data into regular 2D images, and also to retrieve the depth information for each sampled pixel. The depth information for each pixel is then encoded into the red, green, and blue color channels of regular 2D images, which can be further compressed with existing 2D image compression techniques. By this means, 3D geometry data obtained by 3D range scanners can be instantaneously compressed into 2D images, providing a new way of storing 3D range data in their 2D counterparts. We present experimental results to verify the performance of the proposed technique.
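    The central idea — quantize each sampled depth into 24 bits and spread it across the red, green and blue channels of an ordinary image — can be sketched without the rendering pipeline. A NumPy round trip on random depths, offered as an illustration of the encoding idea rather than the paper's exact scheme:

    ```python
    import numpy as np

    depth = np.random.default_rng(9).uniform(0.5, 2.0, (480, 640))   # metres, demo data

    # Quantize depth into 24 bits over a known working range
    zmin, zmax = depth.min(), depth.max()
    q = np.round((depth - zmin) / (zmax - zmin) * (2**24 - 1)).astype(np.uint32)

    rgb = np.empty((*depth.shape, 3), dtype=np.uint8)
    rgb[..., 0] = (q >> 16) & 0xFF     # red   = high byte
    rgb[..., 1] = (q >> 8) & 0xFF      # green = middle byte
    rgb[..., 2] = q & 0xFF             # blue  = low byte
    # 'rgb' is now a regular 2D image, storable with lossless 2D compression (e.g. PNG)

    # Decode and check the round trip
    q2 = ((rgb[..., 0].astype(np.uint32) << 16)
          | (rgb[..., 1].astype(np.uint32) << 8)
          | rgb[..., 2])
    depth2 = q2 / (2**24 - 1) * (zmax - zmin) + zmin
    print("max error (m):", np.abs(depth2 - depth).max())   # ~ one quantization step
    ```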

  19. Inventory of data bases, graphics packages, and models in Department of Energy laboratories

    International Nuclear Information System (INIS)

    Shriner, C.R.; Peck, L.J.

    1978-11-01

    A central inventory of energy-related environmental bibliographic and numeric data bases, graphics packages, integrated hardware/software systems, and models was established at Oak Ridge National Laboratory in an effort to make these resources at Department of Energy (DOE) laboratories better known and available to researchers and managers. This inventory will also serve to identify and avoid duplication among laboratories. The data were collected at each DOE laboratory, then sent to ORNL and merged into a single file. This document contains the data from the merged file. The data descriptions are organized under major data types: data bases, graphics packages, integrated hardware/software systems, and models. The data include descriptions of subject content, documentation, and contact persons. Also provided are computer data such as media on which the item is available, size of the item, computer on which the item executes, minimum hardware configuration necessary to execute the item, software language(s) and/or data base management system utilized, and character set used. For the models, additional data are provided to define the model more accurately. These data include a general statement of algorithms, computational methods, and theories used by the model; organizations currently using the model; the general application area of the model; sources of data utilized by the model; model validation methods, sensitivity analysis, and procedures; and general model classification. Data in this inventory will be available for on-line data retrieval on the DOE/RECON system

  20. A Student-Friendly Graphical User Interface to Extract Data from Remote Sensing Level-2 Products.

    Science.gov (United States)

    Bernardello, R.

    2016-02-01

    The remote sensing era has provided an unprecedented amount of publicly available data. The United States National Aeronautics and Space Administration Goddard Space Flight Center (NASA-GSFC) has achieved remarkable results in the distribution of these data to the scientific community through the OceanColor web page (http://oceancolor.gsfc.nasa.gov/). However, access to these data is not straightforward and requires a certain investment of time in learning the use of existing software. Satellite sensors acquire raw data that are processed through several steps towards a format usable by the scientific community. These products are distributed in Hierarchical Data Format (HDF), which often represents the first obstacle for students, teachers and scientists not used to dealing with extensive matrices. We present here SATellite data PROcessing (SATPRO), a newly developed Graphical User Interface (GUI) designed in the MATLAB environment to provide an easy, immediate yet reliable way to select and extract Level-2 data from the NASA SeaWiFS and MODIS-Aqua databases for oceanic surface temperature and chlorophyll. Since no previous experience with MATLAB is required, SATPRO allows the user to explore the available dataset without investing any software-learning time. SATPRO is an ideal tool to introduce undergraduate students to the use of remote sensing data in oceanography and can also be useful for research projects at the graduate level.

  1. Using global positioning systems in health research: a practical approach to data collection and processing.

    Science.gov (United States)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-11-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. The standardization of data collection and processing procedures will improve data quality, allow more-meaningful comparisons across studies and populations, and advance this field more rapidly. This paper aims to take researchers who are considering using GPS devices in their research through device-selection criteria, device settings, participant data collection, data cleaning, data processing, and integration of data into GIS. Recommendations are outlined for each stage of data collection and analysis, and challenges that should be considered are indicated. This paper highlights the benefits of collecting GPS data over traditional self-report or estimated exposure measures. Information presented here will allow researchers to make an informed decision about incorporating this readily available technology into their studies. This work reflects the state of the art in 2011. Copyright © 2011 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
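    One cleaning step such guidelines typically cover is removing implausible fixes, e.g. points implying impossible travel speeds. A small sketch using the haversine formula; the track and the speed threshold are invented for illustration, not recommendations from the paper:

    ```python
    import numpy as np

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two GPS fixes."""
        r = 6_371_000.0
        p1, p2 = np.radians(lat1), np.radians(lat2)
        a = (np.sin((p2 - p1) / 2) ** 2
             + np.cos(p1) * np.cos(p2) * np.sin(np.radians(lon2 - lon1) / 2) ** 2)
        return 2 * r * np.arcsin(np.sqrt(a))

    # (timestamp_s, lat, lon): the third fix is a spurious jump
    track = [(0.0, 55.6761, 12.5683),
             (30.0, 55.6765, 12.5690),
             (60.0, 55.8000, 12.9000),    # ~25 km away: implausible in 30 s
             (90.0, 55.6770, 12.5697)]

    kept = [track[0]]
    for t, lat, lon in track[1:]:
        t0, lat0, lon0 = kept[-1]
        speed = haversine_m(lat0, lon0, lat, lon) / (t - t0)   # m/s vs last good fix
        if speed < 40.0:                  # arbitrary cutoff (~144 km/h)
            kept.append((t, lat, lon))
    print(kept)                           # the spurious fix is gone
    ```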

  2. The Case for Graphic Novels

    Directory of Open Access Journals (Sweden)

    Steven Hoover

    2012-04-01

    Full Text Available Many libraries and librarians have embraced graphic novels. A number of books, articles, and presentations have focused on the history of the medium and offered advice on building and maintaining collections, but very little attention has been given to the question of how to integrate graphic novels into a library’s instructional efforts. This paper will explore the characteristics of graphic novels that make them a valuable resource for librarians who focus on research and information literacy instruction, identify skills and competencies that can be taught by the study of graphic novels, and provide specific examples of how to incorporate graphic novels into instruction.

  3. A study of perceptions of graphical passwords

    CSIR Research Space (South Africa)

    Vorster, JS

    2015-10-01

    Full Text Available Depending on the graphical password schema, the key-space can be even bigger than alpha-numeric passwords. However, in conventional passwords, users will re-use letters within a password. This study investigates graphical passwords for symbol...

  4. ESO Reflex: a graphical workflow engine for data reduction

    Science.gov (United States)

    Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo

    ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access Virtual Observatory web services have been successfully performed. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.

  5. Granatum: a graphical single-cell RNA-Seq analysis pipeline for genomics scientists.

    Science.gov (United States)

    Zhu, Xun; Wolfgruber, Thomas K; Tasato, Austin; Arisdakessian, Cédric; Garmire, David G; Garmire, Lana X

    2017-12-05

    Single-cell RNA sequencing (scRNA-Seq) is an increasingly popular platform to study heterogeneity at the single-cell level. Computational methods to process scRNA-Seq data are not very accessible to bench scientists as they require a significant amount of bioinformatic skills. We have developed Granatum, a web-based scRNA-Seq analysis pipeline to make analysis more broadly accessible to researchers. Without a single line of programming code, users can click through the pipeline, setting parameters and visualizing results via the interactive graphical interface. Granatum conveniently walks users through various steps of scRNA-Seq analysis. It has a comprehensive list of modules, including plate merging and batch-effect removal, outlier-sample removal, gene-expression normalization, imputation, gene filtering, cell clustering, differential gene expression analysis, pathway/ontology enrichment analysis, protein network interaction visualization, and pseudo-time cell series construction. Granatum enables broad adoption of scRNA-Seq technology by empowering bench scientists with an easy-to-use graphical interface for scRNA-Seq data analysis. The package is freely available for research use at http://garmiregroup.org/granatum/app.

  6. [Influence of the recording interval and a graphic organizer on the writing process/product and on other psychological variables].

    Science.gov (United States)

    García Sánchez, Jesús N; Rodríguez Pérez, Celestino

    2007-05-01

    An experimental study of the influence of the recording interval and a graphic organizer on the writing composition process and on the final product is presented. We studied 326 participants, aged 10 to 16 years, by means of a nested design. Two groups were compared: one group was aided in the writing process with a graphic organizer and the other was not. Each group was subdivided into two further groups: one with a mean recording interval of 45 seconds in a writing log and the other with approximately 90 seconds. The results showed that the group aided by a graphic organizer obtained better results in both the processes and the writing product, and that the groups assessed with an average interval of 45 seconds obtained worse results. Implications for educational practice are discussed, and limitations and future perspectives are commented on.

  7. A Preliminary Study of a Spanish Graphic Novella Targeting Hearing Loss Prevention.

    Science.gov (United States)

    Guiberson, Mark; Wakefield, Emily

    2017-09-18

    This preliminary study developed a digital graphic novella targeting hearing protection beliefs of Spanish-speaking agricultural workers. Researchers used pretest-posttest interview surveys to establish if the novella had an immediate influence on the participants' beliefs about noise-induced hearing loss and usage of hearing protection devices. Researchers developed a digital graphic novella directed to increase knowledge about noise-induced hearing loss and increase the proper use of hearing protection devices. The novella was tailored to meet the specific linguistic and literacy needs of Spanish-speaking agricultural workers. Thirty-one Spanish-speaking farmworkers of Mexican nationality participated. This study included an interview survey with specific questions on noise-induced hearing loss, myths, and hearing protection device usage. A pretest-posttest design was applied to measure the graphic novella's immediate influence on workers. The posttest scores on Hearing Protection Beliefs statements were significantly better than pretest scores, with a large effect size observed. Digital media may be an effective way to overcome language and literacy barriers with Spanish-speaking workers when providing health education and prevention efforts.

  8. Graphics and visualization principles & algorithms

    CERN Document Server

    Theoharis, T; Platis, Nikolaos; Patrikalakis, Nicholas M

    2008-01-01

    Computer and engineering collections strong in applied graphics and analysis of visual data via computer will find Graphics & Visualization: Principles and Algorithms makes an excellent classroom text as well as supplemental reading. It integrates coverage of computer graphics and other visualization topics, from shadow generation and particle tracing to spatial subdivision and vector data visualization, and it provides a thorough review of literature from multiple experts, making for a comprehensive review essential to any advanced computer study. -California Bookw

  9. Accelerating Molecular Dynamic Simulation on Graphics Processing Units

    Science.gov (United States)

    Friedrichs, Mark S.; Eastman, Peter; Vaidyanathan, Vishal; Houston, Mike; Legrand, Scott; Beberg, Adam L.; Ensign, Daniel L.; Bruns, Christopher M.; Pande, Vijay S.

    2009-01-01

    We describe a complete implementation of all-atom protein molecular dynamics running entirely on a graphics processing unit (GPU), including all standard force field terms, integration, constraints, and implicit solvent. We discuss the design of our algorithms and important optimizations needed to fully take advantage of a GPU. We evaluate its performance, and show that it can be more than 700 times faster than a conventional implementation running on a single CPU core. PMID:19191337
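
    The force evaluation and integration steps that this work moves onto the GPU are, in serial form, short enough to sketch. The following CPU-side numpy illustration of velocity Verlet integration with Lennard-Jones pair forces (unit masses, reduced units, and the lattice setup are assumptions for brevity, not the paper's force field) shows the per-atom arithmetic that a GPU implementation executes in parallel:

        import numpy as np

        def lj_forces(pos, eps=1.0, sigma=1.0):
            # Pairwise Lennard-Jones forces via broadcasting: F_i = sum_j f(r_ij) d_ij
            d = pos[:, None, :] - pos[None, :, :]          # displacement vectors
            r2 = (d ** 2).sum(axis=-1)
            np.fill_diagonal(r2, np.inf)                   # no self-interaction
            inv6 = (sigma ** 2 / r2) ** 3
            fmag = 24.0 * eps * (2.0 * inv6 ** 2 - inv6) / r2
            return (fmag[:, :, None] * d).sum(axis=1)

        def velocity_verlet(pos, vel, dt=1e-3, n_steps=100):
            # Standard velocity Verlet loop, assuming unit atomic masses
            f = lj_forces(pos)
            for _ in range(n_steps):
                vel += 0.5 * dt * f
                pos += dt * vel
                f = lj_forces(pos)
                vel += 0.5 * dt * f
            return pos, vel

        # 64 atoms on a cubic lattice (spacing 1.5 sigma, away from overlap)
        pos = (np.mgrid[0:4, 0:4, 0:4].reshape(3, -1).T * 1.5).astype(float)
        vel = np.zeros_like(pos)
        pos, vel = velocity_verlet(pos, vel)

    The O(N^2) pair loop hidden in the broadcasting is exactly the work that maps well onto thousands of GPU threads, which is where the reported speed-ups originate.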

  10. Processing sequence annotation data using the Lua programming language.

    Science.gov (United States)

    Ueno, Yutaka; Arita, Masanori; Kumagai, Toshitaka; Asai, Kiyoshi

    2003-01-01

    The data processing language in a graphical software tool that manages sequence annotation data from genome databases should provide flexible functions for the tasks in molecular biology research. Among currently available languages we adopted the Lua programming language. It fulfills our requirements to perform computational tasks for sequence map layouts, i.e. the handling of data containers, symbolic reference to data, and a simple programming syntax. Upon importing a foreign file, the original data are first decomposed in the Lua language while maintaining the original data schema. The converted data are parsed by the Lua interpreter and the contents are stored in our data warehouse. Then, portions of annotations are selected and arranged into our catalog format to be depicted on the sequence map. Our sequence visualization program was successfully implemented, embedding the Lua language for processing of annotation data and layout script. The program is available at http://staff.aist.go.jp/yutaka.ueno/guppy/.

  11. Energy Level Composite Curves-a new graphical methodology for the integration of energy intensive processes

    International Nuclear Information System (INIS)

    Anantharaman, Rahul; Abbas, Own Syed; Gundersen, Truls

    2006-01-01

    Pinch Analysis, Exergy Analysis and Optimization have all been used independently or in combination for the energy integration of process plants. In order to address the issue of energy integration, taking into account composition and pressure effects, the concept of energy level as proposed by [X. Feng, X.X. Zhu, Combining pinch and exergy analysis for process modifications, Appl. Therm. Eng. 17 (1997) 249] has been modified and expanded in this work. We have developed a strategy for energy integration that uses process simulation tools to define the interaction between the various subsystems in the plant and a graphical technique to help the engineer interpret the results of the simulation with physical insights that point towards exploring possible integration schemes to increase energy efficiency. The proposed graphical representation of energy levels of processes is very similar to the Composite Curves of Pinch Analysis-the interpretation of the Energy Level Composite Curves reduces to the Pinch Analysis case when dealing with heat transfer. Other similarities and differences are detailed in this work. Energy integration of a methanol plant is taken as a case study to test the efficacy of this methodology. Potential integration schemes are identified that would have been difficult to visualize without the help of the new graphical representation
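
    The Energy Level Composite Curves generalize the classic Composite Curves of Pinch Analysis, which are themselves easy to construct: sort the stream temperature boundaries, then accumulate enthalpy interval by interval. A brief sketch of ordinary hot/cold composite curves follows (the stream table is invented for illustration and no ΔT_min shift is applied):

        import numpy as np
        import matplotlib.pyplot as plt

        # Hypothetical stream table: (T_supply, T_target, CP [kW/K])
        hot_streams = [(250, 40, 0.15), (200, 80, 0.25)]
        cold_streams = [(20, 180, 0.20), (140, 230, 0.30)]

        def composite(streams):
            # Cumulative enthalpy over each temperature interval spanned by streams
            temps = sorted({t for s in streams for t in s[:2]})
            h = [0.0]
            for t_lo, t_hi in zip(temps, temps[1:]):
                cp_sum = sum(cp for a, b, cp in streams
                             if min(a, b) <= t_lo and max(a, b) >= t_hi)
                h.append(h[-1] + cp_sum * (t_hi - t_lo))
            return np.array(h), np.array(temps)

        h_hot, t_hot = composite(hot_streams)
        h_cold, t_cold = composite(cold_streams)
        plt.plot(h_hot, t_hot, "r", label="hot composite")
        plt.plot(h_cold, t_cold, "b", label="cold composite")
        plt.xlabel("Enthalpy [kW]"); plt.ylabel("Temperature [°C]")
        plt.legend(); plt.show()

    The paper's contribution is to replace the temperature axis with a more general energy level, so that composition and pressure effects enter the same graphical reasoning.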

  12. Fast DRR splat rendering using common consumer graphics hardware

    International Nuclear Information System (INIS)

    Spoerk, Jakob; Bergmann, Helmar; Wanschitz, Felix; Dong, Shuo; Birkfellner, Wolfgang

    2007-01-01

    Digitally rendered radiographs (DRR) are a vital part of various medical image processing applications such as 2D/3D registration for patient pose determination in image-guided radiotherapy procedures. This paper presents a technique to accelerate DRR creation by using conventional graphics hardware for the rendering process. DRR computation itself is done by an efficient volume rendering method named wobbled splatting. For programming the graphics hardware, NVIDIA's C for Graphics (Cg) is used. The description of an algorithm used for rendering DRRs on the graphics hardware is presented, together with a benchmark comparing this technique to a CPU-based wobbled splatting program. Results show a reduction of rendering time by about 70%-90% depending on the amount of data. For instance, rendering a volume of 2×10^6 voxels is feasible at an update rate of 38 Hz compared to 6 Hz on a common Intel-based PC using the graphics processing unit (GPU) of a conventional graphics adapter. In addition, wobbled splatting using graphics hardware for DRR computation provides higher resolution DRRs with comparable image quality due to special processing characteristics of the GPU. We conclude that DRR generation on common graphics hardware using the freely available Cg environment is a major step toward 2D/3D registration in clinical routine

  13. VAX Professional Workstation goes graphic

    International Nuclear Information System (INIS)

    Downward, J.G.

    1984-01-01

    The VAX Professional Workstation (VPW) is a collection of programs and procedures designed to provide an integrated work-station environment for the staff at KMS Fusion's research laboratories. During the past year numerous capabilities have been added to VPW, including support for VT125/VT240/4014 graphic workstations, editing windows, and additional desk utilities. Graphics workstation support allows users to create, edit, and modify graph data files, enter the data via a graphic tablet, create simple plots with DATATRIEVE or DECgraph on ReGIS terminals, or elaborate plots with TEKGRAPH on ReGIS or Tektronix terminals. Users may assign display error bars to the data and interactively plot it in a variety of ways. Users also can create and display viewgraphs. Hard copy output for a large network of office terminals is obtained by multiplexing each terminal's video output into a recently developed video multiplexer front ending a single channel video hard copy unit

  14. Graphics gems II

    CERN Document Server

    Arvo, James

    1991-01-01

    Graphics Gems II is a collection of articles shared by a diverse group of people that reflect ideas and approaches in graphics programming which can benefit other computer graphics programmers. This volume presents techniques for doing well-known graphics operations faster or easier. The book contains chapters devoted to topics on two-dimensional and three-dimensional geometry and algorithms, image processing, frame buffer techniques, and ray tracing techniques. The radiosity approach, matrix techniques, and numerical and programming techniques are likewise discussed. Graphics artists and comput

  15. Acceleration of Meshfree Radial Point Interpolation Method on Graphics Hardware

    International Nuclear Information System (INIS)

    Nakata, Susumu

    2008-01-01

    This article describes a parallel computational technique to accelerate the radial point interpolation method (RPIM), a meshfree method, using graphics hardware. RPIM is one of the meshfree partial differential equation solvers that do not require a mesh structure for the analysis targets. In this paper, a technique for accelerating RPIM using graphics hardware is presented. In the method, the computation process is divided into small processes suitable for processing on the parallel architecture of the graphics hardware in a single instruction, multiple data manner.

  16. Real-time colouring and filtering with graphics shaders

    Science.gov (United States)

    Vohl, D.; Fluke, C. J.; Barnes, D. G.; Hassan, A. H.

    2017-11-01

    Despite the popularity of the Graphics Processing Unit (GPU) for general purpose computing, one should not forget about the practicality of the GPU for fast scientific visualization. As astronomers have increasing access to three-dimensional (3D) data from instruments and facilities like integral field units and radio interferometers, visualization techniques such as volume rendering offer means to quickly explore spectral cubes as a whole. As most 3D visualization techniques have been developed in fields of research like medical imaging and fluid dynamics, many transfer functions are not optimal for astronomical data. We demonstrate how transfer functions and graphics shaders can be exploited to provide new astronomy-specific explorative colouring methods. We present 12 shaders, including four novel transfer functions specifically designed to produce intuitive and informative 3D visualizations of spectral cube data. We compare their utility to classic colour mapping. The remaining shaders highlight how common computation like filtering, smoothing and line ratio algorithms can be integrated as part of the graphics pipeline. We discuss how this can be achieved by utilizing the parallelism of modern GPUs along with a shading language, letting astronomers apply these new techniques at interactive frame rates. All shaders investigated in this work are included in the open source software shwirl (Vohl 2017).
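
    At its core, a transfer function maps each voxel intensity to colour and opacity before compositing. A toy CPU-side sketch with numpy and matplotlib (synthetic data standing in for a spectral-cube slice; the real shaders in shwirl run this per fragment on the GPU) conveys the idea:

        import numpy as np
        import matplotlib.pyplot as plt

        # Synthetic intensity slice; real data would come from a spectral cube
        img = np.random.default_rng(0).gamma(2.0, size=(256, 256))

        # A simple transfer function: normalize, map to RGBA, tie opacity to intensity
        norm = (img - img.min()) / np.ptp(img)
        rgba = plt.cm.viridis(norm)
        rgba[..., 3] = norm        # faint voxels become transparent, as in volume rendering
        plt.imshow(rgba); plt.axis("off"); plt.show()

    Swapping the colour map or the opacity rule is exactly the kind of change the paper's astronomy-specific transfer functions make, tuned to the noise and dynamic range of spectral cubes.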

  17. Data processing system for spectroscopy at Novillo Tokamak; Sistema de procesamiento de datos para espectroscopia en el Tokamak Novillo

    Energy Technology Data Exchange (ETDEWEB)

    Ortega C, G.; Gaytan G, E. [Instituto Tecnologico de Toluca, Instituto nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    1998-07-01

    Based on methodologies proposed by software engineering, a data processing system was designed and developed for the data coming from the spectroscopy diagnostic equipment, for the study of plasma impurities during the cleaning discharges. The data acquisition is carried out through an electronic interface which connects the computer with the spectroscopy system of the Novillo Tokamak. The data were obtained from text files and processed for their subsequent graphic presentation. The LabVIEW graphical programming language was used to develop this system, named PRODATN (Processing of Data for Spectroscopy in Novillo Tokamak). (Author)

  18. Graphical models for inference under outcome-dependent sampling

    DEFF Research Database (Denmark)

    Didelez, V; Kreiner, S; Keiding, N

    2010-01-01

    We consider situations where data have been collected such that the sampling depends on the outcome of interest and possibly further covariates, as for instance in case-control studies. Graphical models represent assumptions about the conditional independencies among the variables. By including a node for the sampling indicator, assumptions about sampling processes can be made explicit. We demonstrate how to read off such graphs whether consistent estimation of the association between exposure and outcome is possible. Moreover, we give sufficient graphical conditions for testing and estimating...

  19. Communicating with scientific graphics: A descriptive inquiry into non-ideal normativity.

    Science.gov (United States)

    Sheredos, Benjamin

    2017-06-01

    Scientists' graphical practices have recently become a target of inquiry in the philosophy of science, and in the cognitive sciences. Here I supplement our understanding of graphical practices via a case study of how researchers crafted the graphics for scientific publication in the field of circadian biology. The case highlights social aspects of graphical production which have gone understudied - especially concerning the negotiation of publication. I argue that it also supports a challenge to the claim that empirically-informed "cognitive design principles" offer an apt understanding of the norms of success which govern good scientific graphic design to communicate data and hypotheses to other experts. In this respect, the case-study also illustrates how "descriptive" studies of scientific practice can connect with normative issues in philosophy of science, thereby addressing a central concern in recent discussions of practice-oriented philosophy of science. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. ESO Reflex: A Graphical Workflow Engine for Data Reduction

    Science.gov (United States)

    Hook, R.; Romaniello, M.; Péron, M.; Ballester, P.; Gabasch, A.; Izzo, C.; Ullgrén, M.; Maisala, S.; Oittinen, T.; Solin, O.; Savolainen, V.; Järveläinen, P.; Tyynelä, J.

    2008-08-01

    Sampo {http://www.eso.org/sampo} (Hook et al. 2005) is a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal is to assess the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Those prototypes will not only be used to validate concepts and understand requirements but will also be tools of immediate value for the community. Most of the raw data produced by ESO instruments can be reduced using CPL {http://www.eso.org/cpl} recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library. Currently reduction recipes are run in batch mode as part of the data flow system to generate the input to the ESO VLT/VLTI quality control process and are also made public for external users. Sampo has developed a prototype application called ESO Reflex {http://www.eso.org/sampo/reflex/} that integrates a graphical user interface and existing data reduction algorithms. ESO Reflex can invoke CPL-based recipes in a flexible way through a dedicated interface. ESO Reflex is based on the graphical workflow engine Taverna {http://taverna.sourceforge.net} that was originally developed by the UK eScience community, mostly for work in the life sciences. Workflows have been created so far for three VLT/VLTI instrument modes ( VIMOS/IFU {http://www.eso.org/instruments/vimos/}, FORS spectroscopy {http://www.eso.org/instruments/fors/} and AMBER {http://www.eso.org/instruments/amber/}), and the easy-to-use GUI allows the user to make changes to these or create workflows of their own. Python scripts and IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available.

  1. Graphics processing unit based computation for NDE applications

    Science.gov (United States)

    Nahas, C. A.; Rajagopal, Prabhu; Balasubramaniam, Krishnan; Krishnamurthy, C. V.

    2012-05-01

    Advances in parallel processing in recent years are helping to improve the cost of numerical simulation. Breakthroughs in Graphical Processing Unit (GPU) based computation now offer the prospect of further drastic improvements. The introduction of the 'compute unified device architecture' (CUDA) by NVIDIA (the global technology company based in Santa Clara, California, USA) has made programming GPUs for general purpose computing accessible to the average programmer. Here we use CUDA to develop parallel finite difference schemes as applicable to two problems of interest to the NDE community, namely heat diffusion and elastic wave propagation. The implementations are two-dimensional. The performance improvement of the GPU implementation against a serial CPU implementation is then discussed.
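
    The heat-diffusion case reduces to a five-point stencil applied at every grid node each time step, which is what makes it so amenable to one-thread-per-node CUDA kernels. A compact CPU-side reference of the explicit scheme in numpy (periodic boundaries via np.roll are an assumption for brevity, not necessarily the paper's boundary treatment):

        import numpy as np

        def heat_step(u, alpha, dx, dt):
            """One explicit finite-difference step of the 2D heat equation.
            Stability requires dt <= dx**2 / (4 * alpha)."""
            lap = (np.roll(u, 1, axis=0) + np.roll(u, -1, axis=0) +
                   np.roll(u, 1, axis=1) + np.roll(u, -1, axis=1) - 4.0 * u) / dx**2
            return u + alpha * dt * lap

        u = np.zeros((512, 512)); u[256, 256] = 1.0   # point heat source
        for _ in range(1000):
            u = heat_step(u, alpha=1.0, dx=1.0, dt=0.2)

    Because every node's update reads only its four neighbours, the GPU version assigns one thread per node and achieves the speed-ups the paper reports.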

  2. mcaGUI: microbial community analysis R-Graphical User Interface (GUI)

    OpenAIRE

    Copeland, Wade K.; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A.; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M.E.; Zhou, Xia; Williams, Christopher J.; Forney, Larry J.; Abdo, Zaid

    2012-01-01

    Summary: Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R-programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance ...

  3. What Does It Take to Be a Successful Graphic Designer: A Phenomenological Study on Graphic Design Curriculum

    Science.gov (United States)

    Beller, Shannon

    2017-01-01

    This study examined the phenomenon of what it takes to be a successful graphic designer. With an identity crisis in graphic design education, design curriculum is faced with uncertainties. With the diversity of programs and degrees in graphic design, the competencies and skills developed among the various programs reflect different purposes, thus…

  4. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  5. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested

  6. Flowfield computer graphics

    Science.gov (United States)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Toolkit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  7. Development of the graphic design and control system based on a graphic simulator for the spent fuel dismantling equipment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J. Y.; Kim, S. H.; Song, T. G.; Yoon, J. S

    2000-06-01

    In this study, the graphic design system is developed for designing the spent fuel rod consolidation and the dismantling processes. This system is used throughout the design stages from the conceptual design to the motion analysis. Also, the real-time control system of the rod extracting equipment is developed. This system utilizes the graphic simulator which simulates the motion of the equipment in real time by synchronously connecting the control PC with the graphic server through the TCP/IP network. The developed system is expected to be used as an effective tool in designing the process equipment for the spent fuel management. And the real-time graphic control system can be effectively used to enhance the reliability and safety of the spent fuel handling process by providing the remote monitoring function of the process.

  8. Development of the graphic design and control system based on a graphic simulator for the spent fuel dismantling equipment

    International Nuclear Information System (INIS)

    Lee, J. Y.; Kim, S. H.; Song, T. G.; Yoon, J. S.

    2000-06-01

    In this study, the graphic design system is developed for designing the spent fuel rod consolidation and the dismantling processes. This system is used throughout the design stages from the conceptual design to the motion analysis. Also, the real-time control system of the rod extracting equipment is developed. This system utilizes the graphic simulator which simulates the motion of the equipment in real time by synchronously connecting the control PC with the graphic server through the TCP/IP network. The developed system is expected to be used as an effective tool in designing the process equipment for the spent fuel management. And the real-time graphic control system can be effectively used to enhance the reliability and safety of the spent fuel handling process by providing the remote monitoring function of the process

  9. Development Of 12 Head GAMMA Detection And Graphical Presentation Software Suitable For Industrial Process Investigation By Radiotracer Technique

    International Nuclear Information System (INIS)

    Saengchantr, Dhanaj; Chueinta, Siripone

    2009-07-01

    Full text: Data logging software with prompt graphical presentation, accommodating gamma radiation signals from 12 scintillation detectors through a standard RS-232 interface, has been developed. Laboratory testing was conducted by detecting an injected radioactive tracer mixed into a fluid flowing inside a pipe. The radioactive fluid passed through the detectors located at several points along the pipe, and the signals generated, corresponding to the mass flow inside the pipe, were recorded. Up to 10,000 data points at a fast (20 millisecond) dwell time could be accumulated. Graphical presentation allows fast interpretation, while the output data are suitable for more accurate evaluation with standard software, e.g. Residence Time Distribution (RTD) analysis and Computed Tomography Visualization. Further utilization in industry, in conjunction with radiotracer techniques, for troubleshooting and process optimization will be carried out
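
    The acquisition hardware described here is custom, but the RS-232 side of such a logger can be pictured with a short pyserial sketch. The port name, baud rate and comma-separated frame format below are all assumptions made only for illustration:

        import serial  # pyserial

        # Hypothetical port settings and frame format (12 counts per line)
        ser = serial.Serial("/dev/ttyS0", baudrate=115200, timeout=0.02)

        log = []                                   # up to 10,000 dwell periods
        while len(log) < 10000:
            frame = ser.readline().decode("ascii", errors="ignore").strip()
            if frame:
                log.append([int(v) for v in frame.split(",")[:12]])
        ser.close()

    Each stored row then corresponds to one 20 ms dwell across the 12 detectors, ready for RTD or tomographic post-processing.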

  10. Graphics workflow optimization when editing standard tasks using modern graphics editing programs

    OpenAIRE

    Khabirova, Maja

    2012-01-01

    This work focuses on the description and characteristics of common problems which graphic designers face daily when working for advertising agencies. This work describes tasks and organises them according to the type of graphic being processed and the types of output. In addition, this work describes the ways these common tasks can be completed using modern graphics editing software. It also provides a practical definition of a graphic designer and graphic agency. The aim of this work is to m...

  11. Web-based execution of graphical work-flows: a modular platform for multifunctional scientific process automation

    International Nuclear Information System (INIS)

    De Ley, E.; Jacobs, D.; Ounsy, M.

    2012-01-01

    The Passerelle process automation suite offers a fundamentally modular solution platform, based on a layered integration of several best-of-breed technologies. It has been successfully applied by Synchrotron Soleil as the sequencer for data acquisition and control processes on its beamlines, integrated with TANGO as a control bus and GlobalScreen(TM) as the SCADA package. Since last year it is being used as the graphical work-flow component for the development of an Eclipse-based Data Analysis Work Bench at ESRF. The top layer of Passerelle exposes an actor-based development paradigm, based on the Ptolemy framework (UC Berkeley). Actors provide explicit reusability and strong decoupling, combined with an inherently concurrent execution model. Actor libraries exist for TANGO integration, web services, database operations, flow control, rules-based analysis, mathematical calculations, launching external scripts, etc. Passerelle's internal architecture is based on OSGi, the major Java framework for modular service-based applications. A large set of modules exist that can be recombined as desired to obtain different features and deployment models. Besides desktop versions of the Passerelle work-flow workbench, there is also the Passerelle Manager. It is a secured web application, including a graphical editor, for centralized design, execution, management and monitoring of process flows, integrating standard Java Enterprise services with OSGi. We will present the internal technical architecture, some interesting application cases and the lessons learnt. (authors)

  12. Explicet: graphical user interface software for metadata-driven management, analysis and visualization of microbiome data.

    Science.gov (United States)

    Robertson, Charles E; Harris, J Kirk; Wagner, Brandie D; Granger, David; Browne, Kathy; Tatem, Beth; Feazel, Leah M; Park, Kristin; Pace, Norman R; Frank, Daniel N

    2013-12-01

    Studies of the human microbiome, and microbial community ecology in general, have blossomed of late and are now a burgeoning source of exciting research findings. Along with the advent of next-generation sequencing platforms, which have dramatically increased the scope of microbiome-related projects, several high-performance sequence analysis pipelines (e.g. QIIME, MOTHUR, VAMPS) are now available to investigators for microbiome analysis. The subject of our manuscript, the graphical user interface-based Explicet software package, fills a previously unmet need for a robust, yet intuitive means of integrating the outputs of the software pipelines with user-specified metadata and then visualizing the combined data.

  13. Data processing of remotely sensed airborne hyperspectral data using the Airborne Processing Library (APL): Geocorrection algorithm descriptions and spatial accuracy assessment

    Science.gov (United States)

    Warren, Mark A.; Taylor, Benjamin H.; Grant, Michael G.; Shutler, Jamie D.

    2014-03-01

    Remote sensing airborne hyperspectral data are routinely used for applications including algorithm development for satellite sensors, environmental monitoring and atmospheric studies. Single flight lines of airborne hyperspectral data are often in the region of tens of gigabytes in size. This means that a single aircraft can collect terabytes of remotely sensed hyperspectral data during a single year. Before these data can be used for scientific analyses, they need to be radiometrically calibrated, synchronised with the aircraft's position and attitude and then geocorrected. To enable efficient processing of these large datasets the UK Airborne Research and Survey Facility has recently developed a software suite, the Airborne Processing Library (APL), for processing airborne hyperspectral data acquired from the Specim AISA Eagle and Hawk instruments. The APL toolbox allows users to radiometrically calibrate, geocorrect, reproject and resample airborne data. Each stage of the toolbox outputs data in the common Band Interleaved by Line (BIL) format, which allows its integration with other standard remote sensing software packages. APL was developed to be user-friendly and suitable for use on a workstation PC as well as for the automated processing of the facility; to this end APL can be used under both Windows and Linux environments on a single desktop machine or through a Grid engine. A graphical user interface also exists. In this paper we describe the Airborne Processing Library software, its algorithms and approach. We present example results from using APL with an AISA Eagle sensor and we assess its spatial accuracy using data from multiple flight lines collected during a campaign in 2008 together with in situ surveyed ground control points.
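
    The BIL layout that each APL stage emits stores one scan line at a time as consecutive per-band rows, so a flight line can be treated as a three-dimensional array without loading it all into memory. A minimal numpy sketch (file name, dimensions and dtype are hypothetical; real values come from the accompanying header file):

        import numpy as np

        # Hypothetical dimensions and dtype for one flight line
        lines, bands, samples = 2000, 252, 1024

        # BIL order maps directly onto shape (lines, bands, samples)
        cube = np.memmap("flightline.bil", dtype=np.uint16, mode="r",
                         shape=(lines, bands, samples))

        band42 = np.asarray(cube[:, 42, :])   # one spectral band as a 2D image

    Memory-mapping like this is one reason a simple interleaved format integrates so easily with other remote sensing packages.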

  14. Some research advances in computer graphics that will enhance applications to engineering design

    Science.gov (United States)

    Allan, J. J., III

    1975-01-01

    Research in man/machine interactions and graphics hardware/software that will enhance applications to engineering design was described. Research aspects of executive systems, command languages, and networking used in the computer applications laboratory are mentioned. Finally, a few areas where little or no research is being done were identified.

  15. MetaboLab - advanced NMR data processing and analysis for metabolomics

    Directory of Open Access Journals (Sweden)

    Günther Ulrich L

    2011-09-01

    Full Text Available Abstract Background Despite wide-spread use of Nuclear Magnetic Resonance (NMR) in metabolomics for the analysis of biological samples, there is a lack of graphically driven, publicly available software to process large one- and two-dimensional NMR data sets for statistical analysis. Results Here we present MetaboLab, a MATLAB based software package that facilitates NMR data processing by providing automated algorithms for processing series of spectra in a reproducible fashion. A graphical user interface provides easy access to all steps of data processing via a script builder to generate MATLAB scripts, providing an option to alter code manually. The analysis of two-dimensional spectra (1H,13C-HSQC spectra) is facilitated by the use of a spectral library derived from publicly available databases which can be extended readily. The software allows specific metabolites to be displayed in small regions of interest where signals can be picked. To facilitate the analysis of series of two-dimensional spectra, different spectra can be overlaid and assignments can be transferred between spectra. The software includes mechanisms to account for overlapping signals by highlighting neighboring and ambiguous assignments. Conclusions The MetaboLab software is an integrated software package for NMR data processing and analysis, closely linked to the previously developed NMRLab software. It includes tools for batch processing and gives access to a wealth of algorithms available in the MATLAB framework. Algorithms within MetaboLab help to optimize the flow of metabolomics data preparation for statistical analysis. The combination of an intuitive graphical user interface along with advanced data processing algorithms facilitates the use of MetaboLab in a broader metabolomics context.

  16. A graphical user-interface control system at SRRC

    International Nuclear Information System (INIS)

    Chen, J.S.; Wang, C.J.; Chen, S.J.; Jan, G.J.

    1993-01-01

    A graphical user interface control system for the 1.3 GeV synchrotron radiation light source was designed and implemented for the beam transport line (BTL) and storage ring (SR). Modern control techniques have been used to implement and control this third generation synchrotron light source. A two-level computer hardware configuration, which includes process and console computers as the top level and VME-based intelligent local controllers as the bottom level, was set up and tested. The two levels are linked by a high speed Ethernet data communication network. A database, comprising static and dynamic databases as well as access routines, was developed. In order to make machine commissioning and operation user friendly, a graphical man-machine interface was designed and coded. The graphical user interface (GUI) software was installed on VAX workstations for the BTL and SR at the Synchrotron Radiation Research Center (SRRC). The overall performance has been evaluated at a 10 Hz update rate. The results showed that the graphical operator interface control system is a versatile system that can be incorporated into the control system of the accelerator. It will provide the tools to control and monitor the equipment of the radiation light source, especially for machine commissioning and operation

  17. A Case Study of a Hybrid Parallel 3D Surface Rendering Graphics Architecture

    DEFF Research Database (Denmark)

    Holten-Lund, Hans Erik; Madsen, Jan; Pedersen, Steen

    1997-01-01

    This paper presents a case study in the design strategy used in building a graphics computer for drawing very complex 3D geometric surfaces. The goal is to build a PC based computer system capable of handling surfaces built from about 2 million triangles, and to be able to render a perspective view of these on a computer display at interactive frame rates, i.e. processing around 50 million triangles per second. The paper presents a hardware/software architecture called HPGA (Hybrid Parallel Graphics Architecture) which is likely to be able to carry out this task. The case study focuses on techniques to increase...

  18. Stages in the research process.

    Science.gov (United States)

    Gelling, Leslie

    2015-03-04

    Research should be conducted in a systematic manner, allowing the researcher to progress from a general idea or clinical problem to scientifically rigorous research findings that enable new developments to improve clinical practice. Following a defined research process helps guide this progression. This article is the first in a 26-part series on nursing research. It examines the process that is common to all research, and provides insights into ten different stages of this process: developing the research question, searching and evaluating the literature, selecting the research approach, selecting research methods, gaining access to the research site and data, pilot study, sampling and recruitment, data collection, data analysis, and dissemination of results and implementation of findings.

  19. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services.

    Science.gov (United States)

    Correa, Miria C; Deus, Helena F; Vasconcelos, Ana T; Hayashi, Yuki; Ajani, Jaffer A; Patnana, Srikrishna V; Almeida, Jonas S

    2010-10-26

    AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general purpose solution to the challenge of having interfaces

  20. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services

    Directory of Open Access Journals (Sweden)

    Hayashi Yuki

    2010-10-01

    Full Text Available Abstract Background AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. Methods The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. Results We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. Conclusions The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general

  1. Real time 3D structural and Doppler OCT imaging on graphics processing units

    Science.gov (United States)

    Sylwestrzak, Marcin; Szlag, Daniel; Szkulmowski, Maciej; Gorczyńska, Iwona; Bukowska, Danuta; Wojtkowski, Maciej; Targowski, Piotr

    2013-03-01

    In this report the application of graphics processing unit (GPU) programming for real-time 3D Fourier domain Optical Coherence Tomography (FdOCT) imaging, with implementation of Doppler algorithms for visualization of the flows in capillary vessels, is presented. Generally, the processing time of the FdOCT data on the main processor of the computer (CPU) constitutes the main limitation for real-time imaging. Employing additional algorithms, such as Doppler OCT analysis, makes this processing even more time consuming. Recently developed GPUs, which offer very high computational power, provide a solution to this problem. Taking advantage of them for massively parallel data processing allows for real-time imaging in FdOCT. The presented software for structural and Doppler OCT allows for the whole processing and visualization of 2D data consisting of 2000 A-scans generated from 2048-pixel spectra at a frame rate of about 120 fps. The 3D imaging in the same mode, on volume data built of 220 × 100 A-scans, is performed at a rate of about 8 frames per second. In this paper the software architecture, the organization of the threads and the optimizations applied are shown. For illustration, screen shots recorded during real-time imaging of a phantom (homogeneous water solution of Intralipid in a glass capillary) and of the human eye in vivo are presented.
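
    The structural and Doppler computations being moved to the GPU here are conceptually compact. A CPU-side numpy sketch of standard phase-resolved Doppler processing (synthetic random data stand in for real spectrometer frames, and the background subtraction is a simplification) shows the two core steps:

        import numpy as np

        # Synthetic frame matching the paper's sizes: 2000 spectra of 2048 pixels
        spectra = np.random.default_rng(1).random((2000, 2048))

        # Structural image: FFT of each background-subtracted spectrum -> A-scans
        ascans = np.fft.fft(spectra - spectra.mean(axis=0), axis=1)[:, :1024]
        structure_db = 20.0 * np.log10(np.abs(ascans) + 1e-12)

        # Doppler image: phase difference between adjacent A-scans encodes flow
        doppler = np.angle(ascans[1:] * np.conj(ascans[:-1]))

    Every row is independent, so mapping one A-scan (or one pixel) per GPU thread is what lifts the frame rate from CPU-bound to the reported 120 fps.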

  2. Graphic presentation of quarterly 90Sr fallout data, 1954-1982

    International Nuclear Information System (INIS)

    Larsen, R.J.

    1984-01-01

    This report graphically presents all of the precipitation and 90 Sr deposition data for all stations operated as part of the Environmental Measurements Laboratory's (EML) global fallout program since the initiation of the program in 1954. 3 references, 179 figures

  3. Graphical representation of transmutation and decay chain data, transmutation cross section and delayed gamma ray emission data

    International Nuclear Information System (INIS)

    Seki, Yasushi; Iida, Hiromasa; Kawasaki, Hiromitsu.

    1982-09-01

    In a D-T burning fusion reactor, the neutron induced activity severely limits personnel access to the reactor. Accurate evaluation of the induced activity and dose rate is necessary to conduct effective biological shield design. In order to evaluate the dose rate accurately, considerable amount of activation data is required. This report gives graphical representation of transmutation and decay chain data, transmutation cross section data and delayed gamma ray emission data for 116 nuclides of interest in terms of fusion reactor design. This graphical representation was made with hope of producing a reference for examining activation problems. It has already been shown to be effective in correcting inappropriate data. A computer code AMOEBA developed for the checking and plotting of the activation data is also described in this report. (author)

  4. Performance evaluation of throughput computing workloads using multi-core processors and graphics processors

    Science.gov (United States)

    Dave, Gaurav P.; Sureshkumar, N.; Blessy Trencia Lincy, S. S.

    2017-11-01

    The current trend in processor manufacturing focuses on multi-core architectures rather than increasing the clock speed for performance improvement. Graphics processors have become commodity hardware for providing fast co-processing in computer systems. Developments in IoT, social networking web applications and big data have created huge demand for data processing activities, and such throughput-intensive applications inherently contain data-level parallelism, which is well suited to the SIMD architecture of the GPU. This paper reviews the architectural aspects of multi/many-core processors and graphics processors. Different case studies are taken to compare the performance of throughput computing applications using shared memory programming in OpenMP and CUDA API based programming.

  5. Omeups: an interactive graphics program for analysing collision data

    International Nuclear Information System (INIS)

    Burgess, A.; Mason, H.E.; Tully, J.A.

    1991-01-01

    The aim of the micro-computer program OMEUPS is to provide a simple means of critically assessing and compacting collision strength data for electron impact excitation of positive ions. The program is interactive and allows data to be analysed graphically: it should be of particular interest to astrophysicists as well as to those specialising in atomic physics. The method on which the program is based allows one to interpolate or extrapolate existing data in energy and temperature; store data in compact form without losing significant information; perform Maxwell averaging; detect printing and computational errors in tabulated data

  6. The measurement of statistical reasoning in verbal-numerical and graphical forms: a pilot study

    International Nuclear Information System (INIS)

    Agus, M; Penna, M P; Peró-Cebollero, M; Guàrdia-Olmos, J

    2013-01-01

    Numerous subjects have trouble in understanding various concepts connected to statistical problems. Research reports how students' ability to solve problems (including statistical problems) can be influenced by exhibiting proofs. In this work we aim to devise an original and easy-to-use instrument able to assess statistical reasoning on uncertainty and on association, with respect to two different forms of proof presentation: pictorial-graphical and verbal-numerical. We conceived eleven pairs of simple problems in the verbal-numerical and pictorial-graphical forms and presented the proofs to 47 undergraduate students. The purpose of our work was to evaluate the goodness and reliability of these problems in the assessment of statistical reasoning. Each subject solved each pair of proofs in the verbal-numerical and in the pictorial-graphical form, in different problem presentation orders. Data analyses have highlighted that six out of the eleven pairs of problems appear to be useful and adequate to estimate statistical reasoning on uncertainty and that there is no effect due to the order of presentation in the verbal-numerical and pictorial-graphical form

  7. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    International Nuclear Information System (INIS)

    Arbanas, G.; Dunn, M.E.; Wiarda, D.

    2011-01-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The 235 U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days, took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them a promising candidate for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)
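
    The pattern described, replacing a hand-written triple loop with a vendor BLAS call, can be sketched in a few lines of Python; numpy delegates the product to whatever BLAS it is linked against, much as SAMMY links to MKL or cuBLAS (shapes below are reduced for illustration; the paper's 235U case used 16,000×20,000 matrices):

        import numpy as np

        A = np.random.rand(1600, 2000)
        B = np.random.rand(2000, 1600)

        C = A @ B   # numpy dispatches dgemm to the linked BLAS (e.g. Intel MKL)

        # On a CUDA-capable machine, cupy routes the same product through cuBLAS
        # (assumes the cupy package is installed):
        # import cupy as cp
        # C_gpu = cp.asnumpy(cp.asarray(A) @ cp.asarray(B))

    The uniform matrix-multiply interface is exactly why swapping the CPU library for the GPU one requires no change to the calling code.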

  8. Computation of large covariance matrices by SAMMY on graphical processing units and multicore CPUs

    Energy Technology Data Exchange (ETDEWEB)

    Arbanas, G.; Dunn, M.E.; Wiarda, D., E-mail: arbanasg@ornl.gov, E-mail: dunnme@ornl.gov, E-mail: wiardada@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN (United States)

    2011-07-01

    Computational power of Graphical Processing Units and multicore CPUs was harnessed by the nuclear data evaluation code SAMMY to speed up computations of large Resonance Parameter Covariance Matrices (RPCMs). This was accomplished by linking SAMMY to vendor-optimized implementations of the matrix-matrix multiplication subroutine of the Basic Linear Algebra Library to compute the most time-consuming step. The {sup 235}U RPCM computed previously using a triple-nested loop was re-computed using the NVIDIA implementation of the subroutine on a single Tesla Fermi Graphical Processing Unit, and also using Intel's Math Kernel Library implementation on two different multicore CPU systems. A multiplication of two matrices of dimensions 16,000×20,000 that had previously taken days, took approximately one minute on the GPU. Comparable performance was achieved on a dual six-core CPU system. The magnitude of the speed-up suggests that these, or similar, combinations of hardware and libraries may be useful for large matrix operations in SAMMY. Uniform interfaces of standard linear algebra libraries make them a promising candidate for a programming framework of a new generation of SAMMY for the emerging heterogeneous computing platforms. (author)

  9. Extending the LWS Data Environment: Distributed Data Processing and Analysis

    Science.gov (United States)

    Narock, Thomas

    2005-01-01

    The final stages of this work saw changes to the original framework, as well as the completion and integration of several data processing services. Initially, it was thought that a peer-to-peer architecture was necessary to make this work possible. The peer-to-peer architecture provided many benefits, including the dynamic discovery of new services that would be continually added. A prototype example was built and, while it showed promise, a major disadvantage was seen in that it was not easily integrated into the existing data environment. While the peer-to-peer system worked well for finding and accessing distributed data processing services, it was found that its use was limited by the difficulty in calling it from existing tools and services. After collaborations with members of the data community, it was determined that our data processing system was of high value and that a new interface should be pursued in order for the community to take full advantage of it. As such, the framework was modified from a peer-to-peer architecture to a more traditional web service approach. Following this change multiple data processing services were added. These services include such things as coordinate transformations and subsetting of data. As a first demonstration, the Virtual Heliospheric Observatory (VHO) assisted with integrating the new architecture into the VHO. This allows anyone using the VHO to search for data, to then pass that data through our processing services prior to downloading it. As a second attempt at demonstrating the new system, a collaboration was established with the Collaborative Sun Earth Connector (CoSEC) group at Lockheed Martin. This group is working on a graphical user interface to the Virtual Observatories and data processing software. The intent is to provide a high-level easy-to-use graphical interface that will allow access to the existing Virtual Observatories and data processing services from one convenient application. Working with the CoSEC group we provided access to our data

  10. New Python-based methods for data processing

    International Nuclear Information System (INIS)

    Sauter, Nicholas K.; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel

    2013-01-01

    The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units
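
    The multiprocessing pattern underlying such high-throughput per-image analysis is simple in outline: each diffraction frame is independent, so frames can be farmed out to worker processes. A generic Python sketch (a toy bright-pixel count on synthetic frames stands in for real spot finding; this is not the cctbx API):

        import numpy as np
        from multiprocessing import Pool

        def count_spots(seed):
            """Toy stand-in for per-image spot finding: count bright pixels
            in a synthetic Poisson-noise frame."""
            frame = np.random.default_rng(seed).poisson(5.0, size=(512, 512))
            return int((frame > 20).sum())

        if __name__ == "__main__":
            with Pool(processes=8) as pool:
                counts = pool.map(count_spots, range(100))
            print(f"analysed {len(counts)} frames; first counts: {counts[:5]}")

    Embarrassingly parallel dispatch like this is what lets a beamline keep pace with detectors producing terabytes per hour.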

  11. New Python-based methods for data processing

    Energy Technology Data Exchange (ETDEWEB)

    Sauter, Nicholas K., E-mail: nksauter@lbl.gov; Hattne, Johan; Grosse-Kunstleve, Ralf W.; Echols, Nathaniel [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)

    2013-07-01

    The Computational Crystallography Toolbox (cctbx) is a flexible software platform that has been used to develop high-throughput crystal-screening tools for both synchrotron sources and X-ray free-electron lasers. Plans for data-processing and visualization applications are discussed, and the benefits and limitations of using graphics-processing units are evaluated. Current pixel-array detectors produce diffraction images at extreme data rates (of up to 2 TB h⁻¹) that make severe demands on computational resources. New multiprocessing frameworks are required to achieve rapid data analysis, as it is important to be able to inspect the data quickly in order to guide the experiment in real time. By utilizing readily available web-serving tools that interact with the Python scripting language, it was possible to implement a high-throughput Bragg-spot analyzer (cctbx.spotfinder) that is presently in use at numerous synchrotron-radiation beamlines. Similarly, Python interoperability enabled the production of a new data-reduction package (cctbx.xfel) for serial femtosecond crystallography experiments at the Linac Coherent Light Source (LCLS). Future data-reduction efforts will need to focus on specialized problems such as the treatment of diffraction spots on interleaved lattices arising from multi-crystal specimens. In these challenging cases, accurate modeling of close-lying Bragg spots could benefit from the high-performance computing capabilities of graphics-processing units.

  12. Malleable Thought: The Role of Craft Thinking in Practice-Led Graphic Design

    Science.gov (United States)

    Ings, Welby

    2015-01-01

    This article considers the potential of craft processes as creative engagements in graphic design research. It initially discusses the uneasy history of craft within the discipline, then draws upon case studies undertaken by three established designers who, in their postgraduate theses, engaged with craft as a process of thinking. In doing so, the…

  13. Technique and cue selection for graphical presentation of generic hyperdimensional data

    Science.gov (United States)

    Howard, Lee M.; Burton, Robert P.

    2013-12-01

    Several presentation techniques have been created for visualization of data with more than three variables. Packages have been written, each of which implements a subset of these techniques. However, these packages generally fail to provide all the features needed by the user during the visualization process. Further, packages generally limit support for presentation techniques to a few techniques. A new package called Petrichor accommodates all necessary and useful features together in one system. Any presentation technique may be added easily through an extensible plugin system. Features are supported by a user interface that allows easy interaction with data. Annotations allow users to mark up visualizations and share information with others. By providing a hyperdimensional graphics package that easily accommodates presentation techniques and includes a complete set of features, including those that are rarely or never supported elsewhere, the user is provided with a tool that facilitates improved interaction with multivariate data to extract and disseminate information.

  14. Graphic filter library implemented in CUDA language

    OpenAIRE

    Peroutková, Hedvika

    2009-01-01

    This thesis deals with the problem of reducing the computation time of raster image processing by parallel computing on a graphics processing unit. Raster image processing here refers to the application of graphic filters, which can be applied in sequence with different settings. The thesis evaluates the suitability of parallelization on a graphics card for raster image adjustments based on multicriterial choice. Filters are implemented for the graphics processing unit in the CUDA language. Opacity ...

  15. Graphics gems

    CERN Document Server

    Heckbert, Paul S

    1994-01-01

    Graphics Gems IV contains practical techniques for 2D and 3D modeling, animation, rendering, and image processing. The book presents articles on polygons and polyhedra; a mix of formulas, optimized algorithms, and tutorial information on the geometry of 2D, 3D, and n-D space; transformations; and parametric curves and surfaces. The text also includes articles on ray tracing; shading 3D models; and frame buffer techniques. Articles on image processing; algorithms for graphical layout; basic interpolation methods; and subroutine libraries for vector and matrix algebra are also demonstrated. Com

  16. Automated graphic image generation system for effective representation of infectious disease surveillance data.

    Science.gov (United States)

    Inoue, Masashi; Hasegawa, Shinsaku; Suyama, Akihiko; Meshitsuka, Shunsuke

    2003-11-01

    Infectious disease surveillance schemes have been established to detect infectious disease outbreaks in the early stages, to identify the causative viral strains, and to rapidly assess related morbidity and mortality. To make a scheme function well, two things are required. Firstly, it must have sufficient sensitivity and be timely, guaranteeing as short a delay as possible from collection to redistribution of information. Secondly, it must provide a good representation of the surveillance results. To do this, we have developed a database system that can redistribute the information via the Internet. The feature of this system is that it automatically generates graphic images from the numerical data stored in the database, using Hypertext Preprocessor (PHP) scripts and the Graphics Drawing (GD) library. It dynamically displays the information as a map or bar chart, as well as numerically, according to users' real-time demands. This system will be a useful tool for medical personnel and researchers working on infectious disease problems and will save significant time in the redistribution of information.
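
    The chart generation above relies on PHP and the GD library; as a language-neutral sketch of the same server-side idea, the following Python/matplotlib fragment renders a bar chart off-screen from stored counts (the counts, labels and file name are invented for illustration):

        import matplotlib
        matplotlib.use("Agg")                      # render off-screen, as a web backend would
        import matplotlib.pyplot as plt

        weekly_counts = {"W40": 12, "W41": 19, "W42": 31, "W43": 27}   # stand-in data

        fig, ax = plt.subplots(figsize=(6, 3))
        ax.bar(list(weekly_counts), list(weekly_counts.values()), color="steelblue")
        ax.set_xlabel("Reporting week")
        ax.set_ylabel("Reported cases")
        fig.tight_layout()
        fig.savefig("surveillance_chart.png", dpi=100)   # image then served to the browser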

  17. A Block-Asynchronous Relaxation Method for Graphics Processing Units

    OpenAIRE

    Anzt, H.; Dongarra, J.; Heuveline, Vincent; Tomov, S.

    2011-01-01

    In this paper, we analyze the potential of asynchronous relaxation methods on Graphics Processing Units (GPUs). For this purpose, we developed a set of asynchronous iteration algorithms in CUDA and compared them with a parallel implementation of synchronous relaxation methods on CPU-based systems. For a set of test matrices taken from the University of Florida Matrix Collection we monitor the convergence behavior, the average iteration time, and the total time to solution. Analyzing the r...

  18. How do trees grow? Response from the graphical and quantitative analyses of computed tomography scanning data collected on stem sections.

    Science.gov (United States)

    Dutilleul, Pierre; Han, Li Wen; Beaulieu, Jean

    2014-06-01

    Tree growth, as measured via the width of annual rings, is used for environmental impact assessment and climate back-forecasting. This fascinating natural process has been studied at various scales in the stem (from cell and fiber within a growth ring, to ring and entire stem) in one, two, and three dimensions. A new approach is presented to study tree growth in 3D from stem sections, at a scale sufficiently small to allow the delineation of reliable limits for annual rings and large enough to capture directional variation in growth rates. The technology applied is computed tomography scanning, which provides, for one stem section, millions of data points (indirect measures of wood density) that can be mapped, together with a companion measure of dispersion and growth-ring limits in filigree. Graphical and quantitative analyses are reported for white spruce trees with circular vs non-circular growth. Implications for dendroclimatological research are discussed. Copyright © 2014 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  19. Interactive graphical system for small-angle scattering analysis of polydisperse systems

    International Nuclear Information System (INIS)

    Konarev, P V; Volkov, V V; Svergun, D I

    2016-01-01

    A program suite for one-dimensional small-angle scattering analysis of polydisperse systems and multiple data sets is presented. The main program, POLYSAS, has a menu-driven graphical user interface calling computational modules from the ATSAS package to perform data treatment and analysis. The graphical menu interface allows one to process multiple (time-, concentration- or temperature-dependent) data sets and interactively change the parameters for the data modelling using sliders. The graphical representation of the data is done via the Winteracter-based program SASPLOT. The package is designed for the analysis of polydisperse systems and mixtures, and permits one to obtain size distributions and evaluate the volume fractions of the components using linear and non-linear fitting algorithms as well as model-independent singular value decomposition. The use of the POLYSAS package is illustrated by recent examples of its application to the study of concentration-dependent oligomeric states of proteins and the time kinetics of polymer micelles for anticancer drug delivery. (paper)

  20. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program, CRITEST, written in the PL/1 language for EC-series computers has been developed for the automatic processing of the results of radioimmunological research. The program runs under the EC computer's operating system and executes in a 60 KB region. A modified Aitken algorithm was used in compiling the program. The program was clinically validated in determining a number of hormones: CTH, T₄, T₃ and TSH. Automatic computer processing of radioimmunological research data makes it possible to simplify this labour-consuming analysis and to raise its accuracy

  1. ggplot2 elegant graphics for data analysis

    CERN Document Server

    Wickham, Hadley

    2016-01-01

    This new edition of the classic book by ggplot2 creator Hadley Wickham highlights compatibility with knitr and RStudio. ggplot2 is a data visualization package for R that helps users create data graphics, including those that are multi-layered, with ease. With ggplot2, it's easy to: • produce handsome, publication-quality plots with automatic legends created from the plot specification • superimpose multiple layers (points, lines, maps, tiles, box plots) from different data sources with automatically adjusted common scales • add customizable smoothers that use powerful modeling capabilities of R, such as loess, linear models, generalized additive models, and robust regression • save any ggplot2 plot (or part thereof) for later modification or reuse • create custom themes that capture in-house or journal style requirements and that can easily be applied to multiple plots • approach a graph from a visual perspective, thinking about how each component of the data is represented on the final plot. This...

  2. Real-time processing for full-range Fourier-domain optical-coherence tomography with zero-filling interpolation using multiple graphic processing units.

    Science.gov (United States)

    Watanabe, Yuuki; Maeno, Seiya; Aoshima, Kenji; Hasegawa, Haruyuki; Koseki, Hitoshi

    2010-09-01

    The real-time display of full-range, 2048 axial pixel × 1024 lateral pixel, Fourier-domain optical-coherence tomography (FD-OCT) images is demonstrated. The required speed was achieved by using dual graphics processing units (GPUs) with many stream processors to realize highly parallel processing. We used a zero-filling technique, including a forward Fourier transform, zero padding to increase the axial data-array size to 8192, an inverse Fourier transform back to the spectral domain, a linear interpolation from wavelength to wavenumber, a lateral Hilbert transform to obtain the complex spectrum, a Fourier transform to obtain the axial profiles, and a log scaling. The data-transfer time of the frame grabber was 15.73 ms, and the processing time, which includes the data transfer between the GPU memory and the host computer, was 14.75 ms, for a total time shorter than the 36.70 ms frame-interval time using a line-scan CCD camera operated at 27.9 kHz. That is, our OCT system achieved a processed-image display rate of 27.23 frames/s.
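
    For orientation, a minimal single-A-line NumPy sketch of the zero-filling chain described above (synthetic fringe and assumed wavelength range; the lateral Hilbert transform used for full-range imaging is omitted, and on the GPU each step maps onto a parallel kernel):

        import numpy as np

        n, n_pad = 2048, 8192
        lam = np.linspace(1260e-9, 1360e-9, n)                   # assumed wavelength axis
        spectrum = 1 + 0.5 * np.cos(2 * np.pi * 200 * np.arange(n) / n)  # fake fringe

        # 1) zero-filling interpolation: forward FFT, pad to 8192, inverse FFT
        z = np.fft.fft(spectrum)
        z_pad = np.zeros(n_pad, dtype=complex)
        z_pad[:n // 2], z_pad[-n // 2:] = z[:n // 2], z[-n // 2:]
        dense = np.real(np.fft.ifft(z_pad)) * (n_pad / n)

        # 2) resample from evenly spaced wavelength to evenly spaced wavenumber
        k = 2 * np.pi / np.linspace(lam[0], lam[-1], n_pad)
        k_lin = np.linspace(k.min(), k.max(), n_pad)
        resampled = np.interp(k_lin, k[::-1], dense[::-1])       # np.interp needs ascending x

        # 3) Fourier transform to depth and log-scale for display
        a_line = 20 * np.log10(np.abs(np.fft.fft(resampled))[:n_pad // 2] + 1e-12)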

  3. Accelerating cardiac bidomain simulations using graphics processing units.

    Science.gov (United States)

    Neic, A; Liebmann, M; Hoetzl, E; Mitchell, L; Vigmond, E J; Haase, G; Plank, G

    2012-08-01

    Anatomically realistic and biophysically detailed multiscale computer models of the heart are playing an increasingly important role in advancing our understanding of integrated cardiac function in health and disease. Such detailed simulations, however, are computationally vastly demanding, which is a limiting factor for a wider adoption of in-silico modeling. While current trends in high-performance computing (HPC) hardware promise to alleviate this problem, exploiting the potential of such architectures remains challenging since strongly scalable algorithms are necessitated to reduce execution times. Alternatively, acceleration technologies such as graphics processing units (GPUs) are being considered. While the potential of GPUs has been demonstrated in various applications, benefits in the context of bidomain simulations where large sparse linear systems have to be solved in parallel with advanced numerical techniques are less clear. In this study, the feasibility of multi-GPU bidomain simulations is demonstrated by running strong scalability benchmarks using a state-of-the-art model of rabbit ventricles. The model is spatially discretized using the finite element methods (FEM) on fully unstructured grids. The GPU code is directly derived from a large pre-existing code, the Cardiac Arrhythmia Research Package (CARP), with very minor perturbation of the code base. Overall, bidomain simulations were sped up by a factor of 11.8 to 16.3 in benchmarks running on 6-20 GPUs compared to the same number of CPU cores. To match the fastest GPU simulation which engaged 20 GPUs, 476 CPU cores were required on a national supercomputing facility.

  4. Graphic Data Display from Manufacturing on Web Pages

    Directory of Open Access Journals (Sweden)

    Martin VALAS

    2009-06-01

    Full Text Available Industrial data can be displayed in graphical form, which is typically used by three types of users. The first are continuous users, most often operations engineers, who check the currently displayed values and then intervene in the process. The second are occasional users who are interested in historical data, e.g., for servicing reasons. The last type of users are tradesmen and managers, for whom comparison with the state a few days or months earlier serves as decision-making support. A graph component with a web application, which provides data as an XML document, was designed for the second user group. The graph component displays historical data. Students can fully understand all the problems that go along with creating a web application in ASP.NET that provides data in an XML document, as well as creating a graph component in the Flash integrated development environment, thanks to the solution described in detail using ActionScript.

  5. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction. Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  6. Development of Data Acquisition System for nuclear thermal hydraulic out-of-pile facility using the graphical programming methods

    Energy Technology Data Exchange (ETDEWEB)

    Bouaichaoui, Youcef; Berrahal, Abderezak; Halbaoui, Khaled [Birine Nuclear Research Center/CRNB/COMENA/ALGERIA, BO 180, Ain Oussera, 17200, Djelfa (Algeria)

    2015-07-01

    This paper describes the design of a data acquisition (DAQ) system connected to a PC and the development of a feedback control system that maintains the coolant temperature of the process at a desired set point, using a digital controller based on a graphical programming language. The paper provides details about the data acquisition unit, shows the implementation of the controller, and presents test results. (authors)
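
    As an illustration of the kind of loop such a system closes, here is a minimal discrete PI controller holding a simulated coolant temperature at its set point; the plant model, gains and noise are invented, and the actual system runs on DAQ hardware under a graphical language:

        import random

        setpoint, temp = 60.0, 25.0        # degC; set point and initial coolant temperature
        kp, ki, dt = 2.0, 0.4, 1.0         # invented gains and 1 s sample time
        integral = 0.0

        for step in range(120):
            error = setpoint - temp
            integral += error * dt
            heater_power = max(0.0, min(100.0, kp * error + ki * integral))  # % output
            # crude first-order plant: heating minus loss to 25 degC ambient, plus noise
            temp += dt * (0.05 * heater_power - 0.02 * (temp - 25.0)) + random.gauss(0, 0.05)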

  7. A synthesis of research on color, typography and graphics as they relate to readability

    Science.gov (United States)

    Lamoreaux, M. E.

    1985-09-01

    A foundation for future research on the use of color, typography, and graphics to improve readability is provided. Articles from the broad fields of education and psychology, as well as from the fields of journalism and printing, have been reviewed for research relating color, typography, and graphics to reading ease, speed, or comprehension. The most relevant articles reviewed are presented in an annotated bibliography; the remaining articles are also presented in bibliographic format. This literature review indicates that recognition and recall of printed material may be improved through the use of headings, underlining, color, and, especially, illustrations. Current research suggests that individuals can remember pictures far longer than past research indicates. However, researchers are divided on the usefulness of illustrations to improve reading comprehension. On the other hand, reading comprehension can be improved through the use of statistical graphs and tables if the reader is properly trained in the use of these devices.

  8. HARMONIC SURFACES OF GRAPHIC TYPE IN R3

    Directory of Open Access Journals (Sweden)

    Carlos Carrión Riveros

    2016-06-01

    Full Text Available In this research we study harmonic surfaces immersed in R3. We define harmonic surfaces of graphic type and show that a harmonic surface of graphic type is minimal if and only if it is part of a plane or a helicoid. We also give a characterization of harmonic surfaces of graphic type parameterized by asymptotic lines, together with some examples.
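
    For orientation, the two conditions combined in this result, for a graph-type surface z = f(x, y), are the Laplace equation and the minimal surface equation:

        f_{xx} + f_{yy} = 0 \qquad \text{(harmonic)},

        (1 + f_y^2)\, f_{xx} - 2 f_x f_y\, f_{xy} + (1 + f_x^2)\, f_{yy} = 0 \qquad \text{(minimal)}.

    Subtracting the first equation from the second shows that a harmonic graph is minimal precisely when f_y^2 f_{xx} - 2 f_x f_y f_{xy} + f_x^2 f_{yy} = 0; the classification in the paper describes the solutions of this reduced constraint.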

  9. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    Science.gov (United States)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implementing real-time signal processing algorithms for general surveillance radar based on NVIDIA graphics processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as the CUDA Basic Linear Algebra Subroutines and the CUDA Fast Fourier Transform library, which are adopted from open-source libraries and optimized for NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense-and-avoid radar, and aerospace surveillance radar.
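
    For reference, conventional (non-adaptive) pulse compression is a matched filter evaluated with FFTs, which is the part that maps directly onto the CUDA FFT and linear-algebra libraries; the adaptive algorithms in the paper replace the fixed filter with data-dependent weights. A NumPy sketch with an invented linear-FM waveform and a synthetic point target:

        import numpy as np

        fs, T, B = 10e6, 20e-6, 5e6                  # assumed sample rate, pulse width, bandwidth
        t = np.arange(int(fs * T)) / fs
        ref = np.exp(1j * np.pi * (B / T) * t**2)    # linear-FM reference waveform

        rx = 0.05 * (np.random.randn(4096) + 1j * np.random.randn(4096))  # noise floor
        rx[500:500 + ref.size] += 0.8 * ref          # echo from a target at range bin 500

        # matched filter via FFT (frequency-domain correlation with the reference)
        n = rx.size + ref.size - 1
        compressed = np.fft.ifft(np.fft.fft(rx, n) * np.conj(np.fft.fft(ref, n)))
        print(np.argmax(np.abs(compressed)))         # ~500: compressed peak at the target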

  10. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Bach, Matthias

    2014-07-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto a Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL-based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  11. Energy- and cost-efficient lattice-QCD computations using graphics processing units

    International Nuclear Information System (INIS)

    Bach, Matthias

    2014-01-01

    Quarks and gluons are the building blocks of all hadronic matter, like protons and neutrons. Their interaction is described by Quantum Chromodynamics (QCD), a theory under test by large scale experiments like the Large Hadron Collider (LHC) at CERN and in the future at the Facility for Antiproton and Ion Research (FAIR) at GSI. However, perturbative methods can only be applied to QCD for high energies. Studies from first principles are possible via a discretization onto a Euclidean space-time grid. This discretization of QCD is called Lattice QCD (LQCD) and is the only ab-initio option outside of the high-energy regime. LQCD is extremely compute and memory intensive. In particular, it is by definition always bandwidth limited. Thus - despite the complexity of LQCD applications - it led to the development of several specialized compute platforms and influenced the development of others. However, in recent years General-Purpose computation on Graphics Processing Units (GPGPU) came up as a new means for parallel computing. Contrary to machines traditionally used for LQCD, graphics processing units (GPUs) are a mass-market product. This promises advantages in both the pace at which higher-performing hardware becomes available and its price. CL2QCD is an OpenCL-based implementation of LQCD using Wilson fermions that was developed within this thesis. It operates on GPUs by all major vendors as well as on central processing units (CPUs). On the AMD Radeon HD 7970 it provides the fastest double-precision D kernel for a single GPU, achieving 120 GFLOPS. D - the most compute intensive kernel in LQCD simulations - is commonly used to compare LQCD platforms. This performance is enabled by an in-depth analysis of optimization techniques for bandwidth-limited codes on GPUs. Further, analysis of the communication between GPU and CPU, as well as between multiple GPUs, enables high-performance Krylov space solvers and linear scaling to multiple GPUs within a single system. LQCD

  12. Graphics gems V (Macintosh version)

    CERN Document Server

    Paeth, Alan W

    1995-01-01

    Graphics Gems V is the newest volume in The Graphics Gems Series. It is intended to provide the graphics community with a set of practical tools for implementing new ideas and techniques, and to offer working solutions to real programming problems. These tools are written by a wide variety of graphics programmers from industry, academia, and research. The books in the series have become essential, time-saving tools for many programmers. Latest collection of graphics tips in The Graphics Gems Series written by the leading programmers in the field. Contains over 50 new gems displaying some of t

  13. Real-time speckle variance swept-source optical coherence tomography using a graphics processing unit.

    Science.gov (United States)

    Lee, Kenneth K C; Mariampillai, Adrian; Yu, Joe X Z; Cadotte, David W; Wilson, Brian C; Standish, Beau A; Yang, Victor X D

    2012-07-01

    Advances in swept-source laser technology continue to increase the imaging speed of swept-source optical coherence tomography (SS-OCT) systems. These fast imaging speeds are ideal for microvascular detection schemes, such as speckle variance (SV), where interframe motion can cause severe imaging artifacts and loss of vascular contrast. However, full utilization of the laser scan speed has been hindered by the computationally intensive signal processing required by SS-OCT and SV calculations. Using a commercial graphics processing unit that has been optimized for parallel data processing, we report a complete high-speed SS-OCT platform capable of real-time data acquisition, processing, display, and saving at 108,000 lines per second. Subpixel image registration of structural images was performed in real-time prior to SV calculations in order to reduce decorrelation from stationary structures induced by the bulk tissue motion. The viability of the system was successfully demonstrated in a high bulk tissue motion scenario of human fingernail root imaging where SV images (512 × 512 pixels, n = 4) were displayed at 54 frames per second.
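
    The speckle variance statistic itself is short: the interframe intensity variance per pixel over the N registered frames. A NumPy sketch with synthetic data:

        import numpy as np

        frames = np.random.rand(4, 512, 512)   # N = 4 registered structural frames (synthetic)
        sv = frames.var(axis=0)                # speckle variance image
        # High-variance pixels flag moving scatterers such as blood flow; the subpixel
        # registration described above keeps static tissue from inflating this map.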

  14. A data collection and processing procedure for evaluating a research program

    Science.gov (United States)

    Giuseppe Rensi; H. Dean Claxton

    1972-01-01

    A set of computer programs compiled for the information processing requirements of a model for evaluating research proposals is described. The programs serve to assemble and store information, periodically update it, and convert it to a form usable for decision-making. Guides for collecting and coding data are explained. The data-processing options available and...

  15. Touch-sensitive graphics terminal applied to process control

    International Nuclear Information System (INIS)

    Bennion, S.I.; Creager, J.D.; VanHouten, R.D.

    1981-01-01

    Limited initial demonstrations of the system described took place during September 1980. A single CRT was used as an input device in the control center while operating a furnace and a pellet inspection gage. These two process line devices were completely controlled, despite the longer-than-desired response times noted, using a single control station located in the control center. The operator could conveniently execute from this remote location any function which could be performed locally at the hard-wired control panels. With the installation of the enhancements, the integrated touchscreen/graphics terminal will provide a preferable alternative to normal keyboard command input devices

  16. Watermarking Algorithms for 3D NURBS Graphic Data

    Directory of Open Access Journals (Sweden)

    Jae Jun Lee

    2004-10-01

    Full Text Available Two watermarking algorithms for 3D nonuniform rational B-spline (NURBS) graphic data are proposed: one is appropriate for steganography, and the other for watermarking. Instead of directly embedding data into the parameters of NURBS, the proposed algorithms embed data into the 2D virtual images extracted by parameter sampling of the 3D model. As a result, the proposed steganography algorithm can embed information into more places of the surface than the conventional algorithm, while preserving the data size of the model. Also, any existing 2D watermarking technique can be used for the watermarking of 3D NURBS surfaces. From the experiment, it is found that the algorithm for the watermarking is robust to attacks on weights, control points, and knots. It is also found to be robust to the remodeling of NURBS models.
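
    A toy NumPy rendering of the two-step idea, sampling a surface over its (u, v) parameter grid to form the 2D "virtual image" and then applying any 2D technique to it, here a deliberately simple least-significant-bit embedding (surface, grid size and payload are all invented):

        import numpy as np

        # "virtual image": sample a parametric surface z(u, v) on a regular grid
        u, v = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
        virtual_image = np.sin(2 * np.pi * u) * np.cos(2 * np.pi * v)   # stand-in surface

        # quantize to 8 bits and embed watermark bits in the least significant bit
        img8 = np.round((virtual_image - virtual_image.min())
                        / np.ptp(virtual_image) * 255).astype(np.uint8)
        bits = np.random.randint(0, 2, img8.shape).astype(np.uint8)
        watermarked = (img8 & 0xFE) | bits

        assert np.array_equal(watermarked & 1, bits)   # extraction recovers the payload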

  17. Advanced diagnostic graphics

    International Nuclear Information System (INIS)

    Bray, M.A.; Petersen, R.J.; Clark, M.T.; Gertman, D.I.

    1981-01-01

    This paper reports US NRC-sponsored research at the Idaho National Engineering Laboratory (INEL) involving evaluation of computer-based diagnostic graphics. The specific targets of current evaluations are multivariate data display formats which may be used in Safety Parameter Display Systems (SPDS) being developed for nuclear power plant control rooms. The purpose of the work is to provide a basis for NRC action in regulating licensee SPDSs or later computer/cathode ray tube (CRT) applications in nuclear control rooms

  18. PROFESSIONALLY ORIENTED COURSE OF ENGINEERING-GRAPHICAL TRAINING

    Directory of Open Access Journals (Sweden)

    Olga V. Zhuykova

    2015-01-01

    Full Text Available The aim of the article is to present the results of managing competence-oriented self-directed student learning while studying graphical subjects at Kalashnikov Izhevsk State Technical University. Methods. A technology for the self-directed engineering-graphical training of future bachelors, based on an analysis of the educational literature and teaching experience and providing individualization and professional orientation, is suggested. The method of team expert appraisal was used at all stages of self-directed learning management. This method is one of the main methods of qualimetry (the science concerned with assessing and evaluating the quality of any objects and processes); it permits the components of engineering-graphical competence to be revealed, the criteria and markers for determining the level of its development to be established, and expert evaluation of student tasks and estimation procedures to be performed. Results. It has been established that revitalizing students' self-directed learning through professional orientation and individualization raises the level of students' engineering-graphical competence. Scientific novelty. Criteria-based evaluation procedures for determining the level of students' engineering-graphical competence development during professionally oriented self-directed learning of graphical subjects at a technical university are developed. Practical significance. Professionally oriented educational trajectories for students' independent engineering-graphic preparation are designed and filled with content. Such training is currently being implemented at Kalashnikov Izhevsk State Technical University, major «Instrument Engineering».

  19. Porting of the transfer-matrix method for multilayer thin-film computations on graphics processing units

    Science.gov (United States)

    Limmer, Steffen; Fey, Dietmar

    2013-07-01

    Thin-film computations are often a time-consuming task during optical design. An efficient way to accelerate these computations with the help of graphics processing units (GPUs) is described. It turned out that significant speed-ups can be achieved. We investigate the circumstances under which the best speed-up values can be expected. Therefore we compare different GPUs among themselves and with a modern CPU. Furthermore, the effect of thickness modulation on the speed-up and the runtime behavior depending on the input data is examined.
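
    The underlying computation is a product of 2x2 characteristic matrices per wavelength; a NumPy sketch at normal incidence (the quarter-wave MgF2-on-glass test case is a textbook example, not taken from the paper):

        import numpy as np

        def reflectance(wl, n0, n_sub, layers):
            """Normal-incidence reflectance; layers = [(index, thickness_m), ...]."""
            M = np.eye(2, dtype=complex)
            for n, d in layers:                      # incident side first
                delta = 2 * np.pi * n * d / wl       # phase thickness of the layer
                M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                                  [1j * n * np.sin(delta), np.cos(delta)]])
            B, C = M @ np.array([1.0, n_sub])
            r = (n0 * B - C) / (n0 * B + C)
            return abs(r) ** 2

        # quarter-wave MgF2 antireflection coating on glass at 550 nm -> about 1.3%
        print(reflectance(550e-9, 1.0, 1.52, [(1.38, 550e-9 / (4 * 1.38))]))

    Because each wavelength's matrix chain is independent of the others, the workload parallelizes naturally across GPU threads, which is the property the paper exploits.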

  20. Graphics Gems III IBM version

    CERN Document Server

    Kirk, David

    1994-01-01

    This sequel to Graphics Gems (Academic Press, 1990), and Graphics Gems II (Academic Press, 1991) is a practical collection of computer graphics programming tools and techniques. Graphics Gems III contains a larger percentage of gems related to modeling and rendering, particularly lighting and shading. This new edition also covers image processing, numerical and programming techniques, modeling and transformations, 2D and 3D geometry and algorithms, ray tracing and radiosity, rendering, and more clever new tools and tricks for graphics programming. Volume III also includes a

  1. Image processing and computer graphics in radiology. Pt. A

    International Nuclear Information System (INIS)

    Toennies, K.D.

    1993-01-01

    The reports give a full review of all aspects of digital imaging in radiology which are of significance to image processing and the subsequent picture archiving and communication techniques. The review is strongly practice-oriented and illustrates the various contributions from specialized areas of the computer sciences, such as computer vision, computer graphics, database systems and information and communication systems, man-machine interactions and software engineering. Methods and models available are explained and assessed for their respective performance and value, and basic principles are briefly explained. (DG) [de]

  2. Image processing and computer graphics in radiology. Pt. B

    International Nuclear Information System (INIS)

    Toennies, K.D.

    1993-01-01

    The reports give a full review of all aspects of digital imaging in radiology which are of significance to image processing and the subsequent picture archiving and communication techniques. The review is strongly practice-oriented and illustrates the various contributions from specialized areas of the computer sciences, such as computer vision, computer graphics, database systems and information and communication systems, man-machine interactions and software engineering. Methods and models available are explained and assessed for their respective performance and value, and basic principles are briefly explained. (DG) [de]

  3. High-Performance Pseudo-Random Number Generation on Graphics Processing Units

    OpenAIRE

    Nandapalan, Nimalan; Brent, Richard P.; Murray, Lawrence M.; Rendell, Alistair

    2011-01-01

    This work considers the deployment of pseudo-random number generators (PRNGs) on graphics processing units (GPUs), developing an approach based on the xorgens generator to rapidly produce pseudo-random numbers of high statistical quality. The chosen algorithm has configurable state size and period, making it ideal for tuning to the GPU architecture. We present a comparison of both speed and statistical quality with other common parallel, GPU-based PRNGs, demonstrating favourable performance o...

  4. Research on Process-oriented Spatio-temporal Data Model

    Directory of Open Access Journals (Sweden)

    XUE Cunjin

    2016-02-01

    Full Text Available Based on an analysis of the present status and existing problems of the spatio-temporal data models developed over the last 20 years, this paper proposes a process-oriented spatio-temporal data model (POSTDM), aiming at representing, organizing and storing continuous and gradually changing geographical entities. The dynamic geographical entities are graded and abstracted into series of process objects according to their intrinsic characteristics: process objects, process stage objects, process sequence objects and process state objects. The logical relationships among process entities are further studied, and the structure of the UML models and storage is also designed. In addition, through the mechanisms of continuity and gradual change implicitly recorded by process objects, and the procedure interfaces offered by the customized ObjcetStorageTable, the POSTDM can carry out process representation, storage and dynamic analysis of continuous and gradually changing geographic entities. Taking the process organization and storage of marine data as an example, a prototype system (consisting of an object-relational database and a functional analysis platform) is developed for validating and evaluating the model's practicability.
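
    A hypothetical Python rendering of the four-level hierarchy named in the abstract (field names are illustrative only; the paper specifies the model in UML over an object-relational database):

        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class ProcessState:              # snapshot of one entity at one time step
            timestamp: str
            geometry: object             # e.g. a polygon boundary
            attributes: dict = field(default_factory=dict)

        @dataclass
        class ProcessSequence:           # run of states with one trend, e.g. expansion
            states: List[ProcessState] = field(default_factory=list)

        @dataclass
        class ProcessStage:              # named phase such as development or decay
            name: str
            sequences: List[ProcessSequence] = field(default_factory=list)

        @dataclass
        class ProcessObject:             # the whole life cycle of one dynamic entity
            entity_id: str
            stages: List[ProcessStage] = field(default_factory=list)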

  5. Real-time data acquisition and processing platform for fusion experiments

    International Nuclear Information System (INIS)

    Ruiz, M.; Barrera, E.; Lopez, S.; Machon, D.; Vega, J.; Sanchez, E.

    2004-01-01

    This paper describes the features of the hardware and low-level software of the PXI real-time data acquisition and processing system developed for the TJ-II device located in the Centro de Investigaciones Energeticas Medioambientales y Tecnologicas (CIEMAT) in Madrid. This system fulfills three objectives: (1) to increase processing capabilities of standard data acquisition systems by adding specific processing cards, (2) to acquire and process data in real time with a view to deployment on steady state fusion devices, and (3) to develop the data acquisition and processing applications using graphical languages like LabView

  6. Organization of the independent work of students while studying engineering graphics

    Directory of Open Access Journals (Sweden)

    Tel’noy Viktor Ivanovich

    2015-01-01

    Full Text Available The article reveals the possibility of creating and implementing teaching conditions for the rational organization of the independent work of first-year students as they adapt to studying engineering drawing. Theoretical and methodological aspects of students' independent work are presented: the types and forms of its organization and control, and the training and methodological support for it. The approach taken to organizing independent work combines teacher-led classes in the main types of training activities (lectures, practical and laboratory work) with extracurricular forms of organization and self-study using innovative teaching methods, which promotes students' creative activity and the development of the competencies of a future skilled construction-industry professional. The role of modern information and communication technologies in the independent work of students is specified. According to the degree of coverage of students, and taking into account individual characteristics and different levels of preparedness, the following forms of organizing independent work were distinguished: individual, differentiated and frontal. In studying engineering graphics, the following basic forms of independent work are recommended: ongoing work with the lecture material; selection and study of literature and electronic sources of information on the problems of the discipline; preparation for the main classroom training; performing calculation and graphic works; work in student scientific societies and carrying out research work; and participation in scientific conferences, seminars and other events. Emphasis is placed on forming students' skills in working with different types of educational and scientific literature and on the ability to analyze and organize information in electronic library systems and open educational resources.

  7. COMPUTER GRAPHICS IN ENGINEERING GRAPHICS DEPARTMENT OF MOSCOW AVIATION INSTITUTE EDUCATIONAL PROCESS

    OpenAIRE

    Ludmila P. Bobrik; Leonid V. Markin

    2013-01-01

    The current state of technical university students' engineering grounding and the place of the “Engineering graphics” course at MAI are analyzed in this paper. Problems of the bachelor's degree and the experience of creating a graduating specialty based on the «Engineering graphics» department are also considered.

  8. AN APPROACH TO EFFICIENT FEM SIMULATIONS ON GRAPHICS PROCESSING UNITS USING CUDA

    Directory of Open Access Journals (Sweden)

    Björn Nutti

    2014-04-01

    Full Text Available The paper presents a highly efficient way of simulating the dynamic behavior of deformable objects by means of the finite element method (FEM), with computations performed on graphics processing units (GPUs). The presented implementation reduces memory-access bottlenecks by grouping the necessary data per node pair, in contrast to the classical per-element arrangement. This strategy avoids memory access patterns that are not suitable for the GPU memory architecture. Furthermore, the presented implementation takes advantage of the underlying sparse-block-matrix structure, and it has been demonstrated how to avoid potential bottlenecks in the algorithm. To achieve plausible deformation behavior under large local rotations, the objects are modeled by means of a simplified co-rotational FEM formulation.
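
    The per-node-pair regrouping can be pictured as inverting the usual element-centric storage: for every (i, j) node pair, collect the elements contributing to that sparse-matrix block, so a GPU thread can accumulate one pair's entries from contiguous data. A toy sketch with an invented two-element mesh:

        import numpy as np

        elements = np.array([[0, 1, 2], [1, 2, 3]])        # two triangles sharing an edge
        pairs = {}
        for e, nodes in enumerate(elements):
            for i in nodes:
                for j in nodes:
                    pairs.setdefault((int(i), int(j)), []).append(e)

        # 'pairs' now maps each sparse-block index (i, j) to the elements touching it;
        # pair (1, 2) collects both elements, so its entry is assembled in one place.
        print(pairs[(1, 2)])                                # -> [0, 1]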

  9. Fast ray-tracing of human eye optics on Graphics Processing Units.

    Science.gov (United States)

    Wei, Qi; Patkar, Saket; Pai, Dinesh K

    2014-05-01

    We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective in modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth of field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show application of the technique to patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. MASSIVELY PARALLEL LATENT SEMANTIC ANALYSES USING A GRAPHICS PROCESSING UNIT

    Energy Technology Data Exchange (ETDEWEB)

    Cavanagh, J.; Cui, S.

    2009-01-01

    Latent Semantic Analysis (LSA) aims to reduce the dimensions of large term-document datasets using Singular Value Decomposition. However, with the ever-expanding size of datasets, current implementations are not fast enough to quickly and easily compute the results on a standard PC. A graphics processing unit (GPU) can solve some highly parallel problems much faster than a traditional sequential processor or central processing unit (CPU). Thus, a deployable system using a GPU to speed up large-scale LSA processes would be a much more effective choice (in terms of cost/performance ratio) than using a PC cluster. Due to the GPU's application-specific architecture, harnessing the GPU's computational prowess for LSA is a great challenge. We presented a parallel LSA implementation on the GPU, using NVIDIA® Compute Unified Device Architecture and Compute Unified Basic Linear Algebra Subprograms software. The performance of this implementation is compared to a traditional LSA implementation on a CPU using an optimized Basic Linear Algebra Subprograms library. After implementation, we discovered that the GPU version of the algorithm was twice as fast for large matrices (1000 × 1000 and above) that had dimensions not divisible by 16. For large matrices that did have dimensions divisible by 16, the GPU algorithm ran five to six times faster than the CPU version. The large variation is due to architectural benefits the GPU offers for matrices divisible by 16. It should be noted that the overall speeds of the CPU version did not vary from the norm when the matrix dimensions were divisible by 16. Further research is needed in order to produce a fully implementable version of LSA. With that in mind, the research we presented shows that the GPU is a viable option for increasing the speed of LSA, in terms of cost/performance ratio.
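
    Algorithmically, LSA is a truncated SVD of the term-document matrix, and the GPU version dispatches the same factorization to CUDA linear-algebra routines rather than changing the method. A CPU-side NumPy sketch with synthetic counts:

        import numpy as np

        A = np.random.poisson(1.0, size=(1000, 400)).astype(float)  # terms x documents

        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        k = 50                                      # latent dimensions to keep
        docs_latent = (np.diag(s[:k]) @ Vt[:k]).T   # documents in the reduced space
        # similarity queries are then cosine distances between rows of docs_latent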

  11. General aviation design synthesis utilizing interactive computer graphics

    Science.gov (United States)

    Galloway, T. L.; Smith, M. R.

    1976-01-01

    Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.

  12. COMPUTER GRAPHICS IN ENGINEERING GRAPHICS DEPARTMENT OF MOSCOW AVIATION INSTITUTE EDUCATIONAL PROCESS

    Directory of Open Access Journals (Sweden)

    Ludmila P. Bobrik

    2013-01-01

    Full Text Available The current state of technical university students' engineering grounding and the place of the “Engineering graphics” course at MAI are analyzed in this paper. Problems of the bachelor's degree and the experience of creating a graduating specialty based on the «Engineering graphics» department are also considered.

  13. TEK11 graphics user's guide

    International Nuclear Information System (INIS)

    Stewart, C.R. Jr.; Joubert, W.D.; Overbey, D.R.; Stewart, K.A.

    1978-10-01

    The TEK11 graphics library was written for use on PDP-11 minicomputers running the RT-11 operating system to drive Tektronix 4010 graphics display terminals. Library subroutines are coded in FORTRAN and assembly language. The library includes routines to draw axes, either linear or semilog, to plot data in terms of logical values without first scaling to screen coordinates, to label graphs, and to plot in a maximum of four regions on the screen. Modes of plotting may be point plot with any character at the point, vector plot, or bar plot. Two features, automatic scaling and windowing, permit the researcher to use computer graphics without spending time first to learn about scaling or ''Tek points'' and preparing long parameter lists for subroutines. Regions on the screen are defined by specifying minima and maxima logical coordinates, i.e., 0 K or milliseconds, and a region number. After definition, a region may be activated for plotting by calling REGN with the region number as an argument

  14. The missing graphical user interface for genomics.

    Science.gov (United States)

    Schatz, Michael C

    2010-01-01

    The Galaxy package empowers regular users to perform rich DNA sequence analysis through a much-needed and user-friendly graphical web interface. See research article: http://genomebiology.com/2010/11/8/R86. RESEARCH HIGHLIGHT: With the advent of affordable and high-throughput DNA sequencing, sequencing is becoming an essential component in nearly every genetics lab. These data are being generated to probe sequence variations, to understand transcribed, regulated or methylated DNA elements, and to explore a host of other biological features across the tree of life and across a range of environments and conditions. Given this deluge of data, novices and experts alike are facing the daunting challenge of trying to analyze the raw sequence data computationally. With so many tools available and so many assays to analyze, how can one be expected to stay current with the state of the art? How can one be expected to learn to use each tool and construct robust end-to-end analysis pipelines, all while ensuring that input formats, command-line options, sequence databases and program libraries are set correctly? Finally, once the analysis is complete, how does one ensure the results are reproducible and transparent for others to scrutinize and study? In an article published in Genome Biology, Jeremy Goecks, Anton Nekrutenko, James Taylor and the rest of the Galaxy Team (Goecks et al. [1]) make a great advance towards resolving these critical questions with the latest update to their Galaxy Project. The ambitious goal of Galaxy is to empower regular users to carry out their own computational analysis without having to be an expert in computational biology or computer science. Galaxy adds a desperately needed graphical user interface to genomics research, making data analysis universally accessible in a web browser, and freeing users from the minutiae of archaic command-line parameters, data formats and scripting languages. Data inputs and computational steps are selected from

  15. Computer-aided digitization of graphical mass flow data from the 1/5-scale Mark I BWR pressure suppression experiment

    International Nuclear Information System (INIS)

    Holman, G.S.; McCauley, E.W.

    1979-01-01

    Periodically in the analysis of engineering data, it becomes necessary to use graphical output as the solitary source of accurate numerical data for use in subsequent calculations. Such was our experience in the extended analysis of data from the 1/5-scale Mark I boiling water reactor pressure suppression experiment (PSE). The original numerical results of extensive computer calculations performed at the time of the actual PSE tests and required for the later extended analysis program had not been retained as archival records. We were, therefore, required to recover the previously calculated data, either by a complete recalculation or from available computer graphics records. Time constraints suggested recovery from the graphics records as the more viable approach. This report describes two different approaches to recovery of digital data from graphics records. One, combining hard and software techniques immediately available to us at LLL, proved to be inadequate for our purposes. The other approach required the development of pure software techniques that interfaced with LLL computer graphics to unpack digital coordinate information directly from graphics files. As a result of this effort, we were able to recover the required data with no significant loss in the accuracy of the original calculations

  16. A computer graphics display technique for the examination of aircraft design data

    Science.gov (United States)

    Talcott, N. A., Jr.

    1981-01-01

    An interactive computer graphics technique has been developed for quickly sorting and interpreting large amounts of aerodynamic data. It utilizes a graphic representation rather than numbers. The geometry package represents the vehicle as a set of panels. These panels are ordered in groups of ascending values (e.g., equilibrium temperatures). The groups are then displayed successively on a CRT, building up to the complete vehicle. A zoom feature allows displaying only the panels with values between certain limits. The addition of color allows a one-time display, thus eliminating the need for a display build-up.

  17. Common Graphics Library (CGL). Volume 1: LEZ user's guide

    Science.gov (United States)

    Taylor, Nancy L.; Hammond, Dana P.; Hofler, Alicia S.; Miner, David L.

    1988-01-01

    Users are introduced to and instructed in the use of the Langley Easy (LEZ) routines of the Common Graphics Library (CGL). The LEZ routines form an application independent graphics package which enables the user community to view data quickly and easily, while providing a means of generating scientific charts conforming to the publication and/or viewgraph process. A distinct advantage for using the LEZ routines is that the underlying graphics package may be replaced or modified without requiring the users to change their application programs. The library is written in ANSI FORTRAN 77, and currently uses a CORE-based underlying graphics package, and is therefore machine independent, providing support for centralized and/or distributed computer systems.

  18. Future of motion graphics and particle systems

    OpenAIRE

    Warambo, Bryan

    2012-01-01

    The purpose of this research is to study the use of particle systems in motion graphics; particle systems are among the most popular graphics tools for animating multiple elements at once. A particle system is a form of procedural animation: as the emitter runs, more particles are generated to create the motion effect. The research also explores the connection between motion graphics and particle systems and the relevance of that connection to the longevity of particle systems as a major post-production element in digital media. Th...

  19. Development of a visualized software for tokamak experiment data processing

    International Nuclear Information System (INIS)

    Cao Jianyong; Ding Xuantong; Luo Cuiwen

    2004-01-01

    With VBA programming in Microsoft Excel, the authors have developed post-processing software for tokamak experimental data. Standard-format data from the HL-1M and HL-2A tokamaks can be read and displayed in Excel, and transmitted directly into the MATLAB workspace for plotting in MATLAB with the software. The authors have also developed data post-processing software in the MATLAB environment, which can read standard-format data, display plots, supply a visual graphical user interface and provide part of the advanced signal processing capability.

  20. Development of a graphic interface for the Ramona-3B code

    International Nuclear Information System (INIS)

    Maldonado D, D.; Santos O, M.A.

    2003-01-01

    In this work, a graphic interface that interprets the data of the Ramona-3B code is presented. Ramona-3B is a computer program that uses text files as input and generates output of the same type. The quantity of information generated is so large that it is always necessary to process it with graphic tools in order to analyze the results of simulations of nuclear plants with boiling water reactors. A modern tool that translates text into graphics automatically and with great versatility yields a graphic interface that facilitates the interpretation of how a BWR nuclear plant behaves. The key to achieving this tool has been a program that reads previously indicated character strings and saves the data in a file, to be manipulated later in the creation of the graphic interface. Easily accessible software is used that supports the processing of a large quantity of data that can later be graphed. Another important function of the interface is that it allows the Ramona input file to be modified through graphic displays and on-screen help, without necessarily going to the input data file. For the design of the graphic interface it was decided first to show the most representative variables of a BWR-type nuclear plant. Matlab was chosen as the platform over several options, such as PHP, LabView or C. The graphs obtained allow monitoring of the plant and control of the selected variables. With this graphic interface it is only necessary to indicate the variable to simulate in order to interpret graphically the behavior of the BWR-type nuclear plant. This tool is of great utility for teaching students who are interested in these nuclear topics. (Author)
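
    The character-string scanning step can be pictured as follows; the variable tag, line layout and sample text are invented for illustration, since the report does not give the Ramona-3B output format:

        import re

        sample_output = """\
        TIME = 0.00E+00  CORE PRESSURE = 7.00E+06
        TIME = 1.00E-01  CORE PRESSURE = 7.02E+06
        TIME = 2.00E-01  CORE PRESSURE = 7.05E+06
        """
        pattern = re.compile(r"TIME\s*=\s*([\d.Ee+-]+)\s+CORE PRESSURE\s*=\s*([\d.Ee+-]+)")
        times, values = zip(*[(float(t), float(p))
                              for t, p in pattern.findall(sample_output)])
        # 'times' and 'values' can now be handed to any plotting front end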

  1. Graphical Diagnosis of Performances in Photovoltaic Systems: A Case Study in Southern Spain

    Directory of Open Access Journals (Sweden)

    Isabel Santiago

    2017-11-01

    Full Text Available The starting point of the operation and maintenance tasks in photovoltaic plants is the continuous monitoring and supervision of their components. The great amount of registered data requires a major improvement in the ways this information is processed and analyzed, so that any potential fault can be detected rapidly without incurring additional costs. In this paper, a procedure for performing a detailed, graphically supported analysis of the operation of photovoltaic installations, based on inverter data and using a self-developed application, is presented. The program carries out automated processing of the registered data, providing access to and visualization of the data by means of color maps. These graphs allow a large volume of data to be represented simultaneously in a readable way, enabling operation and maintenance operators to quickly detect patterns that would require some type of intervention. As a case study, the operation of a grid-connected photovoltaic plant located in southern Spain was studied over a period of three years. The average daily efficiency values of the PV modules and inverters were in the ranges of 7.6–14.6% and 73.5–94%, respectively. Moreover, the presence of shading, as well as the hours and days mainly affected by this issue, was easily detected.
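
    The color maps described can be produced directly from per-inverter logs as an hour-of-day versus day-of-year matrix; a matplotlib sketch with synthetic power data (recurring faults or shading would appear as dark bands at fixed hours):

        import numpy as np
        import matplotlib.pyplot as plt

        days, hours = np.arange(365), np.arange(24)
        H, D = np.meshgrid(hours, days)
        power = np.clip(np.sin(np.pi * (H - 6) / 12), 0, None) \
                * (0.7 + 0.3 * np.sin(2 * np.pi * (D - 80) / 365))   # fake seasonality

        fig, ax = plt.subplots(figsize=(8, 3))
        pcm = ax.pcolormesh(days, hours, power.T, shading="auto", cmap="viridis")
        fig.colorbar(pcm, ax=ax, label="Normalized AC power")
        ax.set_xlabel("Day of year")
        ax.set_ylabel("Hour of day")
        fig.savefig("pv_colormap.png", dpi=120)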

  2. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
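
    The hidden Markov step at the heart of such tools can be sketched as follows; this is a hand-rolled two-state Viterbi decode on a synthetic trace with fixed parameters, not SMART's fitting algorithm, which estimates the model from the data:

```python
# Two-state Gaussian HMM decoded with the Viterbi algorithm on a synthetic
# single-molecule trace. Parameters are fixed by hand for illustration.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic trace: switching between levels 0.2 and 0.8 (placeholder dynamics).
true_states = (rng.random(500) < 0.5).astype(int)
obs = np.where(true_states == 1, 0.8, 0.2) + rng.normal(0, 0.1, 500)

means, sigma = np.array([0.2, 0.8]), 0.1
log_trans = np.log(np.array([[0.98, 0.02], [0.02, 0.98]]))
log_emit = -0.5 * ((obs[:, None] - means) / sigma) ** 2  # up to a constant

T = len(obs)
delta = np.zeros((T, 2))          # best log-probability ending in each state
back = np.zeros((T, 2), dtype=int)
delta[0] = np.log([0.5, 0.5]) + log_emit[0]
for t in range(1, T):
    cand = delta[t - 1][:, None] + log_trans   # cand[i, j]: from i to j
    back[t] = cand.argmax(axis=0)
    delta[t] = cand.max(axis=0) + log_emit[t]

path = np.zeros(T, dtype=int)     # backtrack the most probable state path
path[-1] = delta[-1].argmax()
for t in range(T - 2, -1, -1):
    path[t] = back[t + 1][path[t + 1]]
print("decoded dwell fraction in state 1:", path.mean())
```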

  3. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Full Text Available Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user-defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
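
    A minimal sketch of the underlying conversion, assuming the common Granier (1987) calibration and a simplistic daily-maximum baseline in place of Baseliner's interactive QA/QC:

```python
# Convert thermal dissipation probe temperature differences (dT) to sap flux
# density with the Granier calibration u = 118.99e-6 * K**1.231, where
# K = (dT_max - dT) / dT and dT_max is the zero-flow baseline. Taking the
# daily maximum as the baseline is a simplification of the interactive step.
import numpy as np

def sap_flux_density(dT, samples_per_day=48):
    dT = np.asarray(dT, dtype=float)
    n_days = len(dT) // samples_per_day
    u = np.full_like(dT, np.nan)
    for d in range(n_days):
        sl = slice(d * samples_per_day, (d + 1) * samples_per_day)
        dT_max = np.nanmax(dT[sl])          # assume zero flow at the daily max
        K = (dT_max - dT[sl]) / dT[sl]
        u[sl] = 118.99e-6 * np.clip(K, 0, None) ** 1.231  # m3 m-2 s-1
    return u

# Synthetic half-hourly dT (K): high at night (no flow), dipping by day.
t = np.arange(96) / 48.0
dT = 10 - 3 * np.clip(np.sin(2 * np.pi * (t - 0.25)), 0, None)
print(sap_flux_density(dT)[:5])
```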

  4. ANALYSIS OF MULTIDIMENSIONAL MEDICAL DATA USING PICTOGRAPHICS «CHERNOFF FACES»

    Directory of Open Access Journals (Sweden)

    I. A. Osadchaya

    2014-01-01

    Full Text Available The use of graphics in research not only increases the speed of information transmission and the level of its understanding, but also contributes to the development of intuition and creative thinking, qualities that are important for professionals in any industry. Methods of cognitive graphics significantly extend the capability of specialists in any field of knowledge to identify the most informative parameters when processing extensive databases and solving specific problems, and sometimes to detect radically new facts that change accepted views. Cognitive graphics also forms a separate direction in medicine: visualization of the current state of an object and its characteristic features provides continuous control over the condition of groups of persons or individuals. This work focuses on identifying the psychological and physiological characteristics of patients with various forms of bronchial asthma using methods for the visualization of multidimensional data. Thus, the objects of study are the physiological data of patients with bronchial asthma. The subjects of research are methods of cognitive graphics, namely methods of presenting information in the form of graphic images. The aim of this work is to study the possibilities of applying methods of cognitive graphics to the study of the physiological characteristics of patients with various forms of bronchial asthma. In the end, the work revealed a number of regularities for various forms of bronchial asthma using data visualization methods.
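
    The pictograph idea can be sketched with a hand-drawn face whose parts are driven by the data; the feature-to-face mapping below is invented for illustration and is far simpler than a full Chernoff-face implementation:

```python
# Map each record's feature vector onto facial parameters (head width, eye
# size, mouth curvature) so similar records produce similar-looking faces.
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Circle, Ellipse

def face(ax, x):
    """x: three features scaled to [0, 1] -> head width, eye size, smile."""
    head_w, eye_r, smile = 0.6 + 0.4 * x[0], 0.03 + 0.07 * x[1], x[2] - 0.5
    ax.add_patch(Ellipse((0.5, 0.5), head_w, 0.9, fill=False))
    for cx in (0.38, 0.62):
        ax.add_patch(Circle((cx, 0.62), eye_r, fill=False))
    xs = np.linspace(0.35, 0.65, 50)
    ax.plot(xs, 0.32 + smile * 0.25 * np.sin(np.pi * (xs - 0.35) / 0.3), "k")
    ax.set_xlim(0, 1); ax.set_ylim(0, 1); ax.set_aspect("equal"); ax.axis("off")

rng = np.random.default_rng(2)
patients = rng.random((6, 3))   # six synthetic records, three features each
fig, axes = plt.subplots(1, 6, figsize=(12, 2))
for ax, row in zip(axes, patients):
    face(ax, row)
plt.show()
```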

  5. Connecting Knowledge for Text Construction through the Use of Graphic Organizers

    OpenAIRE

    Reyes, Elsy Camila

    2011-01-01

    This study analyzed how basic level students comprehend short descriptive texts and rewrite their texts through the use of graphic organizers (GOs). The research was built upon the qualitative research paradigm with the inclusion of descriptive and introspective approaches. The study was carried out at a prestigious private school in Bogotá, Colombia, with basic English level II sixth graders. Data was gathered through focus groups, GOs, and students' documents. The results of the study demon...

  6. An interactive graphical tool for exploring sequential dependencies in categorical data

    International Nuclear Information System (INIS)

    Fitzgerald, M.

    1997-01-01

    As monitoring and data storage devices have become cheaper and more readily available, it has become common practice to establish automated monitoring processes which collect enormous amounts of data. For example, in a waste storage facility, waste from several different sources may be combined and stored in a single storage container. Within this unit, many different types of chemical and microbiological reactions may take place over the course of time, not all of which are completely understood. Thus, it is important to monitor the levels of several different chemical compounds within the system, in order to ensure that the waste is being stored safely. The monitoring devices record any anomalous behavior of the system, such as when the presence of a certain chemical compound exceeds some prescribed expectation, the pressure within the container increases beyond a tolerance threshold, the temperature drops more than .5 degree, etc. These monitoring systems may thus collect large quantities of data in fairly short periods of time. The challenge is then to utilize these massive data sets to bring about an understanding of the process and discover potential avenues of intervention. This report describes an interactive graphical tool, written in XLISP-STAT, for exploratory data analysis of dependencies in sequences of categorical data. Both global and local views of the dependency structure can be insightful, and allowing the user the flexibility to change critical parameters and switch between views in a simple, interactive, point-and-click environment can make the task of exploring dependencies among a large number of categories feasible and lead to a better understanding of the sequential properties of the data
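
    A sketch of the kind of global dependency view such a tool offers, using an invented monitoring log: a first-order transition matrix estimated from the event sequence and displayed as a shaded grid:

```python
# Estimate P(next event | current event) from a categorical sequence and
# display it as a shaded matrix; dark cells mark events that tend to follow
# one another. The event labels are invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

events = ["ok", "pressure", "temp", "ok", "ok", "pressure", "chem",
          "pressure", "ok", "temp", "temp", "ok"]     # toy monitoring log
labels = sorted(set(events))
idx = {c: i for i, c in enumerate(labels)}

counts = np.zeros((len(labels), len(labels)))
for a, b in zip(events, events[1:]):
    counts[idx[a], idx[b]] += 1
probs = counts / counts.sum(axis=1, keepdims=True).clip(min=1)

plt.imshow(probs, cmap="Greys", vmin=0, vmax=1)
plt.xticks(range(len(labels)), labels)
plt.yticks(range(len(labels)), labels)
plt.xlabel("next event"); plt.ylabel("current event")
plt.colorbar(label="P(next | current)")
plt.show()
```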

  7. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    Science.gov (United States)

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  8. Quality Characteristics of a Graduate Teacher Education Program in Graphic Communications: Results from a Delphi Research Study.

    Science.gov (United States)

    Clark, Aaron C.; Scales, Alice Y.

    2000-01-01

    Investigates characteristics of a quality program in graphic communications teacher education with the involvement of professionals in the field. Uses the Delphi technique to achieve consensus on the characteristics that they felt comprised a good educational program for future graphics teachers. (Contains 27 references.) (Author/YDS)

  9. Initial Assessment of Parallelization of Monte Carlo Calculation using Graphics Processing Units

    International Nuclear Information System (INIS)

    Choi, Sung Hoon; Joo, Han Gyu

    2009-01-01

    Monte Carlo (MC) simulation is an effective tool for calculating neutron transport in complex geometry. However, because Monte Carlo simulates each neutron behavior one by one, it takes a very long computing time if enough neutrons are used for high precision. Accordingly, methods that reduce the computing time are required. A Monte Carlo code is well suited to parallel calculation, since it simulates the behavior of each neutron independently. The parallelization of Monte Carlo codes, however, has traditionally been done using multiple CPUs. Driven by the global demand for high-quality 3D graphics, the Graphics Processing Unit (GPU) has developed into a highly parallel, multi-core processor. This parallel processing capability of GPUs becomes available to engineering computing once a suitable interface is provided. Recently, NVIDIA introduced CUDA, a general-purpose parallel computing architecture. CUDA is a software environment that allows developers to manage the GPU using C/C++ or other languages. In this work, a GPU-based Monte Carlo code is developed and an initial assessment of its parallel performance is investigated.
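
    The independence that makes this parallelization natural can be seen in a toy one-speed slab-transmission model; the batched NumPy loop below stands in for the per-thread histories a CUDA kernel would run, and the cross sections are illustrative only:

```python
# Every neutron history is independent, so the loop over particles maps
# directly onto GPU threads. This CPU version batches histories with NumPy.
import numpy as np

rng = np.random.default_rng(3)
n, slab = 1_000_000, 5.0            # histories, slab thickness (mean free paths)
sigma_t, absorb_frac = 1.0, 0.3     # total cross section, P(absorption) per collision

x = np.zeros(n)                     # all particles start at the left face
mu = np.ones(n)                     # initial direction cosine: +x
alive = np.ones(n, dtype=bool)
transmitted = np.zeros(n, dtype=bool)

while alive.any():
    i = np.flatnonzero(alive)
    step = -np.log(rng.random(i.size)) / sigma_t     # sampled free flight
    x[i] += mu[i] * step
    escaped_r = x[i] >= slab
    escaped_l = x[i] <= 0.0
    transmitted[i[escaped_r]] = True
    absorbed = rng.random(i.size) < absorb_frac
    alive[i[escaped_r | escaped_l | absorbed]] = False
    survivors = i[~(escaped_r | escaped_l | absorbed)]
    mu[survivors] = rng.uniform(-1, 1, survivors.size)  # isotropic scatter

print("transmission probability ~", transmitted.mean())
```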

  10. Critical frameworks for graphic design: graphic design and visual culture

    OpenAIRE

    Dauppe, Michele-Anne

    2011-01-01

    The paper considers an approach to the study of graphic design which addresses the expanding nature of graphic design in the 21st century and the purposeful application of theory to the subject of graphic design. In recent years graphic design has expanded its domain from the world of print culture (e.g. books, posters) into what is sometimes called screen culture. Everything from a mobile phone to a display in an airport lounge to the A.T.M. carries graphic design. It has become ever more ub...

  11. Research Notes ~ Selecting Research Areas and Research Design Approaches in Distance Education: Process Issues

    Directory of Open Access Journals (Sweden)

    Sudarshan Mishra

    2004-11-01

    Full Text Available The purpose of this paper is to study the process used for selecting research areas and methodological approaches in distance education in India. Experts from the field of distance education in India were interviewed at length, with the aim of collecting qualitative data on opinions on process issues for selecting areas for research, research design, and appropriate methodological approaches in distance education. Data collected from these interviews were subjected to content analysis; triangulation and peer consultation techniques were used for cross-checking and data verification. While the findings and recommendations of this study have limited application in that they can only be used in the specific context outlined in this paper, respondents in this study nonetheless revealed the pressing need for more process-oriented research in examining media and technology, learners and learning, and distance learning evaluation processes. Our research, which yielded interesting empirical findings, also determined that a mixed approach – one that involves both quantitative and qualitative methods – is more appropriate for conducting research in distance education in India. Qualitative evidence from our research also indicates that respondents interviewed felt that emphasis should be placed on interdisciplinary and systemic research, over that of traditional disciplinary research. Research methods such as student self-reporting, extensive and highly targeted interviews, and conversation and discourse analysis were determined to be useful for data collection in this study.

  12. Improving aircraft conceptual design - A PHIGS interactive graphics interface for ACSYNT

    Science.gov (United States)

    Wampler, S. G.; Myklebust, A.; Jayaram, S.; Gelhausen, P.

    1988-01-01

    A CAD interface has been created for the 'ACSYNT' aircraft conceptual design code that permits the execution and control of the design process via interactive graphics menus. This CAD interface was coded entirely with the new three-dimensional graphics standard, the Programmer's Hierarchical Interactive Graphics System. The CAD/ACSYNT system is designed for use by state-of-the-art high-speed imaging work stations. Attention is given to the approaches employed in modeling, data storage, and rendering.

  13. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data.

  14. Integrating macromolecular X-ray diffraction data with the graphical user interface iMosflm.

    Science.gov (United States)

    Powell, Harold R; Battye, T Geoff G; Kontogiannis, Luke; Johnson, Owen; Leslie, Andrew G W

    2017-07-01

    X-ray crystallography is the predominant source of structural information for biological macromolecules, providing fundamental insights into biological function. The availability of robust and user-friendly software to process the collected X-ray diffraction images makes the technique accessible to a wider range of scientists. iMosflm/MOSFLM (http://www.mrc-lmb.cam.ac.uk/harry/imosflm) is a software package designed to achieve this goal. The graphical user interface (GUI) version of MOSFLM (called iMosflm) is designed to guide inexperienced users through the steps of data integration, while retaining powerful features for more experienced users. Images from almost all commercially available X-ray detectors can be handled using this software. Although the program uses only 2D profile fitting, it can readily integrate data collected in the 'fine phi-slicing' mode (in which the rotation angle per image is less than the crystal mosaic spread by a factor of at least 2), which is commonly used with modern very fast readout detectors. The GUI provides real-time feedback on the success of the indexing step and the progress of data processing. This feedback includes the ability to monitor detector and crystal parameter refinement and to display the average spot shape in different regions of the detector. Data scaling and merging tasks can be initiated directly from the interface. Using this protocol, a data set of 360 images with ∼2,000 reflections per image can be processed in ∼4 min.

  15. Design Application Translates 2-D Graphics to 3-D Surfaces

    Science.gov (United States)

    2007-01-01

    Fabric Images Inc., specializing in the printing and manufacturing of fabric tension architecture for the retail, museum, and exhibit/tradeshow communities, designed software to translate 2-D graphics for 3-D surfaces prior to print production. Fabric Images' fabric-flattening design process models a 3-D surface based on computer-aided design (CAD) specifications. The surface geometry of the model is used to form a 2-D template, similar to a flattening process developed by NASA's Glenn Research Center. This template or pattern is then applied in the development of a 2-D graphic layout. Benefits of this process include 11.5 percent time savings per project, less material wasted, and the ability to improve upon graphic techniques and offer new design services. Partners include Exhibitgroup/Giltspur (end-user client: TAC Air, a division of Truman Arnold Companies Inc.), Jack Morton Worldwide (end-user client: Nickelodeon), as well as 3D Exhibits Inc., and MG Design Associates Corp.

  16. Graphic Organizers or Graphic Overviews? Presentation Order Effects with Computer-Based Text

    Science.gov (United States)

    Shaw, Shana; Nihalani, Priya; Mayrath, Michael; Robinson, Daniel H.

    2012-01-01

    It has long been assumed that graphic organizers (GOs) should be presented to students following text as an organizer, rather than preceding text as an overview. Robinson et al. ("Educational Technology Research & Development," 51(4), 25-41, 2003) challenged this assumption by finding support for GOs as an overview. The present study further…

  17. Processing-in-Memory Enabled Graphics Processors for 3D Rendering

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Chenhao; Song, Shuaiwen; Wang, Jing; Zhang, Weigong; Fu, Xin

    2017-02-06

    The performance of 3D rendering on a Graphics Processing Unit, which converts a 3D vector stream into a 2D frame with 3D image effects, significantly impacts users' gaming experience on modern computer systems. Due to the high texture throughput in 3D rendering, main memory bandwidth becomes a critical obstacle to improving overall rendering performance. 3D stacked memory systems such as the Hybrid Memory Cube (HMC) provide opportunities to significantly overcome the memory wall by directly connecting logic controllers to DRAM dies. Based on the observation that texel fetches significantly impact off-chip memory traffic, we propose two architectural designs to enable Processing-In-Memory based GPUs for efficient 3D rendering.

  18. The computer graphics metafile

    CERN Document Server

    Henderson, LR; Shepherd, B; Arnold, D B

    1990-01-01

    The Computer Graphics Metafile deals with the Computer Graphics Metafile (CGM) standard and covers topics ranging from the structure and contents of a metafile to CGM functionality, metafile elements, and real-world applications of CGM. Binary Encoding, Character Encoding, application profiles, and implementations are also discussed. This book is comprised of 18 chapters divided into five sections and begins with an overview of the CGM standard and how it can meet some of the requirements for storage of graphical data within a graphics system or application environment. The reader is then intr

  19. Research reports 'nuclear research' (BMFT-KBK) (1965-1975). Research reports 'data processing' (BMFT-FB DV) (1971-1975)

    International Nuclear Information System (INIS)

    1975-01-01

    The BMFT catalogue, compiled by ZAED, contains a bibliography of research reports in nuclear research (BMFT-KBK) from 1965 to 1975 and reports on data processing (BMFT-FB DV) from 1971 to 1975. (HK) [de]

  20. DSISoft—a MATLAB VSP data processing package

    Science.gov (United States)

    Beaty, K. S.; Perron, G.; Kay, I.; Adam, E.

    2002-05-01

    DSISoft is a public domain vertical seismic profile processing software package developed at the Geological Survey of Canada. DSISoft runs under MATLAB version 5.0 and above and hence is portable between the computer operating systems supported by MATLAB (i.e. Unix, Windows, Macintosh, Linux). The package includes processing modules for reading and writing various standard seismic data formats, data editing, sorting, filtering, and other basic processing tasks. The processing sequence can be scripted, allowing batch processing and easy documentation. A structured format has been developed to ensure future additions to the package are compatible with existing modules. Interactive modules have been created using MATLAB's graphical user interface builder for displaying seismic data, picking first break times, examining frequency spectra, performing f-k filtering, and plotting the trace header information. DSISoft's modular design facilitates the incorporation of new processing algorithms as they are developed. This paper gives an overview of the scope of the software and serves as a guide for the addition of new modules.
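
    The scripted, modular style described above can be sketched generically: each module is a function on a traces-by-samples array, and the processing sequence is an ordered list that doubles as documentation (the modules here are generic stand-ins, not DSISoft's own):

```python
# A modular, scriptable trace-processing pipeline: each module takes and
# returns a (traces x samples) array, and the sequence is logged as it runs.
import numpy as np

def remove_mean(data):
    return data - data.mean(axis=1, keepdims=True)

def bandlimit(data, keep=0.25):
    """Crude low-pass: zero the top (1 - keep) fraction of FFT bins."""
    spec = np.fft.rfft(data, axis=1)
    spec[:, int(keep * spec.shape[1]):] = 0
    return np.fft.irfft(spec, n=data.shape[1], axis=1)

def normalize(data, eps=1e-9):
    return data / (np.abs(data).max(axis=1, keepdims=True) + eps)

sequence = [remove_mean, bandlimit, normalize]   # the "processing script"
traces = np.random.default_rng(4).normal(size=(24, 512))
for module in sequence:
    print("applying", module.__name__)
    traces = module(traces)
```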

  1. Lamb wave propagation modelling and simulation using parallel processing architecture and graphical cards

    International Nuclear Information System (INIS)

    Paćko, P; Bielak, T; Staszewski, W J; Uhl, T; Spencer, A B; Worden, K

    2012-01-01

    This paper demonstrates new parallel computation technology and an implementation for Lamb wave propagation modelling in complex structures. A graphical processing unit (GPU) and computer unified device architecture (CUDA), available in low-cost graphical cards in standard PCs, are used for Lamb wave propagation numerical simulations. The local interaction simulation approach (LISA) wave propagation algorithm has been implemented as an example. Other algorithms suitable for parallel discretization can also be used in practice. The method is illustrated using examples related to damage detection. The results demonstrate good accuracy and effective computational performance of very large models. The wave propagation modelling presented in the paper can be used in many practical applications of science and engineering. (paper)

  2. Scalable Adaptive Graphics Environment (SAGE) Software for the Visualization of Large Data Sets on a Video Wall

    Science.gov (United States)

    Jedlovec, Gary; Srikishen, Jayanthi; Edwards, Rita; Cross, David; Welch, Jon; Smith, Matt

    2013-01-01

    The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of "big data" available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describes a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a single 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD Fire Pro W600 video card with 6 mini display port connections. Six mini display-to-dual DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. The Ubuntu operating system supports the open source Scalable Adaptive Graphics Environment (SAGE) software which provides a common environment, or framework, enabling its users to access, display and share a variety of data-intensive information. This information can be digital-cinema animations, high-resolution images, high-definition video

  3. Scalable Adaptive Graphics Environment (SAGE) Software for the Visualization of Large Data Sets on a Video Wall

    Science.gov (United States)

    Jedlovec, G.; Srikishen, J.; Edwards, R.; Cross, D.; Welch, J. D.; Smith, M. R.

    2013-12-01

    The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of 'big data' available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describes a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a single 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD Fire Pro W600 video card with 6 mini display port connections. Six mini display-to-dual DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. The Ubuntu operating system supports the open source Scalable Adaptive Graphics Environment (SAGE) software which provides a common environment, or framework, enabling its users to access, display and share a variety of data-intensive information. This information can be digital-cinema animations, high-resolution images, high-definition video

  4. Analysis of graphical representation among freshmen in undergraduate physics laboratory

    Science.gov (United States)

    Adam, A. S.; Anggrayni, S.; Kholiq, A.; Putri, N. P.; Suprapto, N.

    2018-03-01

    Understanding physics concepts is an important outcome of the physics laboratory for freshmen in the undergraduate program. This includes the ability to interpret the meaning of a graph in order to draw an appropriate conclusion. This particular study analyses graphical representation among freshmen in an undergraduate physics laboratory. The study is empirical, with a quantitative approach. The graphical representations cover three physics topics: the velocity of sound, the simple pendulum, and the spring system. The results show that most of the freshmen (90% of the sample) can make a graph based on the data from the physics laboratory, meaning that they can transfer raw data presented in a table into a physics graph. Most of the freshmen use the proportionality of variables in graph analysis. However, freshmen cannot choose appropriate variables for the graph to gain more information, and cannot analyse the graph to obtain useful information from the slope.

  5. Design considerations and philosophy of a device-independent publications/graphics system

    International Nuclear Information System (INIS)

    Burt, J.S.

    1978-01-01

    Over a period of ten years the National Nuclear Data Center has implemented graphics systems to meet a broad range of user requirements in the areas of interactive graphics, publications, and, to a lesser extent, text editing, graphical data interpretation, and on-line data evaluation. The systems have been designed to support varying levels of user sophistication with respect to programming ability and user knowledge of the hardware involved. An overview is presented of the NNDC's graphics system, which is available to the user via a higher-level language, FORTRAN. The system was designed with layers of software between the user and the device-dependent code. One layer is dedicated to processing the incompatibilities and inconsistencies between such devices as paper plotters, interactive graphics, and FR-80 microfilm/microfiche hardware. Another handles the niceties necessary for finer-quality publications work, e.g., superscripting, subscripting, boldface, variable character/page sizing, rotation, and the use of multiple character sets (e.g., mathematical, Greek, physics), as well as features that allow the user to design special characters. 12 figures

  6. Graphics-based intelligent search and abstracting using Data Modeling

    Science.gov (United States)

    Jaenisch, Holger M.; Handley, James W.; Case, Carl T.; Songy, Claude G.

    2002-11-01

    This paper presents an autonomous text and context-mining algorithm that converts text documents into point clouds for visual search cues. This algorithm is applied to the task of data-mining a scriptural database comprised of the Old and New Testaments from the Bible and the Book of Mormon, Doctrine and Covenants, and the Pearl of Great Price. Results are generated which graphically show the scripture that represents the average concept of the database and the mining of the documents down to the verse level.
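
    A rough sketch of the document-to-point-cloud idea, using TF-IDF vectors projected to 3-D with PCA in place of the authors' Data Modeling algorithm; the toy corpus stands in for the scriptural database:

```python
# Embed each text as a TF-IDF vector, project to 3-D with PCA, and plot the
# resulting point cloud so related passages cluster visually.
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "in the beginning god created the heaven and the earth",
    "the earth was without form and void",
    "blessed are the meek for they shall inherit the earth",
    "love thy neighbour as thyself",
    "thou shalt love the lord thy god",
]
X = TfidfVectorizer().fit_transform(docs)
pts = PCA(n_components=3).fit_transform(X.toarray())

ax = plt.figure().add_subplot(projection="3d")
ax.scatter(pts[:, 0], pts[:, 1], pts[:, 2])
for i, p in enumerate(pts):
    ax.text(p[0], p[1], p[2], f"doc {i}")
plt.show()
```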

  7. SIproc: an open-source biomedical data processing platform for large hyperspectral images.

    Science.gov (United States)

    Berisha, Sebastian; Chang, Shengyuan; Saki, Sam; Daeinejad, Davar; He, Ziqi; Mankar, Rupali; Mayerich, David

    2017-04-10

    There has recently been significant interest within the vibrational spectroscopy community to apply quantitative spectroscopic imaging techniques to histology and clinical diagnosis. However, many of the proposed methods require collecting spectroscopic images that have a similar region size and resolution to the corresponding histological images. Since spectroscopic images contain significantly more spectral samples than traditional histology, the resulting data sets can approach hundreds of gigabytes to terabytes in size. This makes them difficult to store and process, and the tools available to researchers for handling large spectroscopic data sets are limited. Fundamental mathematical tools, such as MATLAB, Octave, and SciPy, are extremely powerful but require that the data be stored in fast memory. This memory limitation becomes impractical for even modestly sized histological images, which can be hundreds of gigabytes in size. In this paper, we propose an open-source toolkit designed to perform out-of-core processing of hyperspectral images. By taking advantage of graphical processing unit (GPU) computing combined with adaptive data streaming, our software alleviates common workstation memory limitations while achieving better performance than existing applications.
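
    The out-of-core pattern can be sketched as follows: keep the cube on disk (e.g., as a NumPy memmap) and stream fixed-size slabs through memory, so peak RAM stays bounded regardless of image size. The file name, cube shape, and per-band mean reduction are placeholders:

```python
# Stream a hyperspectral cube through memory one row-slab at a time instead
# of loading it whole; the same loop structure works for GPU-staged chunks.
import numpy as np

# For a real file the cube would be a read-only memory map, e.g.:
# cube = np.memmap("image.hsi", dtype=np.float32, mode="r",
#                  shape=(2048, 2048, 512))   # (rows, cols, bands)
cube = np.random.default_rng(5).random((64, 64, 16)).astype(np.float32)  # demo stand-in

def band_means(cube, rows_per_chunk=16):
    """Accumulate per-band means one row-chunk at a time."""
    total = np.zeros(cube.shape[2], dtype=np.float64)
    n = 0
    for r in range(0, cube.shape[0], rows_per_chunk):
        chunk = np.asarray(cube[r:r + rows_per_chunk])  # only this slab in RAM
        total += chunk.sum(axis=(0, 1))
        n += chunk.shape[0] * chunk.shape[1]
    return total / n

print(band_means(cube)[:5])
```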

  8. Tracking Research Data Footprints via Integration with Research Graph

    Science.gov (United States)

    Evans, B. J. K.; Wang, J.; Aryani, A.; Conlon, M.; Wyborn, L. A.; Choudhury, S. A.

    2017-12-01

    The researcher of today is likely to be part of a team that will use subsets of data from at least one, if not more, external repositories, and that same data could be used by multiple researchers for many different purposes. At best, the repositories that host this data will know who is accessing their data, but rarely what they are using it for, so the funders of data-collection programs and the repositories that store the data are unlikely to know: 1) which research funding contributed to the collection and preservation of a dataset, and 2) which data contributed to high-impact research and publications. In days of funding shortages there is a growing need to be able to trace the footprint of a data set from the originator that collected the data, to the repository that stores the data, and ultimately to any derived publications. The Research Data Alliance's Data Description Registry Interoperability Working Group (DDRIWG) has addressed this problem through the development of a distributed graph, called Research Graph, which can map each piece of the research-interaction puzzle by building aggregated graphs. It can connect datasets on the basis of co-authorship or other collaboration models such as joint funding and grants, and can connect research datasets, publications, grants and researcher profiles across research repositories and infrastructures such as DataCite and ORCID. National Computational Infrastructure (NCI) in Australia is one of the early adopters of Research Graph. The graphic view and quantitative analysis help NCI track the usage of their national reference data collections, thus quantifying the role that these NCI-hosted data assets play within the funding-researcher-data-publication cycle. The graph can unlock the complex interactions of research projects by tracking the contribution of datasets, the various funding bodies and the downstream data users. The RMap Project is a similar initiative which aims to solve complex relationships among
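
    The aggregated-graph idea reduces to nodes for researchers, grants, datasets, and publications joined by typed edges, so a dataset's footprint is the set of publication nodes reachable from it. A sketch with invented identifiers, using networkx rather than the Research Graph infrastructure itself:

```python
# Build a small research graph and trace which publications a dataset feeds.
import networkx as nx

G = nx.Graph()
G.add_node("orcid:0000-0001-0000-0000", kind="researcher")   # invented IDs
G.add_node("grant:LE-2016-001", kind="grant")
G.add_node("doi:10.0000/dataset.1", kind="dataset")
G.add_node("doi:10.0000/paper.1", kind="publication")

G.add_edge("grant:LE-2016-001", "doi:10.0000/dataset.1", rel="funded")
G.add_edge("orcid:0000-0001-0000-0000", "doi:10.0000/paper.1", rel="authored")
G.add_edge("doi:10.0000/dataset.1", "doi:10.0000/paper.1", rel="cited_by")

# Which publications are connected to this dataset?
dataset = "doi:10.0000/dataset.1"
pubs = [n for n in nx.node_connected_component(G, dataset)
        if G.nodes[n]["kind"] == "publication"]
print(pubs)
```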

  9. Computer graphics from basic to application

    International Nuclear Information System (INIS)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-01

    This book covers the concept of computer graphics, its background and history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and operating environment; 3D graphics, with a summary of the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  10. Computer graphics from basic to application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-15

    This book covers the concept of computer graphics, its background and history, its necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and operating environment; 3D graphics, with a summary of the differences between versions of 3D Studio and their systems; and AutoCAD applications.

  11. The software and algorithms for hyperspectral data processing

    Science.gov (United States)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing techniques are widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The complex has been written in the C++ language using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin is compiled as a Qt plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing by moving average, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method for refining spectral albedo parameters using libRadtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra against similar ones from spectral libraries by a similarity criterion, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages

  12. The challenge of simple graphics for multimodal studies

    DEFF Research Database (Denmark)

    Johannessen, Christian Mosbæk

    2018-01-01

    This article suggests that a Multimodal Social Semiotics (MSS) approach to graphics is severely challenged by structurally very simple texts. Methodologically, MSS favours the level at which elements from discrete modes are integrated grammatically into texts. Because the tradition has this focus, the analytical description of the expression plane of many modes is underdeveloped. In the case of graphics, we have no descriptive or explanatory readiness for graphic form. The article aims to remedy this problem by combining (i) a small inventory of formal dichotomies for graphic shape features at a general...

  13. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  14. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  15. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  16. Representation stigma: Perceptions of tools and processes for design graphics

    Directory of Open Access Journals (Sweden)

    David Barbarash

    2016-12-01

    Full Text Available Practicing designers and design students across multiple fields were surveyed to measure preference and perception of traditional hand and digital tools to determine if common biases for an individual toolset are realized in practice. Significant results were found, primarily with age being a determinant in preference of graphic tools and processes; this finding demonstrates a hard line between generations of designers. Results show that while there are strong opinions in tools and processes, the realities of modern business practice and production gravitate towards digital methods despite a traditional tool preference in more experienced designers. While negative stigmas regarding computers remain, younger generations are more accepting of digital tools and images, which should eventually lead to a paradigm shift in design professions.

  17. Graphics Technology Study. Volume 1. State of Graphics Technology

    Science.gov (United States)

    1986-12-01

    reaction of special heat-sensitive paper when exposed to the heated elements of a thermal print head. Copy quality was poor due to characteristics... Vendors are now attempting to offer smaller units aimed at applications such as typography, graphic arts, CAD, and office automation. The key element in

  18. Learning Graphical Models With Hubs.

    Science.gov (United States)

    Tan, Kean Ming; London, Palma; Mohan, Karthik; Lee, Su-In; Fazel, Maryam; Witten, Daniela

    2014-10-01

    We consider the problem of learning a high-dimensional graphical model in which there are a few hub nodes that are densely connected to many other nodes. Many authors have studied the use of an ℓ1 penalty in order to learn a sparse graph in the high-dimensional setting. However, the ℓ1 penalty implicitly assumes that each edge is equally likely and independent of all other edges. We propose a general framework to accommodate more realistic networks with hub nodes, using a convex formulation that involves a row-column overlap norm penalty. We apply this general framework to three widely-used probabilistic graphical models: the Gaussian graphical model, the covariance graph model, and the binary Ising model. An alternating direction method of multipliers algorithm is used to solve the corresponding convex optimization problems. On synthetic data, we demonstrate that our proposed framework outperforms competitors that do not explicitly model hub nodes. We illustrate our proposal on a webpage data set and a gene expression data set.
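
    For context, the baseline ℓ1 approach that this framework generalizes can be run in a few lines; the sketch below uses scikit-learn's GraphicalLasso on synthetic hub-structured data and does not implement the row-column overlap norm itself:

```python
# Standard l1-penalized Gaussian graphical model: GraphicalLasso recovers a
# sparse inverse covariance whose nonzero off-diagonal entries are the edges.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(6)
# Ground truth: variable 0 acts as a "hub" correlated with all the others.
p, n = 10, 500
hub = rng.normal(size=(n, 1))
X = 0.6 * hub + rng.normal(size=(n, p))

model = GraphicalLasso(alpha=0.1).fit(X)
edges = np.abs(model.precision_) > 1e-4
np.fill_diagonal(edges, False)
print("estimated edges per variable:", edges.sum(axis=1))
```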

  19. Development of a graphical user interface and graphics display for the WIND system

    International Nuclear Information System (INIS)

    O'Steen, B.L.; Fast, J.D.; Suire, B.S.

    1992-01-01

    An advanced graphical user interface (GUI) and improved graphics for transport calculations have been developed for the Weather Information and Display System (WINDS). Two WINDS transport codes, Area Evac and 2DPUF, have been ported from their original VAX/VMS environment to a UNIX operating system and reconfigured to take advantage of the new graphics capability. A developmental prototype of this software is now available on a UNIX based IBM 340 workstation in the Dose Assessment Center (DAC). Automatic transfer of meteorological data from the WINDS VAX computers to the IBM workstation in the DAC has been implemented. This includes both regional National Weather Service (NWS) data and SRS tower data. The above developments fulfill a FY 1993 DOE milestone

  20. Data management for community research projects: A JGOFS case study

    Science.gov (United States)

    Lowry, Roy K.

    1992-01-01

    Since the mid 1980s, much of the marine science research effort in the United Kingdom has been focused into large scale collaborative projects involving public sector laboratories and university departments, termed Community Research Projects. Two of these, the Biogeochemical Ocean Flux Study (BOFS) and the North Sea Project incorporated large scale data collection to underpin multidisciplinary modeling efforts. The challenge of providing project data sets to support the science was met by a small team within the British Oceanographic Data Centre (BODC) operating as a topical data center. The role of the data center was to both work up the data from the ship's sensors and to combine these data with sample measurements into online databases. The working up of the data was achieved by a unique symbiosis between data center staff and project scientists. The project management, programming and data processing skills of the data center were combined with the oceanographic experience of the project communities to develop a system which has produced quality controlled, calibrated data sets from 49 research cruises in 3.5 years of operation. The data center resources required to achieve this were modest and far outweighed by the time liberated in the scientific community by the removal of the data processing burden. Two online project databases have been assembled containing a very high proportion of the data collected. As these are under the control of BODC their long term availability as part of the UK national data archive is assured. The success of the topical data center model for UK Community Research Project data management has been founded upon the strong working relationships forged between the data center and project scientists. These can only be established by frequent personal contact and hence the relatively small size of the UK has been a critical factor. However, projects covering a larger, even international scale could be successfully supported by a

  1. The Case for Graphic Novels

    OpenAIRE

    Steven Hoover

    2012-01-01

    Many libraries and librarians have embraced graphic novels. A number of books, articles, and presentations have focused on the history of the medium and offered advice on building and maintaining collections, but very little attention has been given to the question of how to integrate graphic novels into a library's instructional efforts. This paper will explore the characteristics of graphic novels that make them a valuable resource for librarians who focus on research and information literacy i...

  2. Discovering epistasis in large scale genetic association studies by exploiting graphics cards.

    Science.gov (United States)

    Chen, Gary K; Guo, Yunfei

    2013-12-03

    Despite the enormous investments made in collecting DNA samples and generating germline variation data across thousands of individuals in modern genome-wide association studies (GWAS), progress has been frustratingly slow in explaining much of the heritability in common disease. Today's paradigm of testing independent hypotheses on each single nucleotide polymorphism (SNP) marker is unlikely to adequately reflect the complex biological processes in disease risk. Alternatively, modeling risk as an ensemble of SNPs that act in concert in a pathway, and/or interact non-additively on log risk for example, may be a more sensible way to approach gene mapping in modern studies. Implementing such analyses genome-wide can quickly become intractable, because even modest-size SNP panels on modern genotype arrays (500k markers) pose a combinatorial nightmare, requiring tens of billions of models to be tested for evidence of interaction. In this article, we provide an in-depth analysis of programs that have been developed to explicitly overcome these enormous computational barriers through the use of processors on graphics cards known as Graphics Processing Units (GPU). We include tutorials on GPU technology, which will convey why they are growing in appeal with today's numerical scientists. One obvious advantage is the impressive density of microprocessor cores that are available on only a single GPU. Whereas high-end servers feature up to 24 Intel or AMD CPU cores, the latest GPU offerings from nVidia feature over 2,600 cores. Each compute node may be outfitted with up to 4 GPU devices. Success on GPUs varies across problems. However, epistasis screens fare well due to the high degree of parallelism exposed in these problems. Papers that we review routinely report GPU speedups of over two orders of magnitude (>100x) over standard CPU implementations.
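
    The computation these programs accelerate is, at its core, a scan over all SNP pairs. Below is a toy CPU sketch with a simple correlation-based interaction score; real tools use regression-based tests, and the GPU win comes from spreading the O(p²) pair loop across thousands of cores:

```python
# Score every SNP pair for interaction with the phenotype using a simple
# correlation of the genotype product term, vectorized over inner pairs.
import numpy as np

rng = np.random.default_rng(7)
n, p = 2000, 200                          # individuals, SNPs (tiny demo panel)
G = rng.integers(0, 3, size=(n, p)).astype(float)   # 0/1/2 genotype dosages
y = rng.normal(size=n)
y += 0.5 * G[:, 3] * G[:, 17]             # planted interaction: SNPs 3 and 17

Gc = (G - G.mean(0)) / G.std(0)           # standardize genotypes and phenotype
yc = (y - y.mean()) / y.std()

best, best_pair = 0.0, None
for i in range(p - 1):                    # the loop a GPU would parallelize
    inter = Gc[:, i:i + 1] * Gc[:, i + 1:]            # products with SNPs j > i
    inter = (inter - inter.mean(0)) / (inter.std(0) + 1e-12)
    scores = np.abs(yc @ inter) / n                   # |corr(product, phenotype)|
    j = scores.argmax()
    if scores[j] > best:
        best, best_pair = scores[j], (i, i + 1 + j)

print("top pair:", best_pair, "score:", round(best, 3))
```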

  3. Discovering epistasis in large scale genetic association studies by exploiting graphics cards

    Directory of Open Access Journals (Sweden)

    Gary K Chen

    2013-12-01

    Full Text Available Despite the enormous investments made in collecting DNA samples and generating germline variation data across thousands of individuals in modern genome wide association studies (GWAS), progress has been frustratingly slow in explaining much of the heritability in common disease. Today’s paradigm of testing independent hypotheses on each SNP marker is unlikely to adequately reflect the complex biological processes in disease risk. Alternatively, modeling risk as an ensemble of SNPs that act in concert in a pathway, and/or interact non-additively on log risk for example, may be a more sensible way to approach gene mapping in modern studies. Implementing such analyses genome-wide can quickly become intractable, because even modest size SNP panels on modern genotype arrays (500k markers) pose a combinatorial nightmare, requiring tens of billions of models to be tested for evidence of interaction. In this article, we provide an in-depth analysis of programs that have been developed to explicitly overcome these enormous computational barriers through the use of processors on graphics cards known as Graphics Processing Units (GPU). We include tutorials on GPU technology, which will convey why they are growing in appeal with today’s numerical scientists. One obvious advantage is the impressive density of microprocessor cores that are available on only a single GPU. Whereas high end servers feature up to 24 Intel or AMD CPU cores, the latest GPU offerings from nVidia feature over 2,600 cores. Each compute node may be outfitted with up to 4 GPU devices. Success on GPUs varies across problems. However epistasis screens fare well due to the high degree of parallelism exposed in these problems. Papers that we review routinely report GPU speedups of over two orders of magnitude (>100x) over standard CPU implementations.

  4. Managing Astronomy Research Data: Case Studies of Big and Small Research Projects

    Science.gov (United States)

    Sands, Ashley E.

    2015-01-01

    Astronomy data management refers to all actions taken upon data over the course of the entire research process. It includes activities involving the collection, organization, analysis, release, storage, archiving, preservation, and curation of research data. Astronomers have cultivated data management tools, infrastructures, and local practices to ensure the use and future reuse of their data. However, new sky surveys will soon amass petabytes of data requiring new data management strategies.The goal of this dissertation, to be completed in 2015, is to identify and understand data management practices and the infrastructure and expertise required to support best practices. This will benefit the astronomy community in efforts toward an integrated scholarly communication framework.This dissertation employs qualitative, social science research methods (including interviews, observations, and document analysis) to conduct case studies of data management practices, covering the entire data lifecycle, amongst three populations: Sloan Digital Sky Survey (SDSS) collaboration team members; Individual and small-group users of SDSS data; and Large Synoptic Survey Telescope (LSST) collaboration team members. I have been observing the collection, release, and archiving of data by the SDSS collaboration, the data practices of individuals and small groups using SDSS data in journal articles, and the LSST collaboration's planning and building of infrastructure to produce data.Preliminary results demonstrate that current data management practices in astronomy are complex, situational, and heterogeneous. Astronomers often have different management repertoires for working on sky surveys and for their own data collections, varying their data practices as they move between projects. The multitude of practices complicates coordinated efforts to maintain data.While astronomy expertise proves critical to managing astronomy data in the short, medium, and long term, the larger astronomy

  5. A graphical method for estimating the tunneling factor for mode conversion processes

    International Nuclear Information System (INIS)

    Swanson, D.G.

    1994-01-01

    The fundamental parameter characterizing the strength of any mode conversion process is the tunneling parameter, which is typically determined from a model dispersion relation that is transformed into a differential equation. Here a graphical method is described which gives the tunneling parameter from quantities directly measured from a simple graph of the dispersion relation. The accuracy of the estimate depends only on the accuracy of the measurements.

  6. Data processing

    International Nuclear Information System (INIS)

    Cousot, P.

    1988-01-01

    The 1988 progress report of the Data Processing Laboratory (Polytechnic School, France) is presented. The laboratory's research fields are: semantics, testing and semantic analysis of code, formal calculus, software applications, algorithms, neural networks and VLSI (Very Large Scale Integration). The investigations concerning polynomial rings are performed by means of the standard basis approach. Other research topics include Pascal code, parallel processing, the combinatorial, statistical and asymptotic properties of fundamental data processing tools, signal processing and pattern recognition. The published papers, the conference communications and the theses are also listed [fr]

  7. A handbook of statistical graphics using SAS ODS

    CERN Document Server

    Der, Geoff

    2014-01-01

    An Introduction to Graphics: Good Graphics, Bad Graphics, Catastrophic Graphics and Statistical Graphics; The Challenger Disaster; Graphical Displays; A Little History and Some Early Graphical Displays; Graphical Deception; An Introduction to ODS Graphics; Generating ODS Graphs; ODS Destinations; Statistical Graphics Procedures; ODS Graphs from Statistical Procedures; Controlling ODS Graphics; Controlling Labelling in Graphs; ODS Graphics Editor; Graphs for Displaying the Characteristics of Univariate Data: Horse Racing, Mortality Rates, Forearm Lengths, Survival Times and Geyser Eruptions; Introduction; Pie Chart, Bar Cha

  8. User-Friendly Data Servers for Climate Studies at the Asia-Pacific Data-Research Center (APDRC)

    Science.gov (United States)

    Yuan, G.; Shen, Y.; Zhang, Y.; Merrill, R.; Waseda, T.; Mitsudera, H.; Hacker, P.

    2002-12-01

    The APDRC was recently established within the International Pacific Research Center (IPRC) at the University of Hawaii. The APDRC mission is to increase understanding of climate variability in the Asia-Pacific region by developing the computational, data-management, and networking infrastructure necessary to make data resources readily accessible and usable by researchers, and by undertaking data-intensive research activities that will both advance knowledge and lead to improvements in data preparation and data products. A focus of recent activity is the implementation of user-friendly data servers. The APDRC is currently running a Live Access Server (LAS) developed at NOAA/PMEL to provide access to and visualization of gridded climate products via the web. The LAS also allows users to download the selected data subsets in various formats (such as binary, netCDF and ASCII). Most of the datasets served by the LAS are also served through our OPeNDAP server (formerly DODS), which allows users to directly access the data using their desktop client tools (e.g. GrADS, Matlab and Ferret). In addition, the APDRC is running an OPeNDAP Catalog/Aggregation Server (CAS) developed by Unidata at UCAR to serve climate data and products such as model output and satellite-derived products. These products are often large (> 2 GB) and are therefore stored as multiple files (separated in time or by parameter). The CAS remedies the inconvenience of multiple files and allows access to the whole dataset (or any subset that cuts across the multiple files) via a single request command from any DODS-enabled client software. Once the aggregation of files is configured at the server (CAS), the process of aggregation is transparent to the user. The user only needs to know a single URL for the entire dataset, which is, in fact, stored as multiple files. CAS even allows aggregation of files on different systems and at different locations. Currently, the APDRC is serving NCEP, ECMWF
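
    A minimal sketch of the single-URL access pattern described above, assuming a DAP-enabled build of the netCDF4 Python library on the client side; the URL, variable name and index ranges are hypothetical placeholders, not real APDRC endpoints.

      from netCDF4 import Dataset  # netCDF4 opens OPeNDAP URLs when built with DAP support

      URL = "http://example.org/opendap/aggregated_sst"   # hypothetical aggregated dataset
      ds = Dataset(URL)                  # one URL, even if the server stores many files
      sst = ds.variables["sst"][0:12, 100:110, 200:210]   # server-side subsetting
      print(sst.shape)                   # only the requested slab crosses the network
      ds.close()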

  9. Aggregation, Validation, and Generalization of Qualitative Data - Methodological and Practical Research Strategies Illustrated by the Research Process of an empirically Based Typology.

    Science.gov (United States)

    Weis, Daniel; Willems, Helmut

    2017-06-01

    The article deals with the question of how aggregated data which allow for generalizable insights can be generated from single-case-based qualitative investigations. Two central challenges of qualitative social research are outlined: First, researchers must ensure that the single-case data can be aggregated and condensed so that new collective structures can be detected. Second, they must apply methods and practices that allow for the generalization of the results beyond the specific study. In the following, we demonstrate how and under what conditions these challenges can be addressed in research practice. To this end, the research process of the construction of an empirically based typology is described. A qualitative study, conducted within the framework of the Luxembourg Youth Report, is used to illustrate this process. Specifically, strategies are presented which increase the likelihood of generalizability or transferability of the results, while also highlighting their limitations.

  10. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    Science.gov (United States)

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to utilise the tool effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  11. Critique and Process: Signature Pedagogies in the Graphic Design Classroom

    Science.gov (United States)

    Motley, Phillip

    2017-01-01

    Like many disciplines in design and the visual fine arts, critique is a signature pedagogy in the graphic design classroom. It serves as both a formative and summative assessment while also giving students the opportunity to practice the habits of graphic design. Critiques help students become keen observers of relevant disciplinary criteria;…

  12. Dialog system for automatic data input/output and processing with two BESM-6 computers

    International Nuclear Information System (INIS)

    Belyaev, Y.N.; Gorlov, Y.P.; Makarychev, S.V.; Monakov, A.A.; Shcherbakov, S.A.

    1985-01-01

    This paper presents a system for conducting experiments with fully automatic processing of data from multichannel recorders in dialog mode. The system acquires data at a rate of 2.5 × 10³ readings/s, processes them in real time, and outputs digital and graphical material in a multitasking environment.

  13. Computer Graphics and Administrative Decision-Making.

    Science.gov (United States)

    Yost, Michael

    1984-01-01

    Reduction in prices now makes it possible for almost any institution to use computer graphics for administrative decision making and research. Current and potential uses of computer graphics in these two areas are discussed. (JN)

  14. Research on key technologies of data processing in internet of things

    Science.gov (United States)

    Zhu, Yangqing; Liang, Peiying

    2017-08-01

    The data of the Internet of Things (IoT) are characterized by polymorphism, heterogeneity, large volume and real-time processing requirements. Traditional structured, static batch processing methods no longer meet the data processing requirements of the IoT. This paper studies a middleware that can integrate heterogeneous IoT data, converting different data formats into a unified format. It designs an IoT data processing model based on the Storm stream computing architecture and integrates existing Internet security technology to build a security system for IoT data processing, providing a reference for the efficient transmission and processing of IoT data.
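
    A minimal sketch of the format-unifying step such a middleware performs, with made-up field names and a trivial unified schema; the actual system couples this kind of mapping to a Storm-based streaming pipeline.

      import csv, io, json

      def normalize(record: dict) -> dict:
          """Map a heterogeneous sensor reading onto one unified schema."""
          return {
              "device": record.get("device") or record.get("sensor_id"),
              "ts": int(record.get("ts") or record.get("timestamp")),
              "value": float(record.get("value") or record.get("reading")),
          }

      json_msg = json.loads('{"sensor_id": "t-17", "timestamp": 1690000000, "reading": "21.5"}')
      csv_msg = next(csv.DictReader(io.StringIO("device,ts,value\nt-18,1690000005,22.1")))
      print([normalize(m) for m in (json_msg, csv_msg)])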

  15. Graphic terminal based on storage tube display with microcomputer

    International Nuclear Information System (INIS)

    Leich, H.; Levchanovsky, F.; Nikulnikov, A.; Polyntsev, A.; Prikhodko, V.

    1981-01-01

    This paper describes a graphic terminal in which a microcomputer realizes functions such as the generation of picture elements (points, symbols, vectors), display control, processing of data received from the keyboard and trackball, and communication with a host computer. The terminal has been designed for operation in a local network as well as in autonomous control systems for data acquisition and processing in physical experiments [ru]

  16. Safety Parameters Graphical Interface

    International Nuclear Information System (INIS)

    Canamero, B.

    1998-01-01

    Nuclear power plant data are received at the Operations Center of the Consejo de Seguridad Nuclear in emergency situations. In order to achieve the required interface and to prepare those data for simulation and forecasting with already existing computer codes, a Safety Parameters Graphical Interface (IGPS) has been developed. The system runs in a UNIX environment and uses X Window System capabilities. The received data are stored in such a way that they can be easily used for further analysis and training activities. The system consists of task-oriented modules (processes) which communicate with each other using well-known UNIX mechanisms (signals, sockets and shared memory segments). IGPS conceptually has two different parts: data collection and preparation, and data monitoring. (Author)
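
    A minimal sketch of the shared-memory mechanism named above, using Python's multiprocessing module as a portable stand-in for raw UNIX shared memory segments; the parameter layout is an assumption for illustration only.

      import numpy as np
      from multiprocessing import Process, shared_memory

      def monitor(name, shape):
          shm = shared_memory.SharedMemory(name=name)    # attach to the segment
          params = np.ndarray(shape, dtype=np.float64, buffer=shm.buf)
          print("monitor process sees:", params[:3])     # reads what the collector wrote
          shm.close()

      if __name__ == "__main__":
          shm = shared_memory.SharedMemory(create=True, size=8 * 16)
          params = np.ndarray((16,), dtype=np.float64, buffer=shm.buf)
          params[:] = 0.0
          params[:3] = [290.5, 15.2, 101.3]              # e.g. incoming plant parameters
          p = Process(target=monitor, args=(shm.name, (16,)))
          p.start(); p.join()
          shm.close(); shm.unlink()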

  17. A DDC Bibliography on Optical or Graphic Information Processing (Information Sciences Series). Volume I.

    Science.gov (United States)

    Defense Documentation Center, Alexandria, VA.

    This unclassified-unlimited bibliography contains 183 references, with abstracts, dealing specifically with optical or graphic information processing. Citations are grouped under three headings: display devices and theory, character recognition, and pattern recognition. Within each group, they are arranged in accession number (AD-number) sequence.…

  18. Graphical Geometric and Learning/Optimization-Based Methods in Statistical Signal and Image Processing Object Recognition and Data Fusion

    National Research Council Canada - National Science Library

    Willsky, Alan S

    2008-01-01

    ...: (a) the use of graphical, hierarchical, and multiresolution representations for the development of statistical modeling methodologies for complex phenomena and for the construction of scalable algorithms...

  19. visnormsc: A Graphical User Interface to Normalize Single-cell RNA Sequencing Data.

    Science.gov (United States)

    Tang, Lijun; Zhou, Nan

    2017-12-26

    Single-cell RNA sequencing (RNA-seq) allows the analysis of gene expression with high resolution. The intrinsic defects of this promising technology introduce technical noise into single-cell RNA-seq data, increasing the difficulty of accurate downstream inference. Normalization is a crucial step in single-cell RNA-seq data pre-processing, and SCnorm is an accurate and efficient method that can be used for this purpose. An R implementation of this method is currently available. On one hand, the R package possesses many excellent features from R; on the other hand, it requires R programming ability, which prevents biologists who lack those skills from quickly learning to use it. To make this method more user-friendly, we developed a graphical user interface, visnormsc, for normalization of single-cell RNA-seq data. It is implemented in Python and is freely available at https://github.com/solo7773/visnormsc . Although visnormsc is based on the existing method, it contributes to this field by offering a user-friendly alternative. Its out-of-the-box and cross-platform features make visnormsc easy to learn and use. It is expected to serve biologists by simplifying single-cell RNA-seq normalization.

  20. Using Twitter for Demographic and Social Science Research: Tools for Data Collection and Processing.

    Science.gov (United States)

    McCormick, Tyler H; Lee, Hedwig; Cesare, Nina; Shojaie, Ali; Spiro, Emma S

    2017-08-01

    Despite recent and growing interest in using Twitter to examine human behavior and attitudes, there is still significant room for growth regarding the ability to leverage Twitter data for social science research. In particular, gleaning demographic information about Twitter users (a key component of much social science research) remains a challenge. This article develops an accurate and reliable data processing approach for social science researchers interested in using Twitter data to examine behaviors and attitudes, as well as the demographic characteristics of the populations expressing or engaging in them. Using information gathered from Twitter users who state an intention to not vote in the 2012 presidential election, we describe and evaluate a method for processing data to retrieve demographic information reported by users that is not encoded as text (e.g., details of images) and evaluate the reliability of these techniques. We end by assessing the challenges of this data collection strategy and discussing how large-scale social media data may benefit demographic researchers.

  1. X-ray graphical and thermodynamical study of mercury arsenates

    International Nuclear Information System (INIS)

    Makitova, G. Zh.; Mustafin, E. S.; Kasenov, B. K.

    1999-01-01

    The purposes of this work are the determination of lattice parameters on the basis of X-ray diffraction data and the experimental study of the temperature dependence of the thermal conduction of the mercury arsenates Hg(AsO3)2 and Hg3(AsO4)2. In this work, the elementary cell parameters and the thermal conduction in the range 298.15-625 K were determined for the first time. The formation of equilibrium compositions of the mercury arsenates was confirmed by X-ray phase analysis conducted on a DRON-2.0 unit under Cu K radiation. Differential thermal analysis curves show that Hg(AsO3)2 and Hg3(AsO4)2 melt incongruently at 725 and 790 °C, respectively. The X-ray diffraction patterns of the examined compounds were indexed by the homology method, and on this basis the crystal lattice parameters were determined. The arsenates were then subjected to calorimetric measurements to determine their thermal conduction. It is shown that the thermal conduction of Hg3(AsO4)2 has a maximum at 448 K, decreases down to 473 K, and then increases smoothly. It is supposed that this behavior is related to a second-order phase transformation.

  2. Reverse-engineering graphical innovation: an introduction to graphical regimes

    Directory of Open Access Journals (Sweden)

    Dominic Arsenault

    2013-03-01

    Full Text Available Technological innovation in the video games industry is a rich area of research that has barely been explored as of yet. Gamers are always clamoring for novelty and a remedy to the oft-decried “sequelitis” that “plagues” the industry, while game publishers and platform holders secretly plan a next-gen platform to capture the ever-shifting market. In this light, the importance of graphics cannot be overstated, as it is usually taken for granted in game historiography that “[g]ame graphics were, and to a large extent still are, the main criteria by which advancing video game technology is benchmarked” (Wolf, 2003, p.53).

  3. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
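
    A minimal sketch of the wrap-around idea on synthetic monthly data in Python/matplotlib: month of year maps to angle and year to radius, so seasonal structure lines up across cycles. This illustrates the concept only; it is not the authors' RRose/WATS software.

      import numpy as np
      import matplotlib.pyplot as plt

      rng = np.random.default_rng(1)
      months = np.arange(120)                                 # 10 years of monthly counts
      rate = 100 + 5 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1, 120)

      theta = 2 * np.pi * (months % 12) / 12                  # month of year -> angle
      r = 1 + months / 12 + (rate - rate.mean()) / 50         # year -> radius, data perturb

      ax = plt.subplot(projection="polar")
      ax.plot(theta, r, lw=0.8)
      ax.set_xticks(2 * np.pi * np.arange(12) / 12)
      ax.set_xticklabels(list("JFMAMJJASOND"))
      ax.set_title("Wrap-around time series (synthetic)")
      plt.show()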

  4. Mathematical Graphic Organizers

    Science.gov (United States)

    Zollman, Alan

    2009-01-01

    As part of a math-science partnership, a university mathematics educator and ten elementary school teachers developed a novel approach to mathematical problem solving derived from research on reading and writing pedagogy. Specifically, research indicates that students who use graphic organizers to arrange their ideas improve their comprehension…

  5. The Visualization Toolkit (VTK): Rewriting the rendering code for modern graphics cards

    Science.gov (United States)

    Hanwell, Marcus D.; Martin, Kenneth M.; Chaudhary, Aashish; Avila, Lisa S.

    2015-09-01

    The Visualization Toolkit (VTK) is an open source, permissively licensed, cross-platform toolkit for scientific data processing, visualization, and data analysis. It is over two decades old, originally developed for a very different graphics card architecture. Modern graphics cards feature fully programmable, highly parallelized architectures with large core counts. VTK's rendering code was rewritten to take advantage of modern graphics cards, maintaining most of the toolkit's programming interfaces. This offers the opportunity to compare the performance of old and new rendering code on the same systems/cards. Significant improvements in rendering speeds and memory footprints mean that scientific data can be visualized in greater detail than ever before. The widespread use of VTK means that these improvements will reap significant benefits.
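
    Because the rewrite preserved most public programming interfaces, the familiar source-mapper-actor-renderer pipeline is unchanged for users; a minimal Python scene (assuming the vtk Python package is installed) looks like this.

      import vtk

      sphere = vtk.vtkSphereSource()                # a polygonal data source
      mapper = vtk.vtkPolyDataMapper()              # maps geometry to the render backend
      mapper.SetInputConnection(sphere.GetOutputPort())
      actor = vtk.vtkActor()
      actor.SetMapper(mapper)

      renderer = vtk.vtkRenderer()
      renderer.AddActor(actor)
      window = vtk.vtkRenderWindow()                # the rewritten backend renders this
      window.AddRenderer(renderer)
      window.Render()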

  6. PAW [Physics Analysis Workstation] at Fermilab: CORE-based graphics implementation of HIGZ [High Level Interface to Graphics and Zebra]

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-06-01

    The Physics Analysis Workstation system (PAW) is primarily intended to be the last link in the analysis chain of experimental data. The graphical part of PAW is based on HIGZ (High Level Interface to Graphics and Zebra), which is in turn based on the ISO and ANSI standard Graphics Kernel System (GKS). HIGZ is written in the context of PAW. At Fermilab, the CORE-based graphics system DI-3000, by Precision Visuals Inc., is widely used in the analysis of experimental data. The graphical part of the PAW routines has been totally rewritten and implemented in the Fermilab environment. 3 refs

  7. MGUPGMA: A Fast UPGMA Algorithm With Multiple Graphics Processing Units Using NCCL

    Directory of Open Access Journals (Sweden)

    Guan-Jie Hua

    2017-10-01

    Full Text Available A phylogenetic tree is a visual diagram of the relationships between a set of biological species, which scientists use to analyze many characteristics of the species. The distance-matrix methods, such as the Unweighted Pair Group Method with Arithmetic Mean and Neighbor Joining, construct a phylogenetic tree by calculating pairwise genetic distances between taxa. These methods suffer from computational performance issues, and although several new methods based on high-performance hardware and frameworks have been proposed, the issue persists. In this work, a novel parallel Unweighted Pair Group Method with Arithmetic Mean approach on multiple Graphics Processing Units is proposed to construct a phylogenetic tree from an extremely large set of sequences. The experimental results show that the proposed approach on a DGX-1 server with 8 NVIDIA P100 graphics cards achieves approximately a 3-fold to 7-fold speedup over the implementation of the Unweighted Pair Group Method with Arithmetic Mean on a modern CPU and a single GPU, respectively.
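
    Since UPGMA is average-linkage hierarchical clustering on a distance matrix, a single-CPU reference for the computation the paper distributes across GPUs can be sketched in Python with SciPy; the random feature vectors below merely stand in for pairwise genetic distances.

      import numpy as np
      from scipy.spatial.distance import pdist
      from scipy.cluster.hierarchy import linkage, to_tree

      rng = np.random.default_rng(2)
      profiles = rng.random((6, 20))            # 6 taxa, toy feature vectors
      dists = pdist(profiles)                   # condensed pairwise distance matrix
      tree = linkage(dists, method="average")   # "average" linkage == UPGMA
      print(tree)                               # rows: node_i, node_j, height, cluster size
      print("root height:", to_tree(tree).dist)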

  8. MGUPGMA: A Fast UPGMA Algorithm With Multiple Graphics Processing Units Using NCCL.

    Science.gov (United States)

    Hua, Guan-Jie; Hung, Che-Lun; Lin, Chun-Yuan; Wu, Fu-Che; Chan, Yu-Wei; Tang, Chuan Yi

    2017-01-01

    A phylogenetic tree is a visual diagram of the relationships between a set of biological species, which scientists use to analyze many characteristics of the species. The distance-matrix methods, such as the Unweighted Pair Group Method with Arithmetic Mean and Neighbor Joining, construct a phylogenetic tree by calculating pairwise genetic distances between taxa. These methods suffer from computational performance issues, and although several new methods based on high-performance hardware and frameworks have been proposed, the issue persists. In this work, a novel parallel Unweighted Pair Group Method with Arithmetic Mean approach on multiple Graphics Processing Units is proposed to construct a phylogenetic tree from an extremely large set of sequences. The experimental results show that the proposed approach on a DGX-1 server with 8 NVIDIA P100 graphics cards achieves approximately a 3-fold to 7-fold speedup over the implementation of the Unweighted Pair Group Method with Arithmetic Mean on a modern CPU and a single GPU, respectively.

  9. Optimization Techniques for 3D Graphics Deployment on Mobile Devices

    Science.gov (United States)

    Koskela, Timo; Vatjus-Anttila, Jarkko

    2015-03-01

    3D Internet technologies are becoming essential enablers in many application areas including games, education, collaboration, navigation and social networking. The use of 3D Internet applications with mobile devices provides location-independent access and richer use context, but also performance issues. Therefore, one of the important challenges facing 3D Internet applications is the deployment of 3D graphics on mobile devices. In this article, we present an extensive survey on optimization techniques for 3D graphics deployment on mobile devices and qualitatively analyze the applicability of each technique from the standpoints of visual quality, performance and energy consumption. The analysis focuses on optimization techniques related to data-driven 3D graphics deployment, because it supports off-line use, multi-user interaction, user-created 3D graphics and creation of arbitrary 3D graphics. The outcome of the analysis facilitates the development and deployment of 3D Internet applications on mobile devices and provides guidelines for future research.

  10. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor

    Science.gov (United States)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David

    1987-01-01

    A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets via the implementation of computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS) data-independent environment for computer-graphics data display to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  11. Use of information system data of jet crushing acoustic monitoring for the process management

    Directory of Open Access Journals (Sweden)

    T.M. Bulanaya

    2012-12-01

    Full Text Available Graphic interpretations of the amplitude and frequency of acoustic signals from the jet grinding of loose material are presented. Criteria for managing the process are determined on the basis of acoustic monitoring data from the operating jet mill.

  12. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, which leads users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the data outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc.
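
    For orientation, the core relation a point kernel code evaluates is the uncollided flux S·e^(-μt)/(4πr²), scaled by a buildup factor B and a flux-to-dose conversion factor; the Python sketch below uses illustrative numbers, not coefficients from the cited standards.

      import math

      S = 3.7e10      # source strength [photons/s] (about 1 Ci, monoenergetic)
      mu = 0.06       # linear attenuation coefficient of the shield [1/cm] (assumed)
      t = 10.0        # shield thickness [cm]
      r = 200.0       # source-to-detector distance [cm]
      B = 2.5         # buildup factor for this mu*t (assumed value)
      k = 1.0e-6      # flux-to-dose conversion factor (assumed units)

      flux = S * B * math.exp(-mu * t) / (4 * math.pi * r ** 2)
      print("dose rate ~ %.3e (arbitrary dose units)" % (k * flux))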

  13. Graphic Notation in Music Therapy: A Discussion of What to Notate in Graphic Notation and How

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2009-01-01

    This article presents graphic notations of music and related forms of communication in music therapy contexts, created by different authors and practitioners. Their purposes, objects of description, and the elements of graphic language are reflected upon in a comparative discussion. From this it becomes clear that the aspect of overview is a fundamental one, facilitating the perception of complex data. This also makes it possible to memorise complex data, extending the natural limits of human memory. Discovering hidden aspects in the clinical data, as well as sharing and communicating these aspects, are also important concerns. Among the authors discussed, there is a large variety in both goals and methods. Keywords are proposed to circumscribe moments of possible interest connected to graphic notations. I suggest that the discipline of graphic notation can be useful for the grounding of music therapy…

  14. Development of a graphical interface computer code for reactor fuel reloading optimization

    International Nuclear Information System (INIS)

    Do Quang Binh; Nguyen Phuoc Lan; Bui Xuan Huy

    2007-01-01

    This report presents the results of a project performed in 2007. The aim of the project was to develop a graphical interface computer code that allows refueling engineers to design fuel reloading patterns for a research reactor using a simulated graphical model of the reactor core. In addition, the code can perform refueling optimization calculations based on genetic algorithms as well as simulated annealing. The computer code was verified on a sample problem that relies on operational and experimental data from the Dalat research reactor. This code can play a significant role in in-core fuel management practice at nuclear research reactor centers and in training. (author)
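
    A minimal sketch of the genetic-algorithm loop such a code can run over reloading patterns, with a made-up fitness function standing in for the core-physics evaluation; every name and number here is an illustrative assumption.

      import random

      random.seed(5)
      N = 12                                  # toy core with 12 fuel positions

      def fitness(pattern):
          # stand-in objective: push high-index (fresh) assemblies away from the centre
          return -sum(p * (N // 2 - abs(i - N // 2)) for i, p in enumerate(pattern))

      def crossover(a, b):
          cut = random.randrange(1, N)        # order crossover keeps a valid permutation
          return a[:cut] + [g for g in b if g not in a[:cut]]

      pop = [random.sample(range(N), N) for _ in range(30)]   # random loading patterns
      for generation in range(50):
          pop.sort(key=fitness, reverse=True)
          elite = pop[:10]
          pop = elite + [crossover(random.choice(elite), random.choice(elite))
                         for _ in range(20)]
      print("best pattern:", pop[0], "fitness:", fitness(pop[0]))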

  15. mcaGUI: microbial community analysis R-Graphical User Interface (GUI).

    Science.gov (United States)

    Copeland, Wade K; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M E; Zhou, Xia; Williams, Christopher J; Forney, Larry J; Abdo, Zaid

    2012-08-15

    Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance tables and perform analyses specific to their needs. This GUI provides a flexible modular platform, expandable to include other statistical tools for microbial community analysis in the future. The mcaGUI package and source are freely available as part of Bioconductor at http://www.bioconductor.org/packages/release/bioc/html/mcaGUI.html

  16. Graphics-based nuclear facility modeling and management

    International Nuclear Information System (INIS)

    Rod, S.R.

    1991-07-01

    Nuclear waste management facilities are characterized by their complexity, many unprecedented features, and numerous competing design requirements. This paper describes the development of comprehensive descriptive databases and three-dimensional models of nuclear waste management facilities and applies the database/model to an example facility. The important features of the facility database/model are its abilities to (1) process large volumes of site data, plant data, and nuclear material inventory data in an efficient, integrated manner; (2) produce many different representations of the data to fulfill information needs as they arise; (3) create a complete three-dimensional solid model of the plant with all related information readily accessible; and (4) support complete, consistent inventory control and plant configuration control. While the substantive heart of the system is the database, graphic visualization of the data vastly improves the clarity of the information presented. Graphic representations are a convenient framework for the presentation of plant and inventory data, allowing all types of information to be readily located and presented in a manner that is easily understood. 2 refs., 5 figs., 1 tab

  17. A Codesign Case Study in Computer Graphics

    DEFF Research Database (Denmark)

    Brage, Jens P.; Madsen, Jan

    1994-01-01

    The paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  18. Accelerating image reconstruction in three-dimensional optoacoustic tomography on graphics processing units.

    Science.gov (United States)

    Wang, Kun; Huang, Chao; Kao, Yu-Jiun; Chou, Cheng-Ying; Oraevsky, Alexander A; Anastasio, Mark A

    2013-02-01

    Optoacoustic tomography (OAT) is inherently a three-dimensional (3D) inverse problem. However, most studies of OAT image reconstruction still employ two-dimensional imaging models. One important reason is because 3D image reconstruction is computationally burdensome. The aim of this work is to accelerate existing image reconstruction algorithms for 3D OAT by use of parallel programming techniques. Parallelization strategies are proposed to accelerate a filtered backprojection (FBP) algorithm and two different pairs of projection/backprojection operations that correspond to two different numerical imaging models. The algorithms are designed to fully exploit the parallel computing power of graphics processing units (GPUs). In order to evaluate the parallelization strategies for the projection/backprojection pairs, an iterative image reconstruction algorithm is implemented. Computer simulation and experimental studies are conducted to investigate the computational efficiency and numerical accuracy of the developed algorithms. The GPU implementations improve the computational efficiency by factors of 1000, 125, and 250 for the FBP algorithm and the two pairs of projection/backprojection operators, respectively. Accurate images are reconstructed by use of the FBP and iterative image reconstruction algorithms from both computer-simulated and experimental data. Parallelization strategies for 3D OAT image reconstruction are proposed for the first time. These GPU-based implementations significantly reduce the computational time for 3D image reconstruction, complementing our earlier work on 3D OAT iterative image reconstruction.

  19. Multi-window dialogue system of data processing for experimental setup VASSILISSA in PAW environment

    International Nuclear Information System (INIS)

    Andreev, A.N.; Vakatov, D.V.; Veselski, M.; Eremin, A.V.; Ivanov, V.V.; Khasanov, A.M.

    1992-01-01

    A multi-window dialogue system for processing data acquired from the experimental setup VASSILISSA is presented. The system provides a friendly user interface for experimental data conversion, selection and preparation for graphic analysis with PAW. 7 refs.; 5 figs.; 1 tab.

  20. Collaborating on a Graphic Medicine Novel

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth

    2018-01-01

    The presentation centers on establishing creative collaborations to support the production of my graphic novel (Family Anecdotes) about mourning and mental health. I explore various challenges of authoring an “autobiofictional" graphic medicine novel – as an arts-based communication researcher, a...

  1. Development of data processing system for the start-up test of FUGEN

    International Nuclear Information System (INIS)

    Nakajima, Ichiro; Kato, Hidemasa

    1981-01-01

    The data processing in the start-up tests of conventional reactors has been carried out by recording data with transient-phenomena recorders (e.g. electromagnetic oscillographs) or analog data recorders. For ''Fugen'', however, rapid, detailed comparison between the test data and analysis results was indispensable, because ''Fugen'' is a new type of reactor for which results from conventional reactors do not necessarily serve as a reference. Therefore, in the start-up test of the ''Fugen'' plant, test data processing and forecast analysis were performed by installing on site a minicomputer capable of independently processing the test data, together with a terminal connected to a large computer by a special communication line. As soon as a test was completed, a comparison of the test data with the forecast analysis was presented on a graphic display (CRT), and the analysis was modified until significant differences no longer appeared between the test data and the analyzed data. In this paper, the system hardware and software are described, and the two functions of forecast analysis and test data processing are explained. The time required for print-out or graphic display from inputting 600 kB of analysis code data using the terminal was 10 to 30 minutes, and evaluation and investigation of the test results could be achieved immediately by data processing with the minicomputer. This was one of the factors that allowed the start-up test to be carried out satisfactorily together with the forecast analysis work. (Wakatsuki, Y.)

  2. OAP- OFFICE AUTOMATION PILOT GRAPHICS DATABASE SYSTEM

    Science.gov (United States)

    Ackerson, T.

    1994-01-01

    The Office Automation Pilot (OAP) Graphics Database system offers the IBM PC user assistance in producing a wide variety of graphs and charts. OAP uses a convenient database system, called a chartbase, for creating and maintaining data associated with the charts, and twelve different graphics packages are available to the OAP user. Each of the graphics capabilities is accessed in a similar manner. The user chooses creation, revision, or chartbase/slide show maintenance options from an initial menu. The user may then enter or modify data displayed on a graphic chart. The cursor moves through the chart in a "circular" fashion to facilitate data entries and changes. Various "help" functions and on-screen instructions are available to aid the user. The user data is used to generate the graphics portion of the chart. Completed charts may be displayed in monotone or color, printed, plotted, or stored in the chartbase on the IBM PC. Once completed, the charts may be put in a vector format and plotted for color viewgraphs. The twelve graphics capabilities are divided into three groups: Forms, Structured Charts, and Block Diagrams. There are eight Forms available: 1) Bar/Line Charts, 2) Pie Charts, 3) Milestone Charts, 4) Resources Charts, 5) Earned Value Analysis Charts, 6) Progress/Effort Charts, 7) Travel/Training Charts, and 8) Trend Analysis Charts. There are three Structured Charts available: 1) Bullet Charts, 2) Organization Charts, and 3) Work Breakdown Structure (WBS) Charts. The Block Diagram available is an N x N Chart. Each graphics capability supports a chartbase. The OAP graphics database system provides the IBM PC user with an effective means of managing data which is best interpreted as a graphic display. The OAP graphics database system is written in IBM PASCAL 2.0 and assembler for interactive execution on an IBM PC or XT with at least 384K of memory, and a color graphics adapter and monitor. Printed charts require an Epson, IBM, OKIDATA, or HP Laser

  3. Three-dimensional photoacoustic tomography based on graphics-processing-unit-accelerated finite element method.

    Science.gov (United States)

    Peng, Kuan; He, Ling; Zhu, Ziqiang; Tang, Jingtian; Xiao, Jiaying

    2013-12-01

    Compared with commonly used analytical reconstruction methods, the frequency-domain finite element method (FEM) based approach has proven to be an accurate and flexible algorithm for photoacoustic tomography. However, the FEM-based algorithm is computationally demanding, especially for three-dimensional cases. To enhance the algorithm's efficiency, in this work a parallel computational strategy is implemented in the framework of the FEM-based reconstruction algorithm using the graphics-processing-unit parallel framework named the "compute unified device architecture" (CUDA). A series of simulation experiments is carried out to test the accuracy and accelerating effect of the improved method. The results obtained indicate that the parallel calculation does not change the accuracy of the reconstruction algorithm, while its computational cost is significantly reduced, by a factor of 38.9, with a GTX 580 graphics card.

  4. A High Performance VLSI Computer Architecture For Computer Graphics

    Science.gov (United States)

    Chin, Chi-Yuan; Lin, Wen-Tai

    1988-10-01

    A VLSI computer architecture consisting of multiple processors is presented in this paper to satisfy the demands of modern computer graphics, e.g. high resolution, realistic animation, and real-time display. All processors share a global memory which is partitioned into multiple banks. Through a crossbar network, data from one memory bank can be broadcast to many processors. Processors are physically interconnected through a hyper-crossbar network (a crossbar-like network). By programming the network, the topology of communication links among processors can be reconfigured to satisfy the specific dataflows of different applications. Each processor consists of a controller, arithmetic operators, local memory, a local crossbar network, and I/O ports to communicate with other processors, memory banks, and a system controller. Operations in each processor are characterized into two modes, i.e. object domain and space domain, to fully utilize the data-independence characteristics of graphics processing. Special graphics features such as 3D-to-2D conversion, shadow generation, texturing, and reflection can be easily handled. With current high-density interconnection (MI) technology, it is feasible to implement a 64-processor system to achieve 2.5 billion operations per second, a performance needed in most advanced graphics applications.

  5. Continuous Learning Graphical Knowledge Unit for Cluster Identification in High Density Data Sets

    Directory of Open Access Journals (Sweden)

    K.K.L.B. Adikaram

    2016-12-01

    Full Text Available Big data are visually cluttered by overlapping data points. Rather than removing, reducing or reformulating overlap, we propose a simple, effective and powerful technique for density cluster generation and visualization, in which point marker (the graphical symbol of a data point) overlap is exploited in an additive fashion in order to obtain bitmap data summaries in which clusters can be identified visually, aided by automatically generated contour lines. In the proposed method, the plotting area is a bitmap and the marker is a shape of more than one pixel. As the markers overlap, the red, green and blue (RGB) colour values of pixels in the shared region are added. Thus, a pixel of a 24-bit RGB bitmap can encode up to 2^24 (over 16 million) overlaps. A higher number of overlaps at the same location makes the colour of that area distinctive, so that it can be identified by the naked eye. A bitmap is a matrix of colour values that can be represented as integers. The proposed method updates this matrix while adding new points; thus, the matrix can be considered an up-to-date knowledge unit of the processed data. Results show the cluster generation, cluster identification, missing and out-of-range data visualization, and outlier detection capabilities of the newly proposed method.
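
    A minimal Python sketch of the additive-overlap principle on a small synthetic bitmap, using a plain integer count matrix instead of the 24-bit RGB encoding; marker size and data are illustrative.

      import numpy as np

      rng = np.random.default_rng(3)
      pts = np.vstack([rng.normal((30, 30), 3, (5000, 2)),   # one dense cluster
                       rng.uniform(0, 100, (2000, 2))])      # background noise

      bitmap = np.zeros((100, 100), dtype=np.uint32)         # the "knowledge unit"
      for x, y in pts:                                       # 2x2-pixel marker per point
          i, j = int(np.clip(y, 0, 99)), int(np.clip(x, 0, 99))
          bitmap[i:i + 2, j:j + 2] += 1                      # overlaps accumulate

      peak = np.unravel_index(bitmap.argmax(), bitmap.shape)
      print("densest pixel:", peak, "overlap count:", int(bitmap[peak]))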

  6. Data processing in cosmic rays at the Institute of Physical and Chemical Research

    International Nuclear Information System (INIS)

    Wada, Masami

    1980-01-01

    Data processing performed by the World Data Center for Cosmic Rays, installed at the Institute of Physical and Chemical Research (IPCR), is reported. The Center was set up as a member of the World Data Centers for Solar and Terrestrial Physics and performs its assigned services. There are several C-level World Data Centers in Japan, and the Data Center for Cosmic Rays, IPCR, is described in detail in the context of cosmic ray research itself. As to the future of the Center, personal opinions and expectations are offered. Thus a glimpse of a century of international cooperative observation and a quarter century of world data center operations is given from the cosmic ray research side. (author)

  7. Graphic Narratives and Cancer Prevention: A Case Study of an American Cancer Society Comic Book.

    Science.gov (United States)

    Krakow, Melinda

    2017-05-01

    As the interest in graphic medicine grows, health communicators have started engaging readers with compelling visual and textual accounts of health and illness, including via comic books. One context where comics have shown promise is cancer communication. This brief report presents an early example of graphic medicine developed by the American Cancer Society. "Ladies … Wouldn't It Be Better to Know?" is a comic book produced in the 1960s to provide the public with lay information about the Pap test for cervical cancer prevention and detection. An analysis of a key narrative attribute, plot development, illustrates the central role that perceived barriers played in this midcentury public health message, a component that remains a consideration of cancer communication design today. This case study of an early graphic narrative identifies promising cancer message features that can be used to address and refute barriers to cervical cancer screening and connects contemporary research with historical efforts in public health communication.

  8. Utero-fetal unit and pregnant woman modeling using a computer graphics approach for dosimetry studies.

    Science.gov (United States)

    Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle

    2009-01-01

    Potential health effects related to electromagnetic field exposure raise public concern, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians, for anatomical accuracy and representativeness.

  9. A course in constructing effective displays of data for pharmaceutical research personnel.

    Science.gov (United States)

    Bradstreet, Thomas E; Nessly, Michael L; Short, Thomas H

    2013-01-01

    Interpreting data and communicating effectively through graphs and tables are requisite skills for statisticians and non-statisticians in the pharmaceutical industry. However, the quality of visual displays of data in the medical and pharmaceutical literature and at scientific conferences is severely lacking. We describe an interactive, workshop-driven, 2-day short course that we constructed for pharmaceutical research personnel to learn these skills. The examples in the course and the workshop datasets are drawn from our professional experience, the scientific literature, and the mass media. During the course, the participants are exposed to and gain hands-on experience with the principles of visual and graphical perception and the design and construction of both graphic and tabular displays of quantitative and qualitative information. After completing the course, the participants are able to construct, revise, critique, and interpret graphic and tabular displays, with a critical eye, according to an extensive set of guidelines. Copyright © 2013 John Wiley & Sons, Ltd.

  10. Development of a prototype graphic simulation program for severe accident training

    International Nuclear Information System (INIS)

    Kim, Ko Ryu; Jeong, Kwang Sub; Ha, Jae Joo

    2000-05-01

    This report describes the development process and related technologies of severe accident graphic simulators, which are required for industrial severe accident management and training. Here, a 'severe accident graphic simulator' means a graphics add-in system to an existing calculation code that can show severe accident phenomena dynamically on computer screens, thereby remedying one of the main shortcomings of existing calculation codes. With graphic simulators it is fairly easy to see the total behavior of a nuclear power plant, which is very difficult to infer from partial, numerical variable information alone. Moreover, the fast processing and control features of a graphic simulator give even a non-expert some capability to predict how a severe accident may advance among several possibilities. By utilizing graphic simulators, we expect operators and TSC members to gain a better understanding of the physical phenomena from the realistic dynamic behavior of the plant. We also expect that severe accident training courses can achieve better training effects using a graphic simulator's control functions and predictive capabilities, and therefore that graphic simulators will be effective decision-aid tools both in severe accident training courses and in real severe accident situations. With these aims in mind, we surveyed related technologies and developed a prototype graphic simulator, and from this development experience we examined the possibility of building a full severe accident graphic simulator. The prototype graphic simulator was developed under IBM PC WinNT environments and is suited to the Uljin 3 and 4 nuclear power plant. When supplied with an adequate severe accident scenario as input, the prototype can provide graphical simulations of the dynamic behavior of plant safety systems. The prototype is composed of several modules: a phenomena display module, a MELCOR data interface module and a graphic database interface module. Main functions of

  11. Examining Data Processing Work as Part of the Scientific Data Lifecycle Comparing Practices Across Four Scientific Research Groups

    OpenAIRE

    Paine, Drew; Lee, Charlotte

    2015-01-01

    Slides from Charlotte P. Lee's presentation at the 2015 iConference on our paper "Examining Data Processing Work as Part of the Scientific Data Lifecycle: Comparing Practices Across Four Scientific Research Groups".

  12. Graphical function mapping as a new way to explore cause-and-effect chains

    Science.gov (United States)

    Evans, Mary Anne

    2016-01-01

    Graphical function mapping provides a simple method for improving communication within interdisciplinary research teams and between scientists and nonscientists. This article introduces graphical function mapping using two examples and discusses its usefulness. Function mapping projects the outcome of one function into another to show the combined effect. Using this mathematical property in a simpler, even cartoon-like, graphical way allows the rapid combination of multiple information sources (models, empirical data, expert judgment, and guesses) in an intuitive visual to promote further discussion, scenario development, and clear communication.
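
    A minimal sketch of the underlying idea in Python: the outcome of one assumed response curve is projected through a second to show the combined cause-and-effect chain; both curves are invented for illustration.

      import numpy as np

      def algae(nutrient_load):          # assumed response curve (expert judgment)
          return 10 * nutrient_load / (1 + nutrient_load)

      def fish(algal_biomass):           # assumed response curve (empirical fit)
          return 100 - 8 * algal_biomass

      load = np.linspace(0, 5, 6)
      combined = fish(algae(load))       # mapping one function through the other
      for n, f in zip(load, combined):
          print("nutrient load %.1f -> fish index %.1f" % (n, f))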

  13. A real-time GNSS-R system based on software-defined radio and graphics processing units

    Science.gov (United States)

    Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki

    2012-04-01

    Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites that broadcast their L-band signals. So far, most GNSS-R receiver architectures have been based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and enabled signal processing in real-time, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massive parallel computing performance, allow the whole signal processing chain to be handled without interfering with the PC's CPU. Thus, this paper describes a GNSS-R system which has been developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests which confirm the anticipated capability of the system.

  14. Tag, You're It: Enhancing Access to Graphic Novels

    Science.gov (United States)

    West, Wendy

    2013-01-01

    Current users of academic libraries are avid readers of graphic novels. These thought-provoking materials are used for leisure reading, in instruction, and for research purposes. Libraries need to take care in providing access to these resources. This study analyzed the cataloging practices and social tagging of a specific list of graphic novel…

  15. Fast, multi-channel real-time processing of signals with microsecond latency using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Rath, N., E-mail: Nikolaus@rath.org; Levesque, J. P.; Mauel, M. E.; Navratil, G. A.; Peng, Q. [Department of Applied Physics and Applied Mathematics, Columbia University, 500 W 120th St, New York, New York 10027 (United States); Kato, S. [Department of Information Engineering, Nagoya University, Nagoya (Japan)

    2014-04-15

    Fast, digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules.

  16. Fast, multi-channel real-time processing of signals with microsecond latency using graphics processing units

    International Nuclear Information System (INIS)

    Rath, N.; Levesque, J. P.; Mauel, M. E.; Navratil, G. A.; Peng, Q.; Kato, S.

    2014-01-01

    Fast, digital signal processing (DSP) has many applications. Typical hardware options for performing DSP are field-programmable gate arrays (FPGAs), application-specific integrated DSP chips, or general purpose personal computer systems. This paper presents a novel DSP platform that has been developed for feedback control on the HBT-EP tokamak device. The system runs all signal processing exclusively on a Graphics Processing Unit (GPU) to achieve real-time performance with latencies below 8 μs. Signals are transferred into and out of the GPU using PCI Express peer-to-peer direct-memory-access transfers without involvement of the central processing unit or host memory. Tests were performed on the feedback control system of the HBT-EP tokamak using forty 16-bit floating point inputs and outputs each and a sampling rate of up to 250 kHz. Signals were digitized by a D-TACQ ACQ196 module, processing done on an NVIDIA GTX 580 GPU programmed in CUDA, and analog output was generated by D-TACQ AO32CPCI modules

  17. Graphic Organizers: Outlets for Your Thoughts.

    Science.gov (United States)

    Ekhaml, Leticia

    1998-01-01

    Graphs, bars, charts, and diagrams have been used by designers, writers, and scientists to communicate. Now, research suggests that graphic organizers benefit teaching and learning. This article describes graphic organizers: sequential, conceptual, hierarchical, cyclical, Venn, fishbone or Ishikawa, squeeze and stretch, why-why, t-chart, KWL…

  18. Real-time track-less Cherenkov ring fitting trigger system based on Graphics Processing Units

    Science.gov (United States)

    Ammendola, R.; Biagioni, A.; Chiozzi, S.; Cretaro, P.; Cotta Ramusino, A.; Di Lorenzo, S.; Fantechi, R.; Fiorini, M.; Frezza, O.; Gianoli, A.; Lamanna, G.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Neri, I.; Paolucci, P. S.; Pastorelli, E.; Piandani, R.; Piccini, M.; Pontisso, L.; Rossetti, D.; Simula, F.; Sozzi, M.; Vicini, P.

    2017-12-01

    The parallel computing power of commercial Graphics Processing Units (GPUs) is exploited to perform real-time ring fitting at the lowest trigger level using information coming from the Ring Imaging Cherenkov (RICH) detector of the NA62 experiment at CERN. To this purpose, direct GPU communication with a custom FPGA-based board has been used to reduce the data transmission latency. The GPU-based trigger system is currently integrated in the experimental setup of the RICH detector of the NA62 experiment, in order to reconstruct ring-shaped hit patterns. The ring-fitting algorithm running on GPU is fed with raw RICH data only, with no information coming from other detectors, and is able to provide more complex trigger primitives with respect to the simple photodetector hit multiplicity, resulting in a higher selection efficiency. The performance of the system for multi-ring Cherenkov online reconstruction obtained during the NA62 physics run is presented.

  19. Using global positioning systems in health research a practical approach to data collection and processing

    DEFF Research Database (Denmark)

    Kerr, Jacqueline; Duncan, Scott; Schipperijn, Jasper

    2011-01-01

    The use of GPS devices in health research is increasingly popular. There are currently no best-practice guidelines for collecting, processing, and analyzing GPS data. The standardization of data collection and processing procedures will improve data quality, allow more-meaningful comparisons acro...

  20. The graphics-based human interface to the DISYS diagnostic/control guidance system at EBR-2

    International Nuclear Information System (INIS)

    Edwards, R.M.; Chavez, C.; Kamarthi, S.; Dharap, S.; Lindsay, R.W.; Staffon, J.

    1990-01-01

    An initial graphics-based interface to the real-time DISYS diagnostic system has been developed using the multi-tasking capabilities of the UNIX operating system and the X Window System Version 11 (X11) Xlib graphics library. This system is interfaced to live plant data at the Experimental Breeder Reactor (EBR-2) for the Argon Cooling System of fuel handling operations and for the steam plant. The interface includes an intelligent process schematic which highlights problematic components and sensors based on the results of the diagnostic computations. If further explanation of a faulted component is required, the user can call up a display of the diagnostic computations presented in a tree-like diagram. Numerical data on the process schematic and the optional diagnostic tree are updated as new real-time data become available. The initial X11-based interface will be further enhanced using VI Corporation DATAVIEWS graphical database software. 5 refs., 6 figs

  1. Interactive graphics for the Macintosh: software review of FlexiGraphs.

    Science.gov (United States)

    Antonak, R F

    1990-01-01

    While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs, if only for its easy-to-use curve-fitting and line-smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.

  2. Data-processing strategies for metabolomics studies

    NARCIS (Netherlands)

    Hendriks, M.M.W.B.; Eeuwijk, van F.A.; Jellema, R.H.; Westerhuis, J.A.; Reijmers, T.H.; Hoefsloot, H.C.J.; Smilde, A.K.

    2011-01-01

    Metabolomics studies aim at a better understanding of biochemical processes by studying relations between metabolites and between metabolites and other types of information (e.g., sensory and phenotypic features). The objectives of these studies are diverse, but the types of data generated and the

  3. INTLIB-6, Graphic Device Interface Library for ENDF/B Processing Codes

    International Nuclear Information System (INIS)

    Dunford, L.

    1999-01-01

    1 - Description of program or function: The graphic subroutine libraries DISSPLA and GRALIB (USCD1211) generally produce output which is independent of the output graphic device. A set of device-dependent interface routines is required to translate the device-independent output to the form required for each graphic device available. The interface library INTLIB provides interface routines for the following output formats: TEKTRONIX (LN03 PLUS, video display terminal); POSTSCRIPT (LN03 PLUS with PostScript, LaserJet III in PostScript mode, video display terminal); REGIS (VT240 and VT1200); HPGL (LaserJet III in HPGL mode); FR80 (COMP80 film, fiche and hard copy)

  4. Active Provenance in Data-intensive Research

    Science.gov (United States)

    Spinuso, Alessandro; Mihajlovski, Andrej; Filgueira, Rosa; Atkinson, Malcolm

    2017-04-01

    management will also be discussed, enabling provenance-driven operations at runtime, regardless of the enactment technologies and connectivity impediments. We propose a framework based on concepts such as provenance clusters and provenance sensors, envisaging new potential for exploiting large quantities of provenance traces at runtime. Finally, the work introduces how the underlying provenance model can be explored with big-data visualization techniques, aiming at producing comprehensive and interactive views on top of large and heterogeneous provenance data. We demonstrate the adoption of alternative visualisation methods, from detailed and localised interactive graphs to radial views, serving different purposes and levels of expertise. Combining provenance types, selective rules, and extensible metadata with reactive clustering opens a new and more versatile role for lineage information in the research life-cycle, thanks to its improved usability. The flexible profiling of the proposed framework aids the human analysis of the process, with the support of advanced and intuitive interactive graphical tools. The Active Provenance methods are discussed in the context of a real implementation for a data-intensive library (dispel4py) and its adoption within use cases for computational seismology, climate studies and generic correlation analysis.

  5. Symptomatic knee disorders in floor layers and graphic designers. A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Jensen Lilli

    2012-09-01

    Full Text Available Abstract Background Previous studies have described an increased risk of developing tibio-femoral osteoarthritis (TF OA), meniscal tears and bursitis among those with a trade as floor layers. The purpose of this study was to analyse symptomatic knee disorders among floor layers, who were highly exposed to kneeling work tasks, compared to graphic designers without knee-demanding work tasks. Methods Data on the Knee injury and Osteoarthritis Outcome Score (KOOS) were collected by questionnaires. In total, 134 floor layers and 120 graphic designers had a bilateral radiographic knee examination to detect TF OA and patella-femoral (PF) OA. A random sample of 92 floor layers and 49 graphic designers had Magnetic Resonance Imaging (MRI) of both knees to examine meniscal tears. Means of the subscales of KOOS were compared by analysis of variance. The risk of symptomatic knee disorders, defined as a combination of radiologically detected knee OA or MRI-detected meniscal tears together with a low KOOS score, was estimated for floor layers by logistic regression with 95% confidence intervals (CI), adjusted for age, body mass index, traumas, and knee-straining sports activities. Symptomatic knee OA or meniscal tears were thus defined as a combination of low KOOS scores and radiographic or MRI pathology. Results Symptomatic TF OA and medial meniscal tears were found in floor layers compared to graphic designers, with odds ratios 2.6 (95% CI 0.99-6.9) and 2.04 (95% CI 0.77-5.5), respectively. There were no differences in PF OA. Floor layers scored significantly lower on all KOOS subscales compared to graphic designers. Significantly lower scores on the KOOS subscales were also found for radiographic TF and PF OA regardless of trade, but not for meniscal tears. Conclusions The study showed an overall increased risk of developing symptomatic TF OA in a group of floor layers with a substantial amount of kneeling work positions. Prevention would be appropriate to reduce the
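
    The adjusted odds ratios reported above come from a logistic regression of the symptomatic-knee indicator on trade plus the listed covariates (age, BMI, trauma, sports). As a hedged sketch of that general analysis shape only, the data frame and every column name below are invented for illustration, not the study's variables:

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 254  # 134 floor layers + 120 graphic designers
        df = pd.DataFrame({
            "symptomatic": rng.integers(0, 2, n),  # OA/meniscal pathology + low KOOS
            "floor_layer": rng.integers(0, 2, n),  # 1 = floor layer, 0 = graphic designer
            "age": rng.normal(50, 8, n),
            "bmi": rng.normal(26, 4, n),
            "trauma": rng.integers(0, 2, n),
            "sports": rng.integers(0, 2, n),
        })

        fit = smf.logit("symptomatic ~ floor_layer + age + bmi + trauma + sports", df).fit()
        print(np.exp(fit.params))      # adjusted odds ratios
        print(np.exp(fit.conf_int()))  # 95% confidence intervals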

  6. Systems Biology Graphical Notation: Entity Relationship language Level 1 Version 2

    Directory of Open Access Journals (Sweden)

    Sorokin Anatoly

    2015-06-01

    Full Text Available The Systems Biology Graphical Notation (SBGN) is an international community effort for standardized graphical representations of biological pathways and networks. The goal of SBGN is to provide unambiguous pathway and network maps for readers with different scientific backgrounds as well as to support efficient and accurate exchange of biological knowledge between different research communities, industry, and other players in systems biology. Three SBGN languages, Process Description (PD), Entity Relationship (ER) and Activity Flow (AF), allow for the representation of different aspects of biological and biochemical systems at different levels of detail.

  7. ABOUT THE ROMANIAN SOCIETY FOR ENGINEERING GRAPHICS

    Directory of Open Access Journals (Sweden)

    SIMION Ionel

    2015-06-01

    Full Text Available SORGING is a non-profit, non-governmental society, open to all professionals interested in Engineering Graphics and Design. It aims to promote research, development and innovation activities, together with the dissemination of best practices and assistance for educational purposes. In this paper, the research and educational activities of the Romanian Society for Engineering Graphics are briefly reviewed.

  8. Visual Showcase: An Illustrative Data Graphic in an 18th-19th Century Style

    OpenAIRE

    Dragicevic, Pierre; Bach, Benjamin; Dufournaud, Nicole; Huron, Samuel; Isenberg, Petra; Jansen, Yvonne; Perin, Charles; Spritzer, André; Vuillemot, Romain; Willett, Wesley; Isenberg, Tobias

    2013-01-01

    Extended abstract and exhibition piece; International audience; We exhibit a data graphic poster that emulates the style of historic hand-made visualizations of the 18th-19th century. Our visualization uses real data and employs style elements such as an emulation of ink lines, hatching and cross-hatching, appropriate typesetting, and a unique style of computer-assisted facial drawings.

  9. An X window based graphics user interface for radiation information processing system developed with object-oriented programming technology

    International Nuclear Information System (INIS)

    Gao Wenhuan; Fu Changqing; Kang Kejun

    1993-01-01

    X Window is a network-oriented and network-transparent windowing system, now dominant in the Unix domain. Object-oriented programming technology can remarkably improve the extensibility of a software system. An introduction to graphics user interfaces is given, and it is briefly described how to develop a graphics user interface for a radiation information processing system with object-oriented programming technology, based on X Window and independent of the application.

  10. Graphical technologies, innovation and aesthetics in the video game industry: a case study of the shift from 2d to 3d graphics in the 1990s

    Directory of Open Access Journals (Sweden)

    Dominic Arsenault

    2013-03-01

    Full Text Available This paper provides an overview of a research project currently in progress at the Université de Montréal (Québec, Canada). Funded by the FQRSC (Fonds de recherche Québec – Société et Culture / Quebec Fund for Research – Society and Culture) for a three-year period (from May 2012 to May 2015), the project studies the transition from 2D to 3D graphics in gaming during the 1990s.

  11. Barriers to data quality resulting from the process of coding health information to administrative data: a qualitative study.

    Science.gov (United States)

    Lucyk, Kelsey; Tang, Karen; Quan, Hude

    2017-11-22

    Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality and that chart information is accurately coded to this end. Our objective is to explore the potential barriers to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high-quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high-quality coding in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.

  12. Accelerating large-scale protein structure alignments with graphics processing units

    Directory of Open Access Journals (Sweden)

    Pang Bin

    2012-02-01

    Full Text Available Abstract Background Large-scale protein structure alignment, an indispensable tool in structural bioinformatics, poses a tremendous challenge to computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings We present ppsAlign, a parallel protein structure alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign can take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues of protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massively parallel computing power of GPUs.

  13. Graphical interpretation of numerical model results

    International Nuclear Information System (INIS)

    Drewes, D.R.

    1979-01-01

    Computer software has been developed to produce high quality graphical displays of data from a numerical grid model. The code uses an existing graphical display package (DISSPLA) and overcomes some of the problems of both line-printer output and traditional graphics. The software has been designed to be flexible enough to handle arbitrarily placed computation grids and a variety of display requirements

  14. A CAMAC display module for fast bit-mapped graphics

    International Nuclear Information System (INIS)

    Abdel-Aal, R.E.

    1992-01-01

    In many data acquisition and analysis facilities for nuclear physics research, utilities for the display of two-dimensional (2D) images and spectra on graphics terminals suffer from low speed, poor resolution, and limited accuracy. Development of CAMAC bit-mapped graphics modules for this purpose has been discouraged in the past by the large device count needed and the long times required to load the image data from the host computer into the CAMAC hardware, particularly since many such facilities have been designed to support fast DMA block transfers only for data acquisition into the host. This paper describes the design and implementation of a prototype CAMAC graphics display module with a resolution of 256x256 pixels at eight colours, for which all components can be easily accommodated in a single-width package. A hardware technique is employed which reduces the number of programmed CAMAC data transfer operations needed for writing 2D images into the display memory by approximately an order of magnitude, with attendant improvements in display speed and CPU time consumption. Hardware and software details are given together with sample results. Information on the performance of the module in a typical VAX/MBD data acquisition environment is presented, including data on the mutual effects of simultaneous data acquisition traffic. Suggestions are made for further improvements in performance. (orig.)

  15. Resurfacing Graphics

    Directory of Open Access Journals (Sweden)

    Prof. Patty K. Wongpakdee

    2013-06-01

    Full Text Available “Resurfacing Graphics” deals with the subject of unconventional design, with the purpose of engaging the viewer to experience the graphics beyond paper’s passive surface. Unconventional designs serve to reinvigorate people, whose senses are dulled by the typical printed graphics which bombard them each day. Today’s cutting-edge designers, illustrators and artists utilize graphics in a unique manner that allows for tactile interaction. Such works serve as valuable teaching models and encourage students to do the following: (1) investigate the trans-disciplines of art and technology; (2) appreciate that this approach can have a positive effect on the environment; (3) examine and research other approaches to design communications; and (4) utilize new mediums to stretch the boundaries of artistic endeavor. This paper examines how visual communicators are “Resurfacing Graphics” by using atypical surfaces and materials such as textile, wood, ceramics and even water. Such non-traditional transmissions of visual language serve to demonstrate students’ overreliance on paper as an outdated medium. With this exposure, students can become forward-thinking, eco-friendly, creative leaders by expanding their creative breadth and continuing the perpetual exploration for new ways to make their mark.

  17. Interactive graphics analysis system for nuclear engineering applications

    International Nuclear Information System (INIS)

    Danchak, M.; Moyer, W.R.; Becker, M.

    1973-01-01

    From working with continuous slowing down theory, the need was recognized for a system which allowed rapid calculation of the theoretical flux, instant comparison with experiment and a simple means of iterating on the slowing down parameters to force flux agreement and reflect cross section modification. Similar requirements exist in other areas of nuclear work for streamlining and simplifying the data analysis process. As a solution, a unique interactive graphics analysis system (RIGAS) was devised to allow a user to calculate, display, compare, manipulate and modify his data without requiring any programming on his part. This was accomplished by establishing human primacy, through extensive human factor considerations, and designing a man-machine dialogue which responds to the mere push of a button. This system results in an instrument which maximizes man's decision making capability and the computer's speed to improve graphic communication and data analysis. (14 figs) (U.S.)

  18. Case Study Observational Research: A Framework for Conducting Case Study Research Where Observation Data Are the Focus.

    Science.gov (United States)

    Morgan, Sonya J; Pullon, Susan R H; Macdonald, Lindsay M; McKinlay, Eileen M; Gray, Ben V

    2017-06-01

    Case study research is a comprehensive method that incorporates multiple sources of data to provide detailed accounts of complex research phenomena in real-life contexts. However, current models of case study research do not particularly distinguish the unique contribution observation data can make. Observation methods have the potential to reach beyond other methods that rely largely or solely on self-report. This article describes the distinctive characteristics of case study observational research, a modified form of Yin's 2014 model of case study research that the authors used in a study exploring interprofessional collaboration in primary care. In this approach, observation data are positioned as the central component of the research design. Case study observational research offers a promising approach for researchers in a wide range of health care settings seeking more complete understandings of complex topics, where contextual influences are of primary concern. Future research is needed to refine and evaluate the approach.

  19. An Investigation of Seventh Grade Students’ Performances on Conceptual, Procedural and Graphical Problems Regarding Circles

    Directory of Open Access Journals (Sweden)

    Lütfi İncikabı

    2015-04-01

    Full Text Available The purpose of this study is to determine seventh grade students’ preferences among procedural, conceptual and graphical questions on the subject of circles, to define their success levels in their preferred question types, and to compare students’ success in one question type with their performance in the other question types. The methodology adopted in this research was case study. Based on a criterion-based purposive sampling strategy, 98 middle school students were selected as participants. Data were collected through an achievement test consisting of nine questions (three per question type). The results obtained from the study indicated that students mostly preferred graphical question types. Moreover, the majority of students did not achieve high success levels in their preferred question types. In addition, the students performed better on graphical question types; however, the failure on procedural question types was remarkable. Keywords: Multiple representations, middle school students, mathematics education, circles

  20. HISPLT: A history graphics postprocessor

    International Nuclear Information System (INIS)

    Thompson, S.L.; Kmetyk, L.N.

    1991-09-01

    HISPLT is a graphics postprocessor designed to plot time histories for wave propagation codes. HISPLT is available for CRAY UNICOS, CRAY CTSS, VAX VMS computer systems, and a variety of UNIX workstations. The original HISPLT code employs a database structure that allows the program to be used without modification to process data generated by many wave propagation codes. HISPLT has recently been modified to process time histories for the reactor safety analysis code, MELCOR. This report provides a complete set of input instructions for HISPLT and provides examples of the types of plotted output that can be generated using HISPLT. 6 refs., 8 figs., 5 tabs

  1. High-Throughput Characterization of Porous Materials Using Graphics Processing Units

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jihan; Martin, Richard L.; Rübel, Oliver; Haranczyk, Maciej; Smit, Berend

    2012-05-08

    We have developed a high-throughput graphics processing unit (GPU) code that can characterize a large database of crystalline porous materials. In our algorithm, the GPU is utilized to accelerate energy grid calculations, where the grid values represent interactions (i.e., Lennard-Jones + Coulomb potentials) between gas molecules (i.e., CH4 and CO2) and the material's framework atoms. Using a parallel flood-fill CPU algorithm, inaccessible regions inside the framework structures are identified and blocked based on their energy profiles. Finally, we compute the Henry coefficients and heats of adsorption through statistical Widom insertion Monte Carlo moves in the domain restricted to the accessible space. The code offers significant speedup over a single-core CPU code and allows us to characterize a set of porous materials at least an order of magnitude larger than those considered in earlier studies. For structures selected by such a prescreening algorithm, full adsorption isotherms can be calculated by conducting multiple grand canonical Monte Carlo simulations concurrently within the GPU.
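
    The quantities described reduce to a simple kernel: evaluate the guest-framework interaction energy on a grid, then Boltzmann-average it Widom-style. A serial NumPy reference of that computation (the GPU version parallelizes over grid points; the framework coordinates and force-field parameters below are placeholders, not the paper's values):

        import numpy as np

        rng = np.random.default_rng(0)
        atoms = rng.random((50, 3)) * 20.0  # hypothetical framework atoms (Angstrom)
        eps, sig, q_host = 0.1, 3.0, 0.0    # one generic LJ type, kcal/mol units
        q_guest = 0.0
        beta = 1.0 / 0.593                  # 1/(kB*T) in mol/kcal at ~298 K

        def grid_energy(points):
            # LJ + Coulomb energy of the guest probe at each grid point.
            d = np.linalg.norm(points[:, None, :] - atoms[None, :, :], axis=-1)
            d = np.maximum(d, 1e-6)
            lj = 4.0 * eps * ((sig / d) ** 12 - (sig / d) ** 6)
            coulomb = 332.06 * q_guest * q_host / d  # kcal*Angstrom/(mol*e^2)
            return (lj + coulomb).sum(axis=1)

        # Widom-style estimate: the Boltzmann-factor average over the sampled
        # space is proportional to the Henry coefficient.
        pts = rng.random((100_000, 3)) * 20.0
        henry_proxy = np.exp(-beta * grid_energy(pts)).mean()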

  2. A Fast MHD Code for Gravitationally Stratified Media using Graphical Processing Units: SMAUG

    Science.gov (United States)

    Griffiths, M. K.; Fedun, V.; Erdélyi, R.

    2015-03-01

    Parallelization techniques have been exploited most successfully by the gaming/graphics industry with the adoption of graphical processing units (GPUs), possessing hundreds of processor cores. The opportunity has been recognized by the computational sciences and engineering communities, who have recently harnessed successfully the numerical performance of GPUs. For example, parallel magnetohydrodynamic (MHD) algorithms are important for numerical modelling of highly inhomogeneous solar, astrophysical and geophysical plasmas. Here, we describe the implementation of SMAUG, the Sheffield Magnetohydrodynamics Algorithm Using GPUs. SMAUG is a 1-3D MHD code capable of modelling magnetized and gravitationally stratified plasma. The objective of this paper is to present the numerical methods and techniques used for porting the code to this novel and highly parallel compute architecture. The methods employed are justified by the performance benchmarks and validation results demonstrating that the code successfully simulates the physics for a range of test scenarios including a full 3D realistic model of wave propagation in the solar atmosphere.

  3. Graphics-oriented application language for LASNEX

    International Nuclear Information System (INIS)

    Stringer, L.M.

    1985-01-01

    GOAL, a graphics-oriented application language, was developed to help physicists understand the large amounts of data produced by LASNEX. GOAL combines many aspects of the old LASNEX language, computer graphics, and standard computer languages

  4. Archaeological Vector Graphics and SVG: A case study from Cricklade

    Directory of Open Access Journals (Sweden)

    Holly Wright

    2006-07-01

    Full Text Available Currently, there are a variety of ways to make vector-based information available on the Web, but most are browser- and platform-dependent, proprietary, and unevenly supported (Laaker 2002, 13). Of the various solutions currently being explored by the greater Web community, one of the most promising is called Scalable Vector Graphics (SVG), which is part of the eXtensible Markup Language (XML). SVG was defined by a working group of the World Wide Web Consortium (W3C) and has subsequently become their official recommendation for representing vector graphics on the Web in XML (Eisenberg 2002, 6; Watt 2002, xviii). Because SVG is an XML application, it is freely available, not dependent on a particular browser or platform, and interoperable with other XML applications. While there is no guarantee that SVG will be widely adopted for rendering vector-based information on the Web, development and recommendation by the W3C generally carries a great deal of weight, especially as browser developers move towards less proprietary support of W3C standards. In addition, use of XML continues to grow, so XML-based solutions like SVG should be explored by those interested in presenting vector graphics on the Web (Harold and Means 2002, 3). This discussion explores SVG as a potential tool for archaeologists. It includes some of the ways vector graphics are used in archaeology, and outlines the development and features of SVG, which are then demonstrated in the form of a case study. Large-scale plan and section drawings originally created on Permatrace were digitised by Guy Hopkinson for use in the Internet Archaeology publication Excavations at Cricklade, Wiltshire, 1975, by Jeremy Haslam, designed as an exercise in 'retrospective publication', to illustrate how traditional forms of visual recording might be digitised for online publication. Hopkinson went on to publish his methodology jointly with Internet Archaeology editor, Judith Winters, in Problems with
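
    Because SVG is plain XML text, a digitised plan or section drawing reduces to markup that any script can emit. A minimal Python sketch writing one resolution-independent polyline of the sort such drawings are built from (the coordinates and file name are invented for illustration):

        # Emit a minimal standalone SVG file containing a single digitised polyline.
        points = [(10, 80), (40, 35), (70, 50), (120, 20)]  # hypothetical digitised coordinates
        path = " ".join(f"{x},{y}" for x, y in points)
        svg = (
            '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 130 90">\n'
            f'  <polyline points="{path}" fill="none" stroke="black" stroke-width="0.5"/>\n'
            '</svg>\n'
        )
        with open("section_drawing.svg", "w") as f:
            f.write(svg)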

  5. User-Extensible Graphics Using Abstract Structure,

    Science.gov (United States)

    1987-08-01

    (The available record is a garbled scan; only table-of-contents fragments are recoverable: "The Algol68 model of the graphical abstract structure", "The creation of a PictureDefinition", "The making of a picture from a PictureDefinition", and a phrase describing "data together with the operations that can be performed on that data".)

  6. Development of a prototype graphic simulation program for severe accident training

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ko Ryu; Jeong, Kwang Sub; Ha, Jae Joo

    2000-05-01

    This is a report on the development process and related technologies of severe accident graphic simulators, required in industrial severe accident management and training. Here, a 'severe accident graphic simulator' means a graphics add-in system to existing calculation codes, which can show severe accident phenomena dynamically on computer screens and can therefore remedy one of the main shortcomings of existing calculation codes. With graphic simulators it is fairly easy to see the total behavior of a nuclear power plant, which is very difficult to see from partial numerical variable information alone. Moreover, the fast processing and control features of a graphic simulator give even a non-expert some ability to predict the advancement of a severe accident among several possibilities. By utilizing graphic simulators, we expect operators and TSC members to gain an enhanced understanding of the physical phenomena from the realistic dynamic behavior of the plant. We also expect that severe accident training courses can achieve better training effects using a graphic simulator's control functions and predicting capabilities, and that graphic simulators will therefore be effective decision-aiding tools both in severe accident training courses and in real severe accident situations. With these in mind, we have surveyed related technologies and developed a prototype graphic simulator, and from this development experience we have examined the feasibility of building a severe accident graphic simulator. The prototype graphic simulator was developed in an IBM PC Windows NT environment and is suited to the Uljin 3 and 4 nuclear power plants. When supplied with an adequate severe accident scenario as input, the prototype can provide graphical simulations of the dynamic behavior of plant safety systems. The prototype is composed of several different modules: a phenomena display module, a MELCOR data interface module and a graphic database

  7. Modeling And Simulation As The Basis For Hybridity In The Graphic Discipline Learning/Teaching Area

    Directory of Open Access Journals (Sweden)

    Jana Žiljak Vujić

    2009-01-01

    Full Text Available Only some fifteen years have passed since the scientific graphics discipline was established. In the transition period from the College of Graphics through «Integrated Graphic Technology Studies» to the contemporary Faculty of Graphic Arts at the University of Zagreb, three main periods of development can be noted: digital printing, computer prepress and automatic procedures in postpress packaging production. Computer technology has enabled a change in the methodology of teaching graphics technology and of studying it at the levels of secondary and higher education. The task has been set to create tools for simulating printing processes in order to master the program through a hybrid system consisting of methods that are separate from one another: learning with the help of digital models, and checking in the actual real system. We are setting up a hybrid project for teaching because the overall acquired knowledge is the result of completely different methods. The first method operates at the level of free programs, functioning without consequences. Everything remains as a record in the knowledge database that can be analyzed, statistically processed and repeated with new parameter values of the system being researched. The second method uses the actual real system, where the results prove the value of the new knowledge, and this encourages and stimulates new cycles of hybrid behavior in mastering programs. This is the area where individual learning occurs. The hybrid method allows the possibility of studying actual situations on a computer model, proving them on an actual real model, and entering the area of learning that envisages future development.

  8. Modeling and Simulation as the Basis for Hybridity in the Graphic Discipline Learning/Teaching Area

    Directory of Open Access Journals (Sweden)

    Vilko Ziljak

    2009-11-01

    Full Text Available Only some fifteen years have passed since the scientific graphics discipline was established. In the transition period from the College of Graphics through «Integrated Graphic Technology Studies» to the contemporary Faculty of Graphic Arts at the University of Zagreb, three main periods of development can be noted: digital printing, computer prepress and automatic procedures in postpress packaging production. Computer technology has enabled a change in the methodology of teaching graphics technology and of studying it at the levels of secondary and higher education. The task has been set to create tools for simulating printing processes in order to master the program through a hybrid system consisting of methods that are separate from one another: learning with the help of digital models, and checking in the actual real system. We are setting up a hybrid project for teaching because the overall acquired knowledge is the result of completely different methods. The first method operates at the level of free programs, functioning without consequences. Everything remains as a record in the knowledge database that can be analyzed, statistically processed and repeated with new parameter values of the system being researched. The second method uses the actual real system, where the results prove the value of the new knowledge, and this encourages and stimulates new cycles of hybrid behavior in mastering programs. This is the area where individual learning occurs. The hybrid method allows the possibility of studying actual situations on a computer model, proving them on an actual real model, and entering the area of learning that envisages future development.

  9. Animated GIFs as vernacular graphic design

    DEFF Research Database (Denmark)

    Gürsimsek, Ödül Akyapi

    2016-01-01

    Online television audiences create a variety of digital content on the internet. Fans of television production design produce and share such content to express themselves and engage with the objects of their interest. These digital expressions, which exist in the form of graphics, text, videos, and often a mix of some of these modes, seem to enable participatory conversations by the audience communities that continue over a period of time. One example of such multimodal digital content is the graphic format called the animated GIF (graphics interchange format). This article focuses on content as design, both in the sense that multimodal meaning making is an act of design and in the sense that web-based graphics are designed graphics that are created through a design process. She specifically focuses on the transmedia television production entitled Lost and analyzes the design of animated GIFs...

  10. Smoldyn on graphics processing units: massively parallel Brownian dynamics simulations.

    Science.gov (United States)

    Dematté, Lorenzo

    2012-01-01

    Space is a very important aspect of the simulation of biochemical systems; recently, the need for simulation algorithms able to cope with space has become more and more compelling. Complex and detailed models of biochemical systems need to deal with the movement of single molecules and particles, taking into consideration localized fluctuations, transportation phenomena, and diffusion. A common drawback of spatial models lies in their complexity: models can become very large, and their simulation can be time consuming, especially if we want to capture the system's behavior in a reliable way using stochastic methods in conjunction with a high spatial resolution. In order to deliver on the promise made by systems biology of understanding a system as a whole, we need to scale up the size of the models we are able to simulate, moving from sequential to parallel simulation algorithms. In this paper, we analyze Smoldyn, a widely used algorithm for stochastic simulation of chemical reactions with spatial resolution and single-molecule detail, and we propose an alternative, innovative implementation that exploits the parallelism of Graphics Processing Units (GPUs). The implementation executes the most computationally demanding steps (computation of diffusion, unimolecular and bimolecular reactions, as well as the most common cases of molecule-surface interaction) on the GPU, computing them in parallel for each molecule of the system. The implementation offers good speed-ups and real-time, high-quality graphics output
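
    The per-molecule steps that the GPU implementation parallelizes are individually simple. A serial NumPy sketch of one diffusion update plus a distance-based bimolecular reaction test, with illustrative parameter values (this mirrors the general structure of such simulators, not Smoldyn's exact internals):

        import numpy as np

        rng = np.random.default_rng(0)
        D, dt, binding_radius = 10.0, 1e-5, 0.01  # um^2/s, s, um (illustrative)
        A = rng.random((500, 3))                  # positions of species A (um)
        B = rng.random((500, 3))                  # positions of species B (um)

        def brownian_step(pos):
            # Each coordinate receives a Gaussian kick with variance 2*D*dt.
            return pos + rng.normal(0.0, np.sqrt(2.0 * D * dt), pos.shape)

        A, B = brownian_step(A), brownian_step(B)
        # Bimolecular reaction test: A-B pairs closer than the binding radius react.
        d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)
        reacting_pairs = np.argwhere(d < binding_radius)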

  11. Accelerating adaptive inverse distance weighting interpolation algorithm on a graphics processing unit.

    Science.gov (United States)

    Mei, Gang; Xu, Liangliang; Xu, Nengxiong

    2017-09-01

    This paper focuses on designing and implementing parallel adaptive inverse distance weighting (AIDW) interpolation algorithms using the graphics processing unit (GPU). AIDW is an improved version of standard IDW, which can adaptively determine the power parameter according to the spatial distribution pattern of the data points and achieve more accurate predictions than IDW. In this paper, we first present two versions of the GPU-accelerated AIDW: the naive version, which does not profit from shared memory, and the tiled version, which takes advantage of it. We also implement both versions using two data layouts, structure of arrays and array of aligned structures, in both single and double precision. We then evaluate the performance of parallel AIDW by comparing it with its corresponding serial algorithm on three different machines equipped with the GPUs GT730M, M5000 and K40c. The experimental results indicate that: (i) there is no significant difference in computational efficiency when different data layouts are employed; (ii) the tiled version is always slightly faster than the naive version; and (iii) in single precision the achieved speed-up can be up to 763 (on the GPU M5000), while in double precision the highest speed-up obtained is 197 (on the GPU K40c). To benefit the community, all source code and testing data related to the presented parallel AIDW algorithm are publicly available.
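
    At its core, IDW predicts a query value as a distance-weighted mean, and AIDW additionally makes the weighting exponent depend on the local density of the data points. A serial sketch of both (the density-to-power mapping below is a schematic stand-in, not the paper's exact rule):

        import numpy as np

        def idw(xy, z, query, power):
            # Standard IDW: weighted mean with weights 1/d^power.
            d = np.linalg.norm(xy[None, :, :] - query[:, None, :], axis=-1)
            w = 1.0 / np.maximum(d, 1e-12) ** power
            return (w * z[None, :]).sum(axis=1) / w.sum(axis=1)

        def aidw(xy, z, query, k=8):
            # Adaptive IDW: denser neighborhoods get a smaller power (schematic rule).
            d = np.sort(np.linalg.norm(xy[None, :, :] - query[:, None, :], axis=-1), axis=1)
            knn = d[:, :k].mean(axis=1)  # mean k-nearest-neighbor distance per query
            power = np.interp(knn, [knn.min(), knn.max()], [1.0, 5.0])
            return np.array([idw(xy, z, q[None, :], p)[0] for q, p in zip(query, power)])

        rng = np.random.default_rng(0)
        pts = rng.random((200, 2)); vals = np.sin(6.0 * pts[:, 0]) + pts[:, 1]
        estimates = aidw(pts, vals, rng.random((50, 2)))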

  12. A proposal for the measurement of graphical statistics effectiveness: Does it enhance or interfere with statistical reasoning?

    International Nuclear Information System (INIS)

    Agus, M; Penna, M P; Peró-Cebollero, M; Guàrdia-Olmos, J

    2015-01-01

    Numerous studies have examined students' difficulties in understanding notions related to statistical problems. Some authors have observed that the presentation of distinct visual representations can increase statistical reasoning, supporting the principle of graphical facilitation. Other researchers disagree with this viewpoint, emphasising the impediments related to the use of illustrations that could overload the cognitive system with insignificant data. In this work we aim at comparing probabilistic statistical reasoning across two different formats of problem presentation: graphical and verbal-numerical. We conceived and presented five pairs of homologous simple problems, in verbal-numerical and graphical formats, to 311 undergraduate psychology students (n=156 in Italy and n=155 in Spain) without statistical expertise. The purpose of our work was to evaluate the effect of graphical facilitation in probabilistic statistical reasoning. Every undergraduate solved each pair of problems in the two formats, in different problem presentation orders and sequences. Data analyses highlighted that the effect of graphical facilitation is infrequent in psychology undergraduates. This effect is related to many factors (such as knowledge, abilities, attitudes, and anxiety); moreover, it might be considered the result of an interaction between individual and task characteristics.

  13. Cyrus Levinthal, the Kluge and the origins of interactive molecular graphics.

    Science.gov (United States)

    Francoeur, Eric

    2002-12-01

    In the mid-1960s, a group of scientists at Massachusetts Institute of Technology, led by Cyrus Levinthal, took hold of one of the early interactive graphics terminals and used it to visualize, study and model the structure of proteins and nucleic acids. From this encounter between cutting-edge computer technology and molecular biology emerged the crucial elements for the development of a research-technology field known today as interactive molecular graphics. The following account is not only about how computer graphics technology has literally changed the way scientists view the molecular realm, but also a look at how an epistemic and institutional space was created to integrate this technology into scientific research.

  14. Ice-sheet modelling accelerated by graphics cards

    Science.gov (United States)

    Brædstrup, Christian Fredborg; Damsgaard, Anders; Egholm, David Lundbek

    2014-11-01

    Studies of glaciers and ice sheets have increased the demand for high performance numerical ice flow models over the past decades. When exploring the highly non-linear dynamics of fast flowing glaciers and ice streams, or when coupling multiple flow processes for ice, water, and sediment, researchers are often forced to use super-computing clusters. As an alternative to conventional high-performance computing hardware, the Graphical Processing Unit (GPU) is capable of massively parallel computing while retaining a compact design and low cost. In this study, we present a strategy for accelerating a higher-order ice flow model using a GPU. By applying the newest GPU hardware, we achieve up to 180× speedup compared to a similar but serial CPU implementation. Our results suggest that GPU acceleration is a competitive option for ice-flow modelling when compared to CPU-optimised algorithms parallelised by the OpenMP or Message Passing Interface (MPI) protocols.

  15. BarraCUDA - a fast short read sequence aligner using graphics processing units

    Directory of Open Access Journals (Sweden)

    Klus Petr

    2012-01-01

    Full Text Available Abstract Background With the maturation of next-generation DNA sequencing (NGS) technologies, the throughput of DNA sequencing reads has soared to over 600 gigabases from a single instrument run. General-purpose computing on graphics processing units (GPGPU) extracts the computing power from hundreds of parallel stream processors within graphics processing cores and provides a cost-effective and energy-efficient alternative to traditional high-performance computing (HPC) clusters. In this article, we describe the implementation of BarraCUDA, a GPGPU sequence alignment software based on BWA, to accelerate the alignment of sequencing reads generated by these instruments to a reference DNA sequence. Findings Using the NVIDIA Compute Unified Device Architecture (CUDA) software development environment, we ported the most computationally intensive alignment component of BWA to the GPU to take advantage of the massive parallelism. As a result, BarraCUDA offers an order-of-magnitude performance boost in alignment throughput when compared to a CPU core, while delivering the same level of alignment fidelity. The software is also capable of supporting multiple CUDA devices in parallel to further accelerate the alignment throughput. Conclusions BarraCUDA is designed to take advantage of the parallelism of the GPU to accelerate the alignment of millions of sequencing reads generated by NGS instruments. By doing this, we could, at least in part, streamline the current bioinformatics pipeline such that the wider scientific community could benefit from the sequencing technology. BarraCUDA is currently available from http://seqbarracuda.sf.net

  16. BarraCUDA - a fast short read sequence aligner using graphics processing units

    LENUS (Irish Health Repository)

    Klus, Petr

    2012-01-13

    Abstract Background With the maturation of next-generation DNA sequencing (NGS) technologies, the throughput of DNA sequencing reads has soared to over 600 gigabases from a single instrument run. General-purpose computing on graphics processing units (GPGPU) extracts the computing power from hundreds of parallel stream processors within graphics processing cores and provides a cost-effective and energy-efficient alternative to traditional high-performance computing (HPC) clusters. In this article, we describe the implementation of BarraCUDA, a GPGPU sequence alignment software based on BWA, to accelerate the alignment of sequencing reads generated by these instruments to a reference DNA sequence. Findings Using the NVIDIA Compute Unified Device Architecture (CUDA) software development environment, we ported the most computationally intensive alignment component of BWA to the GPU to take advantage of the massive parallelism. As a result, BarraCUDA offers an order-of-magnitude performance boost in alignment throughput when compared to a CPU core, while delivering the same level of alignment fidelity. The software is also capable of supporting multiple CUDA devices in parallel to further accelerate the alignment throughput. Conclusions BarraCUDA is designed to take advantage of the parallelism of the GPU to accelerate the alignment of millions of sequencing reads generated by NGS instruments. By doing this, we could, at least in part, streamline the current bioinformatics pipeline such that the wider scientific community could benefit from the sequencing technology. BarraCUDA is currently available from http://seqbarracuda.sf.net

  17. Processing Marine Gravity Data Around Korea

    Science.gov (United States)

    Lee, Y.; Choi, K.; Kim, Y.; Ahn, Y.; Chang, M.

    2008-12-01

    Since the first research vessel equipped with a shipborne gravity meter was introduced in the 1990s, four research ships have been operating in Korea: Onnuri (launched 1991) of KORDI (Korea Ocean Research & Development Institute), Haeyang2000 (launched 1996) and Badaro1 (launched 2002) of NORI (National Oceanographic Research Institute), and Tamhae2 (launched 1997) of KIGAM (Korea Institute of Geoscience and Mineral Resources). Of these research vessels, Haeyang2000 observed marine gravity data at over 150,000 points each year from 1996 to 2003. Haeyang2000, at about 2,500 tons, is unable to operate onshore, so NORI constructed the 600-ton research ship Badaro1, which has observed marine gravity data onshore since 2002. Haeyang2000 finished observing marine gravity data offshore within Korean territorial waters in 2003; currently Badaro1 is observing marine gravity data onshore. These shipborne gravity data are very useful and important for geodesy and geophysics research and can contribute to the development of these studies. In this study, NORI's shipborne gravity data from 1996 to 2007 have been processed as fundamental data for computing a precise Korean geoid. The marine gravity processing steps were as follows: 1. Check the time sequence, latitude and longitude positions, etc. of the shipborne gravity data. 2. Arrange the tide level below the pier and correct the meter drift for each cruise. 3. Eliminate turning points. 4. Apply the time lag correction. 5. Compute the RV's velocities and heading angles and the Eötvös correction. 6. Kalman-filter the GPS navigation data using cross-over points. 7. Apply the cross-over correction using least-squares adjustment. About 2,058,000 points of NORI's marine gravity data from 1996 to 2007 have been processed in this study. The distribution of free-air anomalies was -41.0 mGal to 136.0 mGal (mean 8.90 mGal) within Korean territorial waters. The free-air anomalies processed with the marine gravity data are
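
    Two of the listed reductions have compact closed forms: the Eötvös correction of step 5 (for ship speed in knots, result in mGal) and the free-air reduction behind the quoted anomalies. A Python sketch using the standard formulas (the example values are invented):

        import numpy as np

        def eotvos_mgal(speed_knots, heading_deg, lat_deg):
            # Eotvos correction: 7.503*V*cos(lat)*sin(heading) + 0.004154*V^2 (mGal).
            phi, alpha = np.radians(lat_deg), np.radians(heading_deg)
            return 7.503 * speed_knots * np.cos(phi) * np.sin(alpha) + 0.004154 * speed_knots**2

        def free_air_anomaly_mgal(g_obs, gamma, height_m):
            # Free-air anomaly: observed gravity minus normal gravity plus 0.3086*h.
            return g_obs - gamma + 0.3086 * height_m

        # A vessel steaming due east at 10 kn at 37 deg N gains about 60 mGal:
        print(eotvos_mgal(10.0, 90.0, 37.0))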

  18. Scientific Graphical Displays on the Macintosh

    Energy Technology Data Exchange (ETDEWEB)

    Grotch, S. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    In many organizations scientists have ready access to more than one computer, often both a workstation (e.g., SUN, HP, SGI) and a Macintosh or other PC. The scientist commonly uses the workstation for 'number-crunching' and data analysis, whereas the Macintosh is relegated to word processing or serves as a 'dumb terminal' to a larger mainframe computer. In an informal poll of my colleagues, very few of them used their Macintoshes for either statistical analysis or graphical data display. I believe that this state of affairs is particularly unfortunate because over the last few years both the computational capability, and even more so the software availability, of the Macintosh have become quite formidable. In some instances, very powerful tools are now available on the Macintosh that may not exist (or may be far too costly) on the so-called 'high-end' workstations. Many scientists are simply unaware of the wealth of extremely useful, off-the-shelf software that already exists on the Macintosh for scientific graphical and statistical analysis.

  19. Effects of computer-based graphic organizers to solve one-step word problems for middle school students with mild intellectual disability: A preliminary study.

    Science.gov (United States)

    Sheriff, Kelli A; Boon, Richard T

    2014-08-01

    The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, on solving one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple-probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers for solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Monte Carlo method for neutron transport calculations in graphics processing units (GPUs)

    International Nuclear Information System (INIS)

    Pellegrino, Esteban

    2011-01-01

    Monte Carlo simulation is well suited for solving the Boltzmann neutron transport equation in inhomogeneous media for complicated geometries. However, routine applications require the computation time to be reduced to hours and even minutes on a desktop PC. The interest in adopting Graphics Processing Units (GPUs) for Monte Carlo acceleration is rapidly growing. This is due to the massive parallelism provided by the latest GPU technologies, which is the most promising solution to the challenge of performing full-size reactor core analysis on a routine basis. In this study, Monte Carlo codes for a fixed-source neutron transport problem were developed for GPU environments in order to evaluate issues associated with computational speedup using GPUs. Results obtained in this work suggest that a speedup of several orders of magnitude is possible using the state-of-the-art GPU technologies. (author) [es]
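
    What makes fixed-source Monte Carlo transport a natural GPU target is that particle histories are independent. A one-group, one-dimensional slab sketch of such a kernel in Python, vectorized over histories the way a GPU would thread them (the cross sections and slab width are illustrative, not from the paper):

        import numpy as np

        rng = np.random.default_rng(0)
        sigma_t, sigma_a, width = 1.0, 0.3, 5.0  # total/absorption XS (1/cm), slab (cm)
        n = 100_000
        x = np.zeros(n)                  # boundary source at x = 0
        mu = np.ones(n)                  # initially directed into the slab
        alive = np.ones(n, dtype=bool)
        absorbed = leaked = 0

        while alive.any():
            idx = np.flatnonzero(alive)
            x[idx] += mu[idx] * rng.exponential(1.0 / sigma_t, idx.size)  # free flight
            out = alive & ((x < 0.0) | (x > width))
            leaked += np.count_nonzero(out)
            alive &= ~out
            idx = np.flatnonzero(alive)
            hit = rng.random(idx.size) < sigma_a / sigma_t  # absorbed at collision?
            absorbed += np.count_nonzero(hit)
            alive[idx[hit]] = False
            mu[idx[~hit]] = rng.uniform(-1.0, 1.0, np.count_nonzero(~hit))  # isotropic scatter

        print(f"absorbed: {absorbed / n:.3f}, leaked: {leaked / n:.3f}")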

  1. Area-delay trade-offs of texture decompressors for a graphics processing unit

    Science.gov (United States)

    Novoa Súñer, Emilio; Ituero, Pablo; López-Vallejo, Marisa

    2011-05-01

    Graphics Processing Units have become a booster for the microelectronics industry. However, due to intellectual property issues, there is a serious lack of information on implementation details of the hardware architecture that is behind GPUs. For instance, the way texture is handled and decompressed in a GPU to reduce bandwidth usage has never been dealt with in depth from a hardware point of view. This work addresses a comparative study on the hardware implementation of different texture decompression algorithms for both conventional (PCs and video game consoles) and mobile platforms. Circuit synthesis is performed targeting both a reconfigurable hardware platform and a 90nm standard cell library. Area-delay trade-offs have been extensively analyzed, which allows us to compare the complexity of decompressors and thus determine suitability of algorithms for systems with limited hardware resources.

  2. Evaluating Texts for Graphical Literacy Instruction: The Graphic Rating Tool

    Science.gov (United States)

    Roberts, Kathryn L.; Brugar, Kristy A.; Norman, Rebecca R.

    2015-01-01

    In this article, we present the Graphical Rating Tool (GRT), which is designed to evaluate the graphical devices that are commonly found in content-area, non-fiction texts, in order to identify books that are well suited for teaching about those devices. We also present a "best of" list of science and social studies books, which includes…

  3. Systems Biology Graphical Notation: Activity Flow language Level 1 Version 1.2

    Directory of Open Access Journals (Sweden)

    Mi Huaiyu

    2015-06-01

    Full Text Available The Systems Biology Graphical Notation (SBGN) is an international community effort for standardized graphical representations of biological pathways and networks. The goal of SBGN is to provide unambiguous pathway and network maps for readers with different scientific backgrounds as well as to support efficient and accurate exchange of biological knowledge between different research communities, industry, and other players in systems biology. Three SBGN languages, Process Description (PD), Entity Relationship (ER) and Activity Flow (AF), allow for the representation of different aspects of biological and biochemical systems at different levels of detail.

  4. A Prototype Lisp-Based Soft Real-Time Object-Oriented Graphical User Interface for Control System Development

    Science.gov (United States)

    Litt, Jonathan; Wong, Edmond; Simon, Donald L.

    1994-01-01

    A prototype Lisp-based soft real-time object-oriented Graphical User Interface for control system development is presented. The Graphical User Interface executes alongside a test system in laboratory conditions to permit observation of the closed loop operation through animation, graphics, and text. Since it must perform interactive graphics while updating the screen in real time, techniques are discussed which allow quick, efficient data processing and animation. Examples from an implementation are included to demonstrate some typical functionalities which allow the user to follow the control system's operation.

  5. Arts-based Research Processes in ECEC: Examples from Preparing and Conducting a Data Collection

    Directory of Open Access Journals (Sweden)

    Torill Vist

    2016-08-01

    Full Text Available In this methodological article, different concepts and possibilities related to how arts-based research processes can contribute to the early phases of ECEC research will be presented and discussed. Despite a setback of arts subjects in Norwegian ECEC and early childhood teacher education, the field of arts still plays an important role, and is expected to be research-based. Thus, there should be a need for an aesthetical and arts-based dimension in researching ECEC, not only in the subject matter, but also in the method, context, outcome and dissemination. The article focuses on methodological issues in the question development/design phase and the data collection phase, exemplified by the author's own experiences in arts-based research processes. These processes include participation in dance and music performance as thinking or reflection tools in research, and an arts-based interview method. Some narrative writing processes will also be commented upon. Theoretically, the article primarily leans upon Barone and Eisner's arts-based research and Irwin and Springgay's a/r/tography.

  6. Visualization of graphical information fusion results

    Science.gov (United States)

    Blasch, Erik; Levchuk, Georgiy; Staskevich, Gennady; Burke, Dustin; Aved, Alex

    2014-06-01

    Graphical fusion methods are popular to describe distributed sensor applications such as target tracking and pattern recognition. Additional graphical methods include network analysis for social, communications, and sensor management. With the growing availability of various data modalities, graphical fusion methods are widely used to combine data from multiple sensors and modalities. To better understand the usefulness of graph fusion approaches, we address visualization to increase user comprehension of multi-modal data. The paper demonstrates a use case that combines graphs from text reports and target tracks to associate events and activities of interest, with visualization for testing Measures of Performance (MOP) and Measures of Effectiveness (MOE). The analysis includes the presentation of the separate graphs and then graph-fusion visualization for linking network graphs for tracking and classification.

  7. R graphics

    CERN Document Server

    Murrell, Paul

    2005-01-01

    R is revolutionizing the world of statistical computing. Powerful, flexible, and best of all free, R is now the program of choice for tens of thousands of statisticians. Destined to become an instant classic, R Graphics presents the first complete, authoritative exposition on the R graphical system. Paul Murrell, widely known as the leading expert on R graphics, has developed an in-depth resource that takes nothing for granted and helps both neophyte and seasoned users master the intricacies of R graphics. After an introductory overview of R graphics facilities, the presentation first focuses

  8. Free, cross-platform gRaphical software

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2006-01-01

    …-recursive graphical models, and models defined using the BUGS language. Today, there exists a wide range of packages to support the analysis of data using graphical models. Here, we focus on Open Source software, making it possible to extend the functionality by integrating these packages into more general tools. We will attempt to give an overview of the available Open Source software, with focus on the gR project. This project was launched in 2002 to make facilities in R for graphical modelling. Several R packages have been developed within the gR project, both for display and analysis of graphical models…

  9. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  10. GAIML: A New Language for Verbal and Graphical Interaction in Chatbots

    Directory of Open Access Journals (Sweden)

    Roberto Pirrone

    2008-01-01

    Full Text Available Natural and intuitive interaction between users and complex systems is a crucial research topic in human-computer interaction. A major direction is the definition and implementation of systems with natural language understanding capabilities. The interaction in natural language is often performed by means of systems called chatbots. A chatbot is a conversational agent with a proper knowledge base able to interact with users. Chatbots' appearance can be very sophisticated, with 3D avatars and speech processing modules. However, the interaction between the system and the user is only performed through textual areas for inputs and replies. An interaction able to add graphical widgets to natural language could be more effective. On the other hand, a graphical interaction also involving natural language can increase the comfort of the user compared with using only graphical widgets. In many applications multi-modal communication must be preferred when the user and the system have a tight and complex interaction. Typical examples are cultural heritage applications (intelligent museum guides, picture browsing) or systems providing the user with integrated information taken from different and heterogeneous sources, as in the case of the iGoogle™ interface. We propose to mix the two modalities (verbal and graphical) to build systems with a reconfigurable interface, which is able to change with respect to the particular application context. The result of this proposal is the Graphical Artificial Intelligence Markup Language (GAIML), an extension of AIML allowing merging of both interaction modalities. In this context a suitable chatbot system called Graphbot is presented to support this language. With this language it is possible to define personalized interface patterns that are the most suitable ones in relation to the data types exchanged between the user and the system according to the context of the dialogue.

  11. Interactive Learning for Graphic Design Foundations

    Science.gov (United States)

    Chu, Sauman; Ramirez, German Mauricio Mejia

    2012-01-01

    One of the biggest problems for students majoring in pre-graphic design is students' inability to apply their knowledge to different design solutions. The purpose of this study is to examine the effectiveness of interactive learning modules in facilitating knowledge acquisition during the learning process and to create interactive learning modules…

  12. Graphical user interface for wireless sensor networks simulator

    Science.gov (United States)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They can be suited to many applications, from military through environment monitoring, healthcare, and home automation. Those networks, when working in a dynamic, ad-hoc model, need effective protocols which must differ from common computer network algorithms. Research on those protocols would be difficult without simulation tools, because real applications often use many nodes, and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator which is dedicated to WSN studies, especially in routing and data link protocol evaluation.

  13. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Full Text Available Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at a source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with the multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform, in terms of graphical user interface and mathematical function libraries. Both development environments are multiplatform oriented, and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis, developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user friendly, interactive, graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language, which supports designing and developing performant and interactive frameworks for general purpose software solutions, through Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  14. Can we be more Graphic about Graphic Design?

    OpenAIRE

    Vienne, Véronique

    2012-01-01

    Can you objectify a subjective notion? This is the question graphic designers must face when they talk about their work. Even though graphic design artifacts are omnipresent in our culture, graphic design is still an exceptionally ill-defined profession. This is one of the reasons design criticism is still a rudimentary discipline. No one knows for sure what is this thing we sometimes call “graphic communication” for lack of a better word–a technique my Webster’s dictionary describes as “the ...

  15. NMRFx Processor: a cross-platform NMR data processing program

    International Nuclear Information System (INIS)

    Norris, Michael; Fetler, Bayard; Marchant, Jan; Johnson, Bruce A.

    2016-01-01

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.

  16. NMRFx Processor: a cross-platform NMR data processing program

    Energy Technology Data Exchange (ETDEWEB)

    Norris, Michael; Fetler, Bayard [One Moon Scientific, Inc. (United States); Marchant, Jan [University of Maryland Baltimore County, Howard Hughes Medical Institute (United States); Johnson, Bruce A., E-mail: bruce.johnson@asrc.cuny.edu [One Moon Scientific, Inc. (United States)

    2016-08-15

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.

  17. A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets

    Science.gov (United States)

    Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.

    2013-01-01

    This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…

  18. Interactive graphics on large datasets drives remote condition monitoring on a cloud

    International Nuclear Information System (INIS)

    Hickinbotham, Simon; Austin, James; McAvoy, John

    2012-01-01

    We demonstrate a new system for condition monitoring using the cloud. The system combines state of the art pattern search capability with youShare, a platform that allows people to run compute-intensive research in an ordered manner over the internet. Data from sensors distributed across one or more assets at one or more sites are uploaded to the cloud compute resource. The uploading triggers the deployment of a range of pattern search services, and is capable of rapidly detecting novel patterns in the data. The outputs of these processes are archived as a matter of course, but are also sent to a further service which processes the data for remote visualisation on a web browser. The system is built in Java, using GWT and RaphaelGWT for graphics rendering. The design of these systems must satisfy conflicting requirements of data currency and data throughput. We present an evaluation of our system that involves processing data at a range of frequencies and bandwidths that are commensurate with commercial requirements. We show that our system has the potential to satisfy a range of processing requirements with minimal latency, and that the user experience is easily sufficient for rapid interpretation of complex condition monitoring data.

  19. Model aerodynamic test results for two variable cycle engine coannular exhaust systems at simulated takeoff and cruise conditions. Comprehensive data report. Volume 3: Graphical data book 1

    Science.gov (United States)

    Nelson, D. P.

    1981-01-01

    A graphical presentation of the aerodynamic data acquired during coannular nozzle performance wind tunnel tests is given. The graphical data consist of plots of nozzle gross thrust coefficient, fan nozzle discharge coefficient, and primary nozzle discharge coefficient. Normalized model component static pressure distributions are presented as a function of primary total pressure, fan total pressure, and ambient static pressure for selected operating conditions. In addition, the supersonic cruise configuration data include plots of nozzle efficiency and secondary-to-fan total pressure pumping characteristics. Supersonic and subsonic cruise data are given.

  20. Graphic Novels in the Secondary Classroom and School Libraries

    Science.gov (United States)

    Griffith, Paula E.

    2010-01-01

    The author examines the rise in popularity of graphic novels, the sales of which have steadily increased as their influence expands into adolescent culture. This article also includes an overview of current research results supporting the use of graphic novels within the classroom and school library; graphic novels support English-language…

  1. VTGRAPH:Computer graphics program using ReGIS mode

    International Nuclear Information System (INIS)

    Benamar, M.A.; Benouali, N.; Tchantchane, A.; Azbouche, A.; Tobbeche, S. (Centre de Developpement des Techniques Nucleaires, Laboratoire des Techniques Nucleaires, Algiers)

    1993-02-01

    A computer graphics program has been developed for plotting spectra generated from different nuclear analysis techniques, as well as discrete data, on the VT240 graphics terminal and compatible IBM PCs using the ST240 configuration. We have used the Remote Graphics Instruction Set (ReGIS) commands.

  2. Acceleration of Linear Finite-Difference Poisson-Boltzmann Methods on Graphics Processing Units.

    Science.gov (United States)

    Qi, Ruxi; Botello-Smith, Wesley M; Luo, Ray

    2017-07-11

    Electrostatic interactions play crucial roles in biophysical processes such as protein folding and molecular recognition. Poisson-Boltzmann equation (PBE)-based models have emerged as widely used tools for modeling these important processes. Though great efforts have been put into developing efficient PBE numerical models, challenges still remain due to the high dimensionality of typical biomolecular systems. In this study, we implemented and analyzed commonly used linear PBE solvers on the ever-improving graphics processing units (GPUs) for biomolecular simulations, including both standard and preconditioned conjugate gradient (CG) solvers with several alternative preconditioners. Our implementation utilizes the standard Nvidia CUDA libraries cuSPARSE, cuBLAS, and CUSP. Extensive tests show that good numerical accuracy can be achieved given that single precision is often used for numerical applications on GPU platforms. The optimal GPU performance was observed with the Jacobi-preconditioned CG solver, with a significant speedup over the standard CG solver on CPU in our diversified test cases. Our analysis further shows that different matrix storage formats also considerably affect the efficiency of different linear PBE solvers on GPU, with the diagonal format best suited for our standard finite-difference linear systems. Further efficiency may be possible with matrix-free operations and integrated grid stencil setup specifically tailored for the banded matrices in PBE-specific linear systems.
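
    For illustration, here is a minimal NumPy sketch of the Jacobi-preconditioned conjugate gradient method the abstract singles out. A small dense test matrix stands in for the sparse finite-difference PBE system; the paper's actual cuSPARSE/cuBLAS implementation is not reproduced:

        import numpy as np

        def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
            # Jacobi (diagonal) preconditioned conjugate gradient for SPD A.
            # Every step is a matrix-vector product or a vector update, which is
            # why the method maps directly onto GPU libraries such as cuBLAS and
            # cuSPARSE. A finite-difference PBE system would be sparse (e.g.
            # stored in the diagonal format the abstract mentions).
            M_inv = 1.0 / np.diag(A)          # the Jacobi preconditioner
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_inv * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol * np.linalg.norm(b):
                    break
                z = M_inv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        # Small SPD test system standing in for a discretized PBE operator.
        n = 200
        A = 4.0 * np.eye(n) + np.diag(-np.ones(n - 1), 1) + np.diag(-np.ones(n - 1), -1)
        b = np.ones(n)
        x = jacobi_pcg(A, b)
        print("residual norm:", np.linalg.norm(b - A @ x))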

  3. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web.

    Science.gov (United States)

    Miller, Chase A; Anthony, Jon; Meyer, Michelle M; Marth, Gabor

    2013-02-01

    High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported.

  4. Interactive Graphic Journalism

    Directory of Open Access Journals (Sweden)

    Laura Schlichting

    2016-12-01

    Full Text Available This paper examines graphic journalism (GJ) in a transmedial context, and argues that transmedial graphic journalism (TMGJ) is an important and fruitful new form of visual storytelling that will re-invigorate the field of journalism as it steadily tests out and plays with new media, ultimately leading to new challenges in both the production and reception process. With TMGJ, linear narratives may be broken up, and ethical issues concerning the emotional and entertainment value are raised when it comes to 'playing the news'. The aesthetic characteristics of TMGJ will be described and interactivity's influence on non-fiction storytelling will be explored in an analysis of The Nisoor Square Shooting (2011) and Ferguson Firsthand (2015).

  5. Visual gut punch: persuasion, emotion, and the constitutional meaning of graphic disclosure.

    Science.gov (United States)

    Goodman, Ellen P

    2014-01-01

    The ability of government to "nudge" with information mandates, or merely to inform consumers of risks, is circumscribed by First Amendment interests that have been poorly articulated. New graphic cigarette warning labels supplied courts with the first opportunity to assess the informational interests attending novel forms of product disclosures. The D.C. Circuit enjoined them as unconstitutional, compelled by a narrative that the graphic labels converted government from objective informer to ideological persuader, shouting its warning to manipulate consumer decisions. This interpretation will leave little room for graphic disclosure and is already being used to challenge textual disclosure requirements (such as county-of-origin labeling) as unconstitutional. Graphic warning and the increasing reliance on regulation-by-disclosure present new free speech quandaries related to consumer autonomy, state normativity, and speaker liberty. This Article examines the distinct goals of product disclosure requirements and how those goals may serve to vindicate, or to frustrate, listener interests. I argue that many disclosures, and especially warnings, are necessarily both normative and informative, expressing value along with fact. It is not the existence of a norm that raises constitutional concern but rather the insistence on a controversial norm. Turning to the means of disclosure, this Article examines how emotional and graphic communication might change the constitutional calculus. Using autonomy theory and the communications research on speech processing, I conclude that disclosures do not bypass reason simply by reaching for the heart. If large graphic labels are unconstitutional, it will be because of undue burden on the speaker, not because they are emotionally powerful. This Article makes the following distinct contributions to the compelled commercial speech literature: critiques the leading precedent, Zauderer v. Office of Disciplinary Counsel, from a consumer

  6. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    Science.gov (United States)

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issue of microarray data analysis due to the well known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting edge Bioconductor packages for researchers with no knowledge in R language. It also enables a more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  7. A Live-Time Relation: Motion Graphics meets Classical Music

    DEFF Research Database (Denmark)

    Steijn, Arthur

    2014-01-01

    In our digital age, we frequently meet fine examples of live performances of classical music with accompanying visuals. Yet, we find very little theoretical or analytical work on the relation between classical music and digital temporal visuals, nor on the process of creating them. In this paper, I present segments of my work toward a working model for the process of design of visuals and motion graphics applied in spatial contexts. I show how various design elements and components (line and shape, tone and colour, time and timing, rhythm and movement) interact with conceptualizations of space, liveness and atmosphere. The design model will be a framework for both academic analytical studies as well as for designing time-based narratives and visual concepts involving motion graphics in spatial contexts. I focus on cases in which both pre-rendered and live generated motion graphics are designed…

  8. A Graphical, Self-Organizing Approach to Classifying Electronic Meeting Output.

    Science.gov (United States)

    Orwig, Richard E.; Chen, Hsinchun; Nunamaker, Jay F., Jr.

    1997-01-01

    Describes research using an artificial intelligence approach in the application of a Kohonen Self-Organizing Map (SOM) to the problem of classification of electronic brainstorming output and an evaluation of the results. The graphical representation of textual data produced by the Kohonen SOM suggests many opportunities for improving information…
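
    To make the technique concrete, here is a minimal, self-contained Kohonen SOM training loop in Python. Random vectors stand in for the encoded brainstorming comments; the grid size and learning rates are illustrative, not the study's settings:

        import numpy as np

        # Minimal Kohonen Self-Organizing Map sketch: similar inputs end up
        # mapped to nearby grid cells, which is the basis for the graphical
        # classification of text described in the record.
        rng = np.random.default_rng(1)

        GRID = 8                    # 8x8 map (assumed size)
        DIM = 20                    # input dimensionality, e.g. term-vector length
        weights = rng.random((GRID, GRID, DIM))
        rows, cols = np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij")

        data = rng.random((500, DIM))          # stand-in for encoded comments
        for t, x in enumerate(data):
            lr = 0.5 * (1 - t / len(data))     # decaying learning rate
            radius = max(1.0, GRID / 2 * (1 - t / len(data)))
            # Best-matching unit: grid cell whose weight vector is closest to x.
            d = np.linalg.norm(weights - x, axis=2)
            bi, bj = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood pulls nearby cells toward the input.
            g = np.exp(-((rows - bi) ** 2 + (cols - bj) ** 2) / (2 * radius ** 2))
            weights += lr * g[..., None] * (x - weights)

        # After training, each input is labeled by its best-matching cell,
        # grouping similar items into regions of the map.
        for x in data[:5]:
            d = np.linalg.norm(weights - x, axis=2)
            print(np.unravel_index(np.argmin(d), d.shape))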

  9. A parallel approximate string matching under Levenshtein distance on graphics processing units using warp-shuffle operations.

    Directory of Open Access Journals (Sweden)

    ThienLuan Ho

    Full Text Available Approximate string matching with k-differences has a number of practical applications, ranging from pattern recognition to computational biology. This paper proposes an efficient memory-access algorithm for parallel approximate string matching with k-differences on Graphics Processing Units (GPUs). In the proposed algorithm, all threads in the same GPU warp share data using warp-shuffle operations instead of accessing the shared memory. Moreover, we implement the proposed algorithm by exploiting the memory structure of GPUs to optimize its performance. Experimental results for real DNA packages revealed that the proposed algorithm and its implementation achieved speedups of up to 122.64 and 1.53 times compared to the sequential algorithm on CPU and a previous parallel approximate string matching algorithm on GPUs, respectively.
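
    A minimal CPU reference for the underlying dynamic program (the classic k-differences formulation, not the paper's warp-shuffle GPU kernel) may help make the problem concrete; the GPU version parallelizes the anti-diagonals of this same table:

        # Report end positions in `text` where `pattern` matches with edit
        # distance <= k. Column j scores the pattern against substrings ending
        # at position j, so the top row is implicitly zero: a match may start
        # anywhere in the text.
        def k_difference_matches(pattern, text, k):
            m = len(pattern)
            prev = list(range(m + 1))   # column for the empty text prefix
            hits = []
            for j, c in enumerate(text, start=1):
                curr = [0] * (m + 1)
                for i in range(1, m + 1):
                    cost = 0 if pattern[i - 1] == c else 1
                    curr[i] = min(prev[i] + 1,         # deletion
                                  curr[i - 1] + 1,     # insertion
                                  prev[i - 1] + cost)  # match / substitution
                if curr[m] <= k:
                    hits.append(j)                     # a match ends at text[:j]
                prev = curr
            return hits

        print(k_difference_matches("ACGT", "AAACGGTTACGTA", 1))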

  10. Graphics processing units accelerated semiclassical initial value representation molecular dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Tamascelli, Dario; Dambrosio, Francesco Saverio [Dipartimento di Fisica, Università degli Studi di Milano, via Celoria 16, 20133 Milano (Italy); Conte, Riccardo [Department of Chemistry and Cherry L. Emerson Center for Scientific Computation, Emory University, Atlanta, Georgia 30322 (United States); Ceotto, Michele, E-mail: michele.ceotto@unimi.it [Dipartimento di Chimica, Università degli Studi di Milano, via Golgi 19, 20133 Milano (Italy)

    2014-05-07

    This paper presents a Graphics Processing Units (GPUs) implementation of the Semiclassical Initial Value Representation (SC-IVR) propagator for vibrational molecular spectroscopy calculations. The time-averaging formulation of the SC-IVR for power spectrum calculations is employed. Details about the GPU implementation of the semiclassical code are provided. Four molecules with an increasing number of atoms are considered and the GPU-calculated vibrational frequencies perfectly match the benchmark values. The computational time scaling of two GPUs (NVIDIA Tesla C2075 and Kepler K20), respectively, versus two CPUs (Intel Core i5 and Intel Xeon E5-2687W) and the critical issues related to the GPU implementation are discussed. The resulting reduction in computational time and power consumption is significant and semiclassical GPU calculations are shown to be environment friendly.

  11. pedigreejs: a web-based graphical pedigree editor.

    Science.gov (United States)

    Carver, Tim; Cunningham, Alex P; Babb de Villiers, Chantal; Lee, Andrew; Hartley, Simon; Tischkowitz, Marc; Walter, Fiona M; Easton, Douglas F; Antoniou, Antonis C

    2018-03-15

    The collection, management and visualization of clinical pedigree (family history) data is a core activity in clinical genetics centres. However, clinical pedigree datasets can be difficult to manage, as they are time consuming to capture, and can be difficult to build, manipulate and visualize graphically. Several standalone graphical pedigree editors and drawing applications exist but there are no freely available lightweight graphical pedigree editors that can be easily configured and incorporated into web applications. We developed 'pedigreejs', an interactive graphical pedigree editor written in JavaScript, which uses standard pedigree nomenclature. Pedigreejs provides an easily configurable, extensible and lightweight pedigree editor. It makes use of an open-source Javascript library to define a hierarchical layout and to produce images in scalable vector graphics (SVG) format that can be viewed and edited in web browsers. The software is freely available under GPL licence (https://ccge-boadicea.github.io/pedigreejs/). tjc29@cam.ac.uk. Supplementary data are available at Bioinformatics online.

  12. Implementation of RLS-based Adaptive Filterson nVIDIA GeForce Graphics Processing Unit

    OpenAIRE

    Hirano, Akihiro; Nakayama, Kenji

    2011-01-01

    This paper presents efficient implementation of RLS-based adaptive filters with a large number of taps on the nVIDIA GeForce graphics processing unit (GPU) and the CUDA software development environment. Modification of the order and the combination of calculations reduces the number of accesses to slow off-chip memory. Assigning tasks into multiple threads also takes memory access order into account. For a 4096-tap case, a GPU program is almost three times faster than a CPU program.
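
    The textbook RLS update being accelerated looks roughly as follows. This NumPy sketch identifies an unknown filter and is only a CPU reference, not the paper's GPU code; for many taps, the rank-1 update of the inverse correlation matrix P is the NxN workload that GPU threads can share:

        import numpy as np

        def rls_identify(x, d, n_taps, lam=0.999, delta=100.0):
            # Recursive least squares system identification from input x and
            # desired output d. P estimates the inverse input correlation matrix.
            w = np.zeros(n_taps)
            P = delta * np.eye(n_taps)
            for n in range(n_taps - 1, len(x)):
                u = x[n - n_taps + 1:n + 1][::-1]   # most recent samples, newest first
                Pu = P @ u
                g = Pu / (lam + u @ Pu)             # gain vector
                e = d[n] - w @ u                    # a-priori error
                w += g * e
                P = (P - np.outer(g, Pu)) / lam     # rank-1 inverse update
            return w

        rng = np.random.default_rng(2)
        true_w = rng.standard_normal(8)
        x = rng.standard_normal(5000)
        d = np.convolve(x, true_w)[:len(x)]         # output of the unknown system
        print(np.round(rls_identify(x, d, n_taps=8) - true_w, 4))  # near zero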

  13. High performance direct gravitational N-body simulations on graphics processing units II: An implementation in CUDA

    NARCIS (Netherlands)

    Belleman, R.G.; Bédorf, J.; Portegies Zwart, S.F.

    2008-01-01

    We present the results of gravitational direct N-body simulations using the graphics processing unit (GPU) on a commercial NVIDIA GeForce 8800GTX designed for gaming computers. The force evaluation of the N-body problem is implemented in "Compute Unified Device Architecture" (CUDA) using the GPU to
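
    The core of such a code is the all-pairs force evaluation. A minimal NumPy sketch of that O(N^2) kernel (with softening, G = 1, and arbitrary units; not the paper's CUDA implementation, which tiles the same computation across GPU threads):

        import numpy as np

        def accelerations(pos, mass, eps=1e-3):
            # Pairwise separation vectors r_ij = pos_j - pos_i, shape (N, N, 3).
            dr = pos[None, :, :] - pos[:, None, :]
            r2 = (dr ** 2).sum(axis=2) + eps ** 2   # softened squared distances
            inv_r3 = r2 ** -1.5
            np.fill_diagonal(inv_r3, 0.0)           # no self-interaction
            # a_i = sum_j m_j * r_ij / |r_ij|^3   (G = 1)
            return (dr * (mass[None, :] * inv_r3)[:, :, None]).sum(axis=1)

        rng = np.random.default_rng(3)
        N = 1024
        pos = rng.standard_normal((N, 3))
        mass = np.full(N, 1.0 / N)
        print(accelerations(pos, mass).shape)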

  14. Neurosurgical simulation by interactive computer graphics on iPad.

    Science.gov (United States)

    Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki

    2014-11-01

    Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations from 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation to be loaded using the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of original size) after image processing by ParaView. This was increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Controlling the magnification, translation, rotation, and selection of translucency (in 10 levels) of each element was performed smoothly and easily using one or two fingers. The requisite skill to smoothly control the iPad software was acquired within 1.8 trials on average in 12 medical students and 6 neurosurgical residents. Using an iPad to handle the result of preneurosurgical simulation was extremely useful because it could easily be handled anywhere.

  15. Harmonious graphics generating based on the 1/f function theory

    International Nuclear Information System (INIS)

    Mao Xia; Xue Yuli; Cheng, L.-L.; Sun Yun

    2007-01-01

    In the 1970s, Richard Voss and John Clarke researched the physical properties of actual audio signals in music. There are three types of noise: white noise, 1/f noise, and Brownian motion noise (1/f²). 1/f noise is found to be most pleasing to human ears: white noise is too random and Brownian noise is too correlated. Similarly, for 2-dimensional sources such as graphics and images, the three analogous characteristics of monotony, harmony and disorder are observed. The 1/f fluctuation theory provides a good way to generate affective signals, both 1-dimensional and 2-dimensional. This paper provides an algorithm which can generate affective patterns or graphics that obey the criteria of affective information processing.
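
    One standard construction of 1/f fluctuations, sketched below, is spectral shaping of white noise (this is a common technique, not necessarily the paper's own algorithm); the resulting sequence can drive, for example, brightness or stroke parameters of a generated graphic:

        import numpy as np

        # Generate 1/f ("pink") noise by filtering white Gaussian noise so that
        # its power spectrum falls off as 1/f, the fluctuation the abstract
        # associates with pleasing signals.
        rng = np.random.default_rng(4)

        def one_over_f_noise(n):
            white = rng.standard_normal(n)
            spectrum = np.fft.rfft(white)
            f = np.fft.rfftfreq(n)
            f[0] = f[1]                 # avoid dividing by zero at DC
            spectrum /= np.sqrt(f)      # power ~ 1/f  =>  amplitude ~ f^(-1/2)
            pink = np.fft.irfft(spectrum, n)
            return pink / pink.std()

        print(one_over_f_noise(4096)[:5])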

  16. Data acquisition and analysis system for the Holifield Heavy Ion Research Facility

    International Nuclear Information System (INIS)

    Milner, W.T.; Biggerstaff, J.A.; Hensley, D.C.; Sayer, R.O.

    1979-01-01

    The Holifield Heavy Ion Research Facility is a national resource which will serve a large number of nuclear and atomic physicists who expect to perform experiments which vary widely in type and complexity. Although much consideration must be given to the problem of rapid acquisition and processing of many-parameter data, an equal emphasis will be placed on operational simplicity and the standardization of hardware and software. Two active experimental counting areas and two or more setup areas are served by three remotely located Perkin--Elmer 8/32 computers which are interfaced to the user equipment by means of three CAMAC branch highways. Other equipment includes a large disk system, alphanumeric/graphic terminals and printer--plotters located in each of the counting areas. The system operation as well as techniques for the rapid sorting of data into large (approx. 10 million channels) histograms on disk are discussed

  17. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    International Nuclear Information System (INIS)

    He, Qingyun; Chen, Hongli; Feng, Jingchao

    2015-01-01

    Highlights: • A 3D PISO-MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, validated by three basic benchmarks in rectangular and round ducts. • Parallel CPU and GPU acceleration were compared against a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared, and the results showed that the AMG method is better for these calculations. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM, based on a consistent and conservative scheme which is suitable for simulating MHD flow under strong magnetic field in fusion liquid metal blankets with structured or unstructured meshes. We verified the validity of the implementation on several standard cases, including benchmark I (Shercliff and Hunt's cases), benchmark II (fully developed circular pipe MHD flow) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double precision run times with those of essentially the same algorithms and meshes on CPU. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of at least 2.

  18. Acceleration of the OpenFOAM-based MHD solver using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    He, Qingyun; Chen, Hongli, E-mail: hlchen1@ustc.edu.cn; Feng, Jingchao

    2015-12-15

    Highlights: • A 3D PISO-MHD solver was implemented on Kepler-class graphics processing units (GPUs) using CUDA technology. • A consistent and conservative scheme is used in the code, validated by three basic benchmarks in rectangular and round ducts. • Parallel CPU and GPU acceleration were compared against a single-core CPU for MHD and non-MHD problems. • Different preconditioners for the MHD solver were compared, and the results showed that the AMG method is better for these calculations. - Abstract: The pressure-implicit with splitting of operators (PISO) magnetohydrodynamics (MHD) solver for the coupled Navier–Stokes and Maxwell equations was implemented on Kepler-class graphics processing units (GPUs) using the CUDA technology. The solver is developed on the open source code OpenFOAM, based on a consistent and conservative scheme which is suitable for simulating MHD flow under strong magnetic field in fusion liquid metal blankets with structured or unstructured meshes. We verified the validity of the implementation on several standard cases, including benchmark I (Shercliff and Hunt's cases), benchmark II (fully developed circular pipe MHD flow) and benchmark III (the KIT experimental case). Computational performance of the GPU implementation was examined by comparing its double precision run times with those of essentially the same algorithms and meshes on CPU. The results showed that a GPU (GTX 770) can outperform a server-class 4-core, 8-thread CPU (Intel Core i7-4770k) by a factor of at least 2.

  19. Animations to illustrate the Autocalibration process of accelerometer data

    OpenAIRE

    van Hees, Vincent

    2014-01-01

    .avi files:
    animation_a_6_3D_prepostautocal.avi – 3D animation showing static points before and after the autocalibration process.
    animation_c_4_2D_duringautocal.avi – 2D animation showing how static points moved during autocalibration.
    animation_d_3_3D_duringautocal.avi – 3D animation showing how static points moved during autocalibration.
    Supplementary graphics for the paper "Autocalibration of accelerometer data collected in daily life based on local gravity and temperature: an evalua...

  20. Specifying the Graphic Characteristics of Words That Influence Children's Handwriting

    Science.gov (United States)

    Gosse, Claire; Carbonnelle, Simon; de Vleeschouwer, Christophe; Van Reybroeck, Marie

    2018-01-01

    Research about the development of the graphomotor side of writing is very scarce. The goal of this study was to gain a better understanding of what constitutes graphic complexity of written material by determining the impact of graphic characteristics on handwriting production. To this end, the pen stroke of cursive handwriting was precisely…

  1. High-performance dynamic quantum clustering on graphics processors

    Energy Technology Data Exchange (ETDEWEB)

    Wittek, Peter, E-mail: peterwittek@acm.org [Swedish School of Library and Information Science, University of Boras, Boras (Sweden)

    2013-01-15

    Clustering methods in machine learning may benefit from borrowing metaphors from physics. Dynamic quantum clustering associates a Gaussian wave packet with the multidimensional data points and regards them as eigenfunctions of the Schroedinger equation. The clustering structure emerges by letting the system evolve and the visual nature of the algorithm has been shown to be useful in a range of applications. Furthermore, the method only uses matrix operations, which readily lend themselves to parallelization. In this paper, we develop an implementation on graphics hardware and investigate how this approach can accelerate the computations. We achieve a speedup of up to two magnitudes over a multicore CPU implementation, which proves that quantum-like methods and acceleration by graphics processing units have a great relevance to machine learning.
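
    A minimal sketch of the static quantum-clustering potential that dynamic quantum clustering builds on (following Horn and Gottlieb; toy data, not the paper's GPU implementation) shows that only matrix arithmetic is involved, which is exactly what makes the method so amenable to GPU acceleration:

        import numpy as np

        # A Gaussian wave packet is placed on each data point; the potential V of
        # the Schroedinger equation whose ground state is their sum is evaluated
        # at the data points. Cluster centers appear as minima of V.
        rng = np.random.default_rng(5)
        sigma = 0.5

        # Two blobs of 2D points as toy data.
        X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])

        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)   # squared distances
        K = np.exp(-d2 / (2 * sigma ** 2))                    # Gaussian packets
        psi = K.sum(axis=1)                                   # Parzen wave function
        # Up to an additive constant, V = (sigma^2/2) * (laplacian psi) / psi
        # reduces for Gaussian packets to a kernel-weighted distance average:
        V = (K * d2).sum(axis=1) / (2 * sigma ** 2 * psi)

        print("lowest-potential points:", np.argsort(V)[:5])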

  2. Vortex particle method in parallel computations on graphical processing units used in study of the evolution of vortex structures

    International Nuclear Information System (INIS)

    Kudela, Henryk; Kosior, Andrzej

    2014-01-01

    Understanding the dynamics and the mutual interaction among various types of vortical motions is a key ingredient in clarifying and controlling fluid motion. In the paper several different cases related to vortex tube interactions are presented. Due to problems with very long computation times on the single processor, the vortex-in-cell (VIC) method is implemented on the multicore architecture of a graphics processing unit (GPU). Numerical results of leapfrogging of two vortex rings for inviscid and viscous fluid are presented as test cases for the new multi-GPU implementation of the VIC method. Influence of the Reynolds number on the reconnection process is shown for two examples: antiparallel vortex tubes and orthogonally offset vortex tubes. Our aim is to show the great potential of the VIC method for solutions of three-dimensional flow problems and that the VIC method is very well suited for parallel computation. (paper)

  3. VACTIV: A graphical dialog based program for an automatic processing of line and band spectra

    Science.gov (United States)

    Zlokazov, V. B.

    2013-05-01

    … and estimation of parameters of interest. VACTIV can run on any standard modern laptop. Reasons for the new version: At the time of its creation (1999), VACTIV was seemingly the first attempt to apply the newest programming languages and styles to systems of spectrum analysis. Its goal was both to get a convenient and efficient technique for data processing, and to elaborate the formalism of spectrum analysis in terms of classes, their properties, their methods and events of an object-oriented programming language. Summary of revisions: Compared with ACTIV, VACTIV preserves all the mathematical algorithms, but provides the user with all the benefits of an interface based on a graphical dialog. It allows the user to intervene quickly in the work of the program; in particular, to carry out on-line control of the fitting process: depending on the intermediate results and using the visual form of data representation, to change the conditions for the fitting and so achieve the optimum performance, selecting the optimum strategy. To find the best conditions for the fitting, one can compress the spectrum, delete the blunders from it, smooth it using a high-frequency spline filter and build the background using a low-frequency spline filter; one can use not only automatic methods for blunder deletion, peak search, peak model forming and calibration, but also manual mouse clicking on the spectrum graph. Restrictions: To enhance the reliability and portability of the program, the majority of the most important arrays have a static allocation; all the arrays are allocated with a surplus, and the total pool of the program is restricted only by the size of the computer's virtual memory. A spectrum has a static size of 32 K real words. The maximum size of the least-squares matrix is 314 (the maximum number of fitted parameters per analyzed spectrum interval, not for the whole spectrum), from which it follows that the maximum number of peaks in one spectrum…

  4. An Opening: Graphic Design's Discursive Spaces.

    Science.gov (United States)

    Blauvelt, Andrew

    1994-01-01

    Introduces a special issue on critical histories of graphic design with a review of the particular problems identified with the history of graphic design as a field of study and the emerging discipline of graphic design history. Makes a case for the examination of graphic design through its relationships with larger discourses. (SR)

  5. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    Directory of Open Access Journals (Sweden)

    STOJANOVIC, N.

    2014-11-01

    Full Text Available In this paper, the high-performance processing of massive geospatial data on a many-core GPU (Graphic Processing Unit) is presented. We use the CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates the improvement in performance with respect to CPU-based solutions and shows the feasibility of using GPUs and CUDA for parallel implementation of GIS algorithms over large-scale geospatial datasets.
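
    As an illustration of the viewshed primitive, here is a single line-of-sight test on a toy elevation grid (a reference sketch only; the paper's CUDA version evaluates many such rays in parallel, one per thread):

        import numpy as np

        # A target cell is visible from the observer if no terrain sample along
        # the line of sight rises above the straight sight line between them.
        def visible(dem, obs, target, obs_height=1.5):
            (r0, c0), (r1, c1) = obs, target
            n = max(abs(r1 - r0), abs(c1 - c0))
            if n == 0:
                return True
            z0 = dem[r0, c0] + obs_height
            z1 = dem[r1, c1]
            for t in (i / n for i in range(1, n)):
                r = round(r0 + t * (r1 - r0))
                c = round(c0 + t * (c1 - c0))
                sight = z0 + t * (z1 - z0)      # elevation of the sight line
                if dem[r, c] > sight:           # terrain blocks the ray
                    return False
            return True

        rng = np.random.default_rng(6)
        dem = rng.uniform(0, 10, (100, 100))    # random toy elevation model
        print(visible(dem, (0, 0), (99, 99)))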

  6. An interactive graphics program to retrieve, display, compare, manipulate, curve fit, difference and cross plot wind tunnel data

    Science.gov (United States)

    Elliott, R. D.; Werner, N. M.; Baker, W. M.

    1975-01-01

    The Aerodynamic Data Analysis and Integration System (ADAIS) is described: a highly interactive computer graphics program capable of manipulating large quantities of data such that addressable elements of a data base can be called up for graphic display, compared, curve fit, stored, retrieved, differenced, etc. The general nature of the system is evidenced by the fact that limited usage has already occurred with data bases consisting of thermodynamic, basic loads, and flight dynamics data. Productivity five times that of conventional manual methods of wind tunnel data analysis is routinely achieved using ADAIS. In wind tunnel data analysis, data from one or more runs of a particular test may be called up and displayed along with data from one or more runs of a different test. Curves may be faired through the data points by any of four methods, including cubic spline and least squares polynomial fit up to seventh order.
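
    The two fairing methods named above are easy to sketch with standard scientific Python tools (synthetic stand-in data; ADAIS itself long predates these libraries):

        import numpy as np
        from scipy.interpolate import CubicSpline

        # Cubic spline through the points, and a least-squares polynomial of up
        # to seventh order, fitted to synthetic wind-tunnel-like coefficients.
        alpha = np.linspace(-4, 12, 17)               # e.g. angle of attack [deg]
        cl = 0.1 * alpha + 0.02 * np.random.default_rng(7).standard_normal(17)

        spline = CubicSpline(alpha, cl)               # interpolating fair curve
        poly = np.poly1d(np.polyfit(alpha, cl, deg=7))  # least-squares polynomial

        grid = np.linspace(-4, 12, 100)
        print(spline(grid)[:3], poly(grid)[:3])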

  7. Graphical models for inferring single molecule dynamics

    Directory of Open Access Journals (Sweden)

    Gonzalez Ruben L

    2010-10-01

    Full Text Available Abstract Background The recent explosion of experimental techniques in single molecule biophysics has generated a variety of novel time series data requiring equally novel computational tools for analysis and inference. This article describes in general terms how graphical modeling may be used to learn from biophysical time series data using the variational Bayesian expectation maximization algorithm (VBEM). The discussion is illustrated by the example of single-molecule fluorescence resonance energy transfer (smFRET) versus time data, where the smFRET time series is modeled as a hidden Markov model (HMM) with Gaussian observables. A detailed description of smFRET is provided as well. Results The VBEM algorithm returns the model's evidence and an approximating posterior parameter distribution given the data. The former provides a metric for model selection via maximum evidence (ME), and the latter a description of the model's parameters learned from the data. ME/VBEM provide several advantages over the more commonly used approach of maximum likelihood (ML) optimized by the expectation maximization (EM) algorithm, the most important being a natural form of model selection and a well-posed (non-divergent) optimization problem. Conclusions The results demonstrate the utility of graphical modeling for inference of dynamic processes in single molecule biophysics.
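
    For concreteness, the generative model in the example (a two-state Gaussian-output HMM standing in for an smFRET trace) can be simulated and scored with the forward algorithm as below; the article's VBEM inference itself is not reproduced, and all parameter values are illustrative:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(8)

        A = np.array([[0.95, 0.05],    # state transition matrix (assumed)
                      [0.10, 0.90]])
        means = np.array([0.2, 0.8])   # FRET efficiency levels (assumed)
        sds = np.array([0.05, 0.05])

        # Simulate a trace from the hidden Markov chain.
        T, states = 500, [0]
        for _ in range(T - 1):
            states.append(rng.choice(2, p=A[states[-1]]))
        states = np.array(states)
        y = rng.normal(means[states], sds[states])

        # Scaled forward algorithm: log-likelihood of the observed trace.
        alpha = np.full(2, 0.5) * norm.pdf(y[0], means, sds)
        loglik = np.log(alpha.sum()); alpha /= alpha.sum()
        for t in range(1, T):
            alpha = (alpha @ A) * norm.pdf(y[t], means, sds)
            loglik += np.log(alpha.sum()); alpha /= alpha.sum()
        print("log-likelihood:", loglik)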

  8. Signage and wayfinding design a complete guide to creating environmental graphic design systems

    CERN Document Server

    Calori, Chris

    2015-01-01

    A new edition of the market-leading guide to signage and wayfinding design This new edition of Signage and Wayfinding Design: A Complete Guide to Creating Environmental Graphic Design Systems has been fully updated to offer you the latest, most comprehensive coverage of the environmental design process-from research and design development to project execution. Utilizing a cross-disciplinary approach that makes the information relevant to architects, interior designers, landscape architects, graphic designers, and industrial engineers alike, the book arms you with the skills needed to apply a

  9. Modeling and processing for next-generation big-data technologies with applications and case studies

    CERN Document Server

    Barolli, Leonard; Barolli, Admir; Papajorgji, Petraq

    2015-01-01

    This book covers the latest advances in Big Data technologies and provides the readers with a comprehensive review of the state-of-the-art in Big Data processing, analysis, analytics, and other related topics. It presents new models, algorithms, software solutions and methodologies, covering the full data cycle, from data gathering to their visualization and interaction, and includes a set of case studies and best practices. New research issues, challenges and opportunities shaping the future agenda in the field of Big Data are also identified and presented throughout the book, which is intended for researchers, scholars, advanced students, software developers and practitioners working at the forefront in their field.

  10. Deterministic Graphical Games Revisited

    DEFF Research Database (Denmark)

    Andersson, Daniel; Hansen, Kristoffer Arnsfelt; Miltersen, Peter Bro

    2008-01-01

    We revisit the deterministic graphical games of Washburn. A deterministic graphical game can be described as a simple stochastic game (a notion due to Anne Condon), except that we allow arbitrary real payoffs but disallow moves of chance. We study the complexity of solving deterministic graphical games and obtain an almost-linear time comparison-based algorithm for computing an equilibrium of such a game. The existence of a linear time comparison-based algorithm remains an open problem.

  11. Developing Reading Ability through Graphic Organizers [Mengembangkan Kemampuan Membaca melalui Graphic Organizers]

    Directory of Open Access Journals (Sweden)

    Pudiyono Pudiyono

    2015-08-01

    Full Text Available The goal of the research was to develop a comprehension-based reading model through graphic organizers. The subjects of the research were fourth-semester students of the English Education Department of Muhammadiyah University of Purwokerto. The test was aimed at getting clear descriptions of the students' reading comprehension. The result of the data analysis showed that most reading classes (>80%) still applied a teacher-centered approach, even though the teacher no longer ran reading classes in a purely rhetorical way. From the observation, it was clear that the exercises designed were not exactly based on contextual understanding. Exercises on difficult words were still much focused on lexically based work. Besides that, the reading instruction did not really focus on comprehension of the whole text content, as the exercises were much oriented to partial comprehension questions, for example just by asking and discussing the main idea of paragraphs. To make the instruction even worse, the teacher did not have a good mindset for applying cooperative or collaborative learning activities. Some good points were that the teacher used no reading-aloud or translation activities. The questionnaire result showed that the students' interest was not as high as expected. The pre-test result on their comprehension was only 47.50. To solve this problem, the reading classes applied graphic organizers in cooperative and collaborative learning activities. After the treatments, their comprehension developed well, from a pre-test score of 47.50 to 56.13 in the post-test, an improvement of 18.17%. After becoming well-rehearsed in reading with graphic organizers, the students did not only get better comprehension results but also had better-than-desired involvement in classroom activities. Students also commented that they had better understanding and better psychological comfort in reading English

  12. The effect of the use of graphical materials on teaching kinematics

    International Nuclear Information System (INIS)

    Yener, D.

    2005-01-01

    In this study, a review of the literature about graphical materials and kinematics was carried out. Traditional questions supported by graphical materials on kinematics were prepared and applied to 119 first-year students at the secondary education mathematics department and the physics, chemistry and biology departments of Selcuk University Educational Faculty. The effect of the usage of graphical materials on teaching kinematics was investigated. The data obtained from traditional questions and graphical questions were evaluated using SPSS (Statistical Package for the Social Sciences). At the end of this evaluation, it is evident that when kinematics is taught with graphical materials, students learn the subject better and thus solve the questions more easily and rapidly. As a result, the students were more successful at solving the questions with graphical materials than the traditional questions.

  13. Extending Graphic Statics for User-Controlled Structural Morphogenesis

    OpenAIRE

    Fivet, Corentin; Zastavni, Denis; Cap, Jean-François; Structural Morphology Group International Seminar 2011

    2011-01-01

    The first geometrical definitions of any structure are of primary importance when considering pertinence and efficiency in structural design processes. Engineering history has taught us how graphic statics can be a very powerful tool since it allows the designer to take shapes and forces into account simultaneously. However, current and past graphic statics methods are more suitable for analysis than structural morphogenesis. This contribution introduces new graphical methods that can supp...

  14. NLEdit: A generic graphical user interface for Fortran programs

    Science.gov (United States)

    Curlett, Brian P.

    1994-01-01

    NLEdit is a generic graphical user interface for the preprocessing of Fortran namelist input files. The interface consists of a menu system, a message window, a help system, and data entry forms. A form is generated for each namelist. The form has an input field for each namelist variable along with a one-line description of that variable. Detailed help information, default values, and minimum and maximum allowable values can all be displayed via menu picks. Inputs are processed through a scientific calculator program that allows complex equations to be used instead of simple numeric inputs. A custom user interface is generated simply by entering information about the namelist input variables into an ASCII file. There is no need to learn a new graphics system or programming language. NLEdit can be used as a stand-alone program or as part of a larger graphical user interface. Although NLEdit is intended for files using namelist format, it can be easily modified to handle other file formats.
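
    The core mechanism described here, deriving a data-entry form from a declarative description of namelist variables, can be sketched briefly. The following Python fragment is a minimal illustration under assumed conventions, not NLEdit's actual code; the namelist text and field names are hypothetical.

        import re

        # Hypothetical Fortran namelist with one-line descriptions as comments
        # (an invented example, not an NLEdit input file).
        NAMELIST = """
        &flow
          mach = 0.8    ! freestream Mach number
          alpha = 2.0   ! angle of attack, degrees
          niter = 500   ! maximum iterations
        /
        """

        def parse_namelist(text):
            """Extract (variable, default, description) triples from a namelist."""
            fields = []
            for line in text.splitlines():
                m = re.match(r"\s*(\w+)\s*=\s*([^!]+?)\s*(?:!\s*(.*))?$", line)
                if m:
                    fields.append({"name": m.group(1),
                                   "default": m.group(2),
                                   "help": m.group(3) or ""})
            return fields

        # In a real interface each entry would become a labeled input widget;
        # here we just print one line per generated form field.
        for field in parse_namelist(NAMELIST):
            print(f"[{field['name']:>6}] default={field['default']:<6} {field['help']}")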

  15. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT.

    Science.gov (United States)

    Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas

    2013-05-01

    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intensive and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs. Copyright © 2013 Elsevier Inc. All rights reserved.
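
    The architecture sketched in this abstract, independent Python modules backed by a central caching facility, can be illustrated compactly. This is a sketch under assumed stage names and a simple memoizing cache, not the IPLT API.

        import functools

        def cached(func):
            """Central cache stand-in: memoize a stage so repeated requests
            reuse results instead of recomputing them."""
            store = {}
            @functools.wraps(func)
            def wrapper(pattern_id):
                if pattern_id not in store:
                    store[pattern_id] = func(pattern_id)
                return store[pattern_id]
            return wrapper

        @cached
        def load_pattern(pattern_id):
            # Stand-in for reading a diffraction pattern from disk.
            return {"id": pattern_id, "raw": [0.0] * 16}

        @cached
        def index_lattice(pattern_id):
            # Each stage depends only on cached upstream results.
            return {"id": load_pattern(pattern_id)["id"], "lattice": "indexed"}

        def merge(pattern_ids):
            """Top-level driver, callable from a GUI or a command line."""
            return [index_lattice(p) for p in pattern_ids]

        print(merge(["ap0_001", "ap0_002"]))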

  16. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle in which the output of evaluation is fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  17. Atomic data for fusion

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, H.T.; Kirkpatrick, M.I.; Alvarez, I.; Cisneros, C.; Phaneuf, R.A. (eds.); Barnett, C.F.

    1990-07-01

    This report provides a handbook of recommended cross-section and rate-coefficient data for inelastic collisions between hydrogen, helium and lithium atoms, molecules and ions, and encompasses more than 400 different reactions of primary interest in fusion research. Published experimental and theoretical data have been collected and evaluated, and the recommended data are presented in tabular, graphical and parametrized form. Processes include excitation and spectral line emission, charge exchange, ionization, stripping, dissociation and particle interchange reactions. The range of collision energies is appropriate to applications in fusion-energy research.
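
    Putting such recommended data to use typically means interpolating the tabulated cross sections. A minimal sketch, assuming log-log interpolation (a common convention for smooth collision cross sections) and invented placeholder values rather than data from the handbook:

        import numpy as np

        # Hypothetical tabulated cross section: energy in eV, sigma in cm^2.
        # Values are invented placeholders, not data from the handbook.
        energy = np.array([1e2, 1e3, 1e4, 1e5])
        sigma = np.array([3e-16, 8e-16, 5e-16, 9e-17])

        def cross_section(e):
            """Interpolate in log-log space between tabulated points."""
            return 10 ** np.interp(np.log10(e), np.log10(energy), np.log10(sigma))

        print(f"sigma(5 keV) ~ {cross_section(5e3):.2e} cm^2")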

  18. Atomic data for fusion

    International Nuclear Information System (INIS)

    Hunter, H.T.; Kirkpatrick, M.I.; Alvarez, I.; Cisneros, C.; Phaneuf, R.A.; Barnett, C.F.

    1990-07-01

    This report provides a handbook of recommended cross-section and rate-coefficient data for inelastic collisions between hydrogen, helium and lithium atoms, molecules and ions, and encompasses more than 400 different reactions of primary interest in fusion research. Published experimental and theoretical data have been collected and evaluated, and the recommended data are presented in tabular, graphical and parametrized form. Processes include excitation and spectral line emission, charge exchange, ionization, stripping, dissociation and particle interchange reactions. The range of collision energies is appropriate to applications in fusion-energy research

  19. Monte Carlo MP2 on Many Graphical Processing Units.

    Science.gov (United States)

    Doran, Alexander E; Hirata, So

    2016-10-11

    In the Monte Carlo second-order many-body perturbation (MC-MP2) method, the long sum-of-product matrix expression of the MP2 energy, whose literal evaluation may be poorly scalable, is recast into a single high-dimensional integral of functions of electron pair coordinates, which is evaluated by the scalable method of Monte Carlo integration. The sampling efficiency is further accelerated by the redundant-walker algorithm, which allows a maximal reuse of electron pairs. Here, a multitude of graphical processing units (GPUs) offers a uniquely ideal platform to expose multilevel parallelism: fine-grain data-parallelism for the redundant-walker algorithm in which millions of threads compute and share orbital amplitudes on each GPU; coarse-grain instruction-parallelism for near-independent Monte Carlo integrations on many GPUs with few and infrequent interprocessor communications. While the efficiency boost by the redundant-walker algorithm on central processing units (CPUs) grows linearly with the number of electron pairs and tends to saturate when the latter exceeds the number of orbitals, on a GPU it grows quadratically before it increases linearly and then eventually saturates at a much larger number of pairs. This is because the orbital constructions are nearly perfectly parallelized on a GPU and thus completed in a near-constant time regardless of the number of pairs. In consequence, an MC-MP2/cc-pVDZ calculation of a benzene dimer is 2700 times faster on 256 GPUs (using 2048 electron pairs) than on two CPUs, each with 8 cores (which can use only up to 256 pairs effectively). We also numerically determine that the cost to achieve a given relative statistical uncertainty in an MC-MP2 energy increases as O(n³) or better with system size n, which may be compared with the O(n⁵) scaling of the conventional implementation of deterministic MP2. We thus establish the scalability of MC-MP2 with both system and computer sizes.
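
    The statistical behavior that underpins MC-MP2, namely that the error of a Monte Carlo integral falls as 1/sqrt(N) regardless of dimensionality, can be demonstrated on a toy high-dimensional integral. The integrand below is generic, not the MP2 energy expression.

        import numpy as np

        # Monte Carlo integration of exp(-|x|^2) over R^6 (exact value pi^3),
        # sampling from a standard normal and reweighting accordingly.
        rng = np.random.default_rng(0)
        dim = 6

        for n in (10_000, 40_000, 160_000):
            x = rng.normal(size=(n, dim))
            r2 = np.sum(x * x, axis=1)
            # f(x)/p(x) with p the standard-normal density in `dim` dimensions.
            samples = (2 * np.pi) ** (dim / 2) * np.exp(-r2 / 2)
            est = samples.mean()
            err = samples.std(ddof=1) / np.sqrt(n)
            print(f"N={n:>7}: {est:.4f} +/- {err:.4f} (exact {np.pi**3:.4f})")

    Quadrupling the sample count roughly halves the reported uncertainty, the 1/sqrt(N) behavior that makes the method scalable.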

  20. Using R and RStudio for data management, statistical analysis and graphics

    CERN Document Server

    Horton, Nicholas J

    2015-01-01

    This is the second edition of the popular book on using R for statistical analysis and graphics. The authors, who run a popular blog supplementing their books, have focused on adding many new examples to this new edition. These examples are presented primarily in new chapters based on the following themes: simulation, probability, statistics, mathematics/computing, and graphics. The authors have also added many other updates, including a discussion of RStudio, a very popular development environment for R.

  1. Rough surface scattering simulations using graphics cards

    International Nuclear Information System (INIS)

    Klapetek, Petr; Valtr, Miroslav; Poruba, Ales; Necas, David; Ohlidal, Miloslav

    2010-01-01

    In this article we present results of rough surface scattering calculations using a graphical processing unit implementation of the finite-difference time-domain algorithm. Numerical results are compared to real measurements, and computational performance is compared to a central processing unit implementation of the same algorithm. As a basis for the computations, atomic force microscope measurements of surface morphology are used. It is shown that the capabilities of the graphical processing unit can be used to speed up the presented computationally demanding algorithms without loss of precision.
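
    The finite-difference time-domain kernel that benefits from the GPU is a simple stencil update applied to every grid point at every time step. A one-dimensional textbook version is sketched below to show that structure; it is not the authors' surface-scattering code.

        import numpy as np

        # 1-D FDTD (Yee scheme) in normalized units with Courant number 0.5.
        nx, nt = 400, 800
        ez = np.zeros(nx)  # electric field
        hy = np.zeros(nx)  # magnetic field

        for t in range(nt):
            hy[:-1] += 0.5 * (ez[1:] - ez[:-1])           # update H from curl E
            ez[1:] += 0.5 * (hy[1:] - hy[:-1])            # update E from curl H
            ez[nx // 2] += np.exp(-((t - 30) / 10) ** 2)  # soft Gaussian source

        print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")

    Because every grid point is updated by the same arithmetic on its neighbors, the loops map naturally onto thousands of GPU threads.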

  2. Measurement of Spatial Ability in an Introductory Graphic Communications Course

    Science.gov (United States)

    Kelly, Walter F., Jr.

    2012-01-01

    Published articles on spatial ability can be found in the fields of psychology and graphics education. In the "Engineering Design Graphics Journal" for 1936-1978, six articles concerning visualization (spatial ability) were listed. As published graphics research increased, the journal (1975-1996) listed 28 articles in the visualization…

  3. Efficient Acceleration of the Pair-HMMs Forward Algorithm for GATK HaplotypeCaller on Graphics Processing Units.

    Science.gov (United States)

    Ren, Shanshan; Bertels, Koen; Al-Ars, Zaid

    2018-01-01

    GATK HaplotypeCaller (HC) is a popular variant caller, widely used to identify variants in complex genomes. However, its high variant-detection accuracy comes at the cost of long execution times. In GATK HC, the pair-HMMs forward algorithm accounts for a large percentage of the total execution time. This article proposes to accelerate the pair-HMMs forward algorithm on graphics processing units (GPUs) to improve the performance of GATK HC. It presents several GPU-based implementations of the pair-HMMs forward algorithm and analyzes their performance bottlenecks on an NVIDIA Tesla K40 card with various data sets. Based on these results and the characteristics of GATK HC, we are able to identify the GPU-based implementations with the highest performance for the various analyzed data sets. Experimental results show that the GPU-based implementations of the pair-HMMs forward algorithm achieve a speedup of up to 5.47× over existing GPU-based implementations.
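
    The pair-HMMs forward algorithm itself is a three-state dynamic program over a read and a haplotype. A heavily simplified scalar version is sketched below; the uniform transition and emission parameters are illustrative assumptions, whereas GATK HC derives them from per-base quality scores.

        import numpy as np

        def pair_hmm_forward(read, hap, p_match=0.9, gap_open=0.05, gap_ext=0.1):
            """Simplified pair-HMM forward pass returning P(read | haplotype)."""
            n, m = len(read), len(hap)
            M = np.zeros((n + 1, m + 1))  # match/mismatch state
            I = np.zeros((n + 1, m + 1))  # insertion (extra read base)
            D = np.zeros((n + 1, m + 1))  # deletion (skipped haplotype base)
            M[0, :] = 1.0 / m             # free start anywhere on the haplotype
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    emit = p_match if read[i-1] == hap[j-1] else (1 - p_match) / 3
                    M[i, j] = emit * ((1 - 2 * gap_open) * M[i-1, j-1]
                                      + (1 - gap_ext) * (I[i-1, j-1] + D[i-1, j-1]))
                    I[i, j] = 0.25 * (gap_open * M[i-1, j] + gap_ext * I[i-1, j])
                    D[i, j] = gap_open * M[i, j-1] + gap_ext * D[i, j-1]
            return M[n, :].sum() + I[n, :].sum()  # free exit at the read's end

        print(pair_hmm_forward("ACGT", "ACGGT"))

    Cells on the same anti-diagonal of these matrices are mutually independent, which is the wavefront parallelism that GPU implementations exploit.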

  4. A data processing program for transient sodium boiling and fuel failure propagation tests, (2)

    International Nuclear Information System (INIS)

    Hasebe, Takeshi; Isozaki, Tadashi; Satoh, Akihiro; Yamaguchi, Katsuhisa; Haga, Kazuo.

    1983-01-01

    Transient sodium boiling tests and fuel failure propagation tests are being conducted with the out-of-pile test facility SIENA in the Core Safety Section of the O-arai Engineering Center. The experimental data are recorded using a digital data acquisition system controlled by an HP-1000E computer. The SICILIAN (Speedy Illustration Code for Inspection Line Anomaly) code was developed to obtain quick graphic outputs of the data recorded on the magnetic tapes. The program is written in BASIC and Assembler languages and uses a data processing system composed of an HP 9845B desktop computer, a magnetic tape system, a magnetic disc and an eight-color plotter. The SICILIAN code enables us to obtain graphic outputs soon after a run. These outputs are very helpful for inspecting anomalies in the instrument circuits and for checking the experimental conditions of upcoming runs. (author)

  5. Research and development in the Institute for Data Processing in Engineering (IDT)

    International Nuclear Information System (INIS)

    Trauboth, H.

    1980-01-01

    The integration of the IDT within the nuclear research center at Karlsruhe (KfK) is of special importance because the field of nuclear engineering presents a great number of problem areas that can be dealt with successfully only by using advanced data processing technology that cannot be found on the market. Often, solving specific problems of nuclear engineering also yields novel applications and methodological results that are of use outside the nuclear field and can be applied either directly or after adaptation. The precondition for this is that new concepts, methods and tools are developed on a long-term basis in the IDT, specifically with a view to creating extremely reliable and safe data processing systems. (orig.) [de

  6. Role of computer graphics in space telerobotics - Preview and predictive displays

    Science.gov (United States)

    Bejczy, Antal K.; Venema, Steven; Kim, Won S.

    1991-01-01

    The application of computer graphics in space telerobotics research and development work is briefly reviewed and illustrated by specific examples implemented in real time operation. The applications are discussed under the following four major categories: preview displays, predictive displays, sensor data displays, and control system status displays.

  7. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    Science.gov (United States)

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  8. Graphic Organizer in Action: Solving Secondary Mathematics Word Problems

    Directory of Open Access Journals (Sweden)

    Khoo Jia Sian

    2016-09-01

    Full Text Available Mathematics word problems are among the most challenging topics to learn and teach in secondary schools. This is especially the case in countries where English is not the first language for the majority of people, such as Brunei Darussalam. Researchers have suggested that limited language proficiency and limited mathematics strategies are possible causes of this problem. Whatever the reason behind the difficulties students face in solving mathematical word problems, however, it is perhaps the teaching and learning of mathematics that needs to be modified. For example, the use of a four-square-and-a-diamond graphic organizer that infuses model-drawing skills, together with Polya’s problem-solving principles, may be one strategy that helps improve students’ word-problem-solving skills. Through quantitative analysis, this study found that the use of the graphic organizer improved students’ performance in terms of mathematical knowledge, mathematical strategy and mathematical explanation in solving word problems. Further qualitative analysis revealed that the graphic organizer boosted students’ confidence and fostered positive attitudes towards solving word problems. Keywords: Word Problems, Graphic Organizer, Algebra, Action Research, Secondary School Mathematics. DOI: http://dx.doi.org/10.22342/jme.7.2.3546.83-90

  9. Flocking-based Document Clustering on the Graphics Processing Unit

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL]; Potok, Thomas E [ORNL]; Patton, Robert M [ORNL]; ST Charles, Jesse Lee [ORNL]

    2008-01-01

    Analyzing and grouping documents by content is a complex problem. One explored method of solving this problem borrows from nature, imitating the flocking behavior of birds. Each bird represents a single document and flies toward other documents that are similar to it. One limitation of this method of document clustering is its complexity, O(n²). As the number of documents grows, it becomes increasingly difficult to receive results in a reasonable amount of time. However, flocking behavior, along with most naturally inspired algorithms such as ant colony optimization and particle swarm optimization, is highly parallel, and such algorithms have found increased performance on expensive cluster computers. In the last few years, the graphics processing unit (GPU) has received attention for its ability to solve highly parallel and semi-parallel problems much faster than the traditional sequential processor. Some applications see a huge increase in performance on this new platform. The cost of these high-performance devices is also marginal when compared with the price of cluster machines. In this paper, we have conducted research to exploit this architecture and apply its strengths to the document flocking problem. Our results highlight the potential benefit the GPU brings to all naturally inspired algorithms. Using the CUDA platform from NVIDIA, we developed a document flocking implementation to be run on the NVIDIA GeForce 8800. Additionally, we developed a similar but sequential implementation of the same algorithm to be run on a desktop CPU. We tested the performance of each on groups of news articles ranging in size from 200 to 3000 documents. The results of these tests were very significant: performance gains ranged from three to nearly five times improvement of the GPU over the CPU implementation. This dramatic improvement in runtime makes the GPU a potentially revolutionary platform for document clustering algorithms.
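
    The flocking rule described, each document-boid steering toward documents similar to it, can be sketched in a few lines. The vectors, similarity threshold and step sizes below are invented for illustration, not the paper's parameters.

        import numpy as np

        rng = np.random.default_rng(1)
        n_docs = 60
        docs = rng.random((n_docs, 20))                 # bag-of-words-like vectors
        docs /= np.linalg.norm(docs, axis=1, keepdims=True)
        pos = rng.random((n_docs, 2)) * 100             # boid positions on a 2-D field
        sim = docs @ docs.T                             # cosine similarities (static)

        for step in range(200):
            for i in range(n_docs):                     # the O(n^2) inner work
                near = (sim[i] > 0.8) & (np.arange(n_docs) != i)
                if near.any():
                    # Cohesion rule: step toward the centroid of similar documents.
                    pos[i] += 0.05 * (pos[near].mean(axis=0) - pos[i])

        print("field extent after flocking:", pos.min(axis=0), pos.max(axis=0))

    The per-boid neighbor scan is exactly the O(n²) work that the paper parallelizes across GPU threads.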

  10. ggbio: an R package for extending the grammar of graphics for genomic data

    Science.gov (United States)

    2012-01-01

    We introduce ggbio, a new methodology to visualize and explore genomics annotations and high-throughput data. The plots provide detailed views of genomic regions, summary views of sequence alignments and splicing patterns, and genome-wide overviews with karyogram, circular and grand linear layouts. The methods leverage the statistical functionality available in R, the grammar of graphics and the data handling capabilities of the Bioconductor project. The plots are specified within a modular framework that enables users to construct plots in a systematic way, and are generated directly from Bioconductor data structures. The ggbio R package is available at http://www.bioconductor.org/packages/2.11/bioc/html/ggbio.html. PMID:22937822

  11. Determining Normal-Distribution Tolerance Bounds Graphically

    Science.gov (United States)

    Mezzacappa, M. A.

    1983-01-01

    This graphical method requires only a few calculations with simple equations plus a table lookup. The distribution is established from only three points: the upper and lower confidence bounds of the mean and the lower confidence bound of the standard deviation. The graphical procedure establishes a best-fit line for the measured data and bounds for the selected confidence level and any distribution percentile.
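
    For comparison with the graphical procedure, the exact one-sided normal tolerance factor can be computed from the noncentral t distribution. The sketch below assumes SciPy is available and uses invented sample values.

        import numpy as np
        from scipy import stats

        x = np.array([9.8, 10.1, 10.0, 9.7, 10.3, 9.9, 10.2, 10.0])  # invented data
        n, mean, sd = len(x), x.mean(), x.std(ddof=1)

        p, conf = 0.95, 0.95  # cover 95% of the population with 95% confidence
        z_p = stats.norm.ppf(p)
        # Exact one-sided tolerance factor via the noncentral t distribution.
        k = stats.nct.ppf(conf, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
        print(f"upper tolerance bound: {mean + k * sd:.3f} (k = {k:.3f})")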

  12. Data from ‘Graphic Medicine’ as a Mental Health Information Resource: Insights from Comics Producers

    Directory of Open Access Journals (Sweden)

    Anthony Farthing

    2016-08-01

    Full Text Available This dataset contains the full-text transcripts of 15 semi-structured interviews (approximately 44,100 words) conducted during November and December 2014 with participants involved in various aspects of the process of health-related comics production. These participants are authors and publishers whose work is publicly recognised in the comics community. The dataset has been deposited in the Open Health Data Dataverse repository as a zipped folder containing 15 individual plain-text files, one per interview, and a ReadMe file containing contextual information and other metadata. An initial domain analysis of the interviews was published as Farthing, A., & Priego, E. (2016). ‘Graphic Medicine’ as a Mental Health Information Resource: Insights from Comics Producers. The Comics Grid: Journal of Comics Scholarship, 6(1), 3. DOI: http://doi.org/10.16995/cg.74

  13. Scientific and Graphic Design Foundations for C2

    Science.gov (United States)

    2007-06-01

    This report presents a summary of the concepts of graphic design layout, typography, color, and data graphics as they apply to structuring C2 displays and documents for ease of use: layout (literally the aesthetic of the display design) and typography (serif versus sans serif, font sizes), among the other elements of the composition. Presented at the 12th ICCRTS, “Adapting C2 to the 21st Century”. Title: Scientific and graphic design foundations for C2. Topics: C2 Concepts, Theory and

  14. Graphics metafile interface to ARAC emergency response models for remote workstation study

    International Nuclear Information System (INIS)

    Lawver, B.S.

    1985-01-01

    The Department of Energy's Atmospheric Release Advisory Capability (ARAC) models are executed on computers at a central computer center, with the output distributed to accident advisors in the field. The output of these atmospheric diffusion models is generated as contoured isopleths of concentrations. When these isopleths are overlaid with local geography, they become a useful tool for the accident-site advisor. ARAC has developed a workstation that is located at potential accident sites. The workstation allows the accident advisor to view color plots of the model results, scale those plots, and print black-and-white hardcopy of the model results. The graphics metafile, also known as a Virtual Device Metafile (VDM), allows the models to generate a single device-independent output file that is partitioned into geography, isopleths and labeling information. The metafile is a very compact data-storage technique that is independent of the output device. It frees the model from generating output for every known graphics device and from being rerun for each additional device. With the partitioned metafile, ARAC can transmit to the remote workstation the isopleths and labeling for each model; the geography database rarely changes and can be transmitted only when needed. This paper describes the important features of the remote workstation and how these features are supported by the device-independent graphics metafile

  15. MPGT - THE MISSION PLANNING GRAPHICAL TOOL

    Science.gov (United States)

    Jeletic, J. F.

    1994-01-01

    The Mission Planning Graphical Tool (MPGT) provides mission analysts with a mouse driven graphical representation of the spacecraft and environment data used in spaceflight planning. Developed by the Flight Dynamics Division at NASA's Goddard Space Flight Center, MPGT is designed to be a generic tool that can be configured to analyze any specified earth orbiting spacecraft mission. The data is presented as a series of overlays on top of a 2-dimensional or 3-dimensional projection of the earth. Up to six spacecraft orbit tracks can be drawn at one time. Position data can be obtained by either an analytical process or by use of ephemeris files. If the user chooses to propagate the spacecraft orbit using an ephemeris file, then Goddard Trajectory Determination System (GTDS) formatted ephemeris files must be supplied. The MPGT User's Guide provides a complete description of the GTDS ephemeris file format so that users can create their own. Other overlays included are ground station antenna masks, solar and lunar ephemeris, Tracking Data and Relay Satellite System (TDRSS) coverage, a field-of-view swath, and orbit number. From these graphical representations an analyst can determine such spacecraft-related constraints as communication coverage, interference zone infringement, sunlight availability, and instrument target visibility. The presentation of time and geometric data as graphical overlays on a world map makes possible quick analyses of trends and time-oriented parameters. For instance, MPGT can display the propagation of the position of the Sun and Moon over time, shadowing of sunrise/sunset terminators to indicate spacecraft and Earth day/night, and color coding of the spacecraft orbit tracks to indicate spacecraft day/night. With the 3-dimensional display, the user specifies a vector that represents the position in the universe from which the user wishes to view the earth. From these "viewpoint" parameters the user can zoom in on or rotate around the earth
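
    The basic overlay such a tool draws, an orbit track on a two-dimensional map, reduces to a short ground-track computation for an idealized circular orbit. The inclination and period below are invented examples; MPGT itself propagates orbits analytically or from GTDS ephemeris files, as described above.

        import numpy as np

        inc = np.radians(51.6)               # orbital inclination (invented)
        period = 5580.0                      # orbital period in seconds (invented)
        t = np.linspace(0, 2 * period, 400)  # two orbits
        u = 2 * np.pi * t / period           # argument of latitude
        w_earth = 2 * np.pi / 86164.0        # Earth sidereal rotation rate, rad/s

        lat = np.degrees(np.arcsin(np.sin(inc) * np.sin(u)))
        lon = np.degrees(np.arctan2(np.cos(inc) * np.sin(u), np.cos(u)) - w_earth * t)
        lon = (lon + 180) % 360 - 180        # wrap longitudes to [-180, 180)

        print(f"ground track spans latitudes [{lat.min():.1f}, {lat.max():.1f}] deg")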

  16. Eureka-DMA: an easy-to-operate graphical user interface for fast comprehensive investigation and analysis of DNA microarray data.

    Science.gov (United States)

    Abelson, Sagi

    2014-02-24

    In the past decade, the field of molecular biology has become increasingly quantitative; the rapid development of new technologies enables researchers to investigate and address, quickly and efficiently, fundamental issues that were once impossible to approach. Among these technologies, DNA microarrays provide methodology for many applications such as gene discovery, disease diagnosis, drug development and toxicological research, and their use has grown steadily since they first emerged. Multiple tools have been developed to interpret the high-throughput data produced by microarrays. However, less consideration has often been given to the fact that an extensive and effective interpretation requires close interplay between the bioinformaticians who analyze the data and the biologists who generate it. To bridge this gap and to simplify the usability of such tools we developed Eureka-DMA, an easy-to-operate graphical user interface that allows bioinformaticians and bench-biologists alike to initiate analyses as well as to investigate the data produced by DNA microarrays. In this paper, we describe Eureka-DMA, a user-friendly software package that comprises a set of methods for the interpretation of gene expression arrays. Eureka-DMA includes methods for the identification of genes with differential expression between conditions; it searches for enriched pathways and gene ontology terms and combines them with other relevant features. It thus enables a full understanding of the data for subsequent testing as well as for generating new hypotheses. Here we show two analyses, demonstrating examples of how Eureka-DMA can be used and its capability to produce relevant and reliable results. We have integrated several elementary expression analysis tools to provide a unified interface for their implementation. Eureka-DMA's simple graphical user interface provides an effective and efficient framework in which the investigator has the full set of tools for the visualization and interpretation
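
    The first step such a tool automates, flagging genes differentially expressed between two conditions, can be sketched with a per-gene t-test. The expression values are simulated, and this is a generic illustration, not Eureka-DMA's implementation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n_genes = 100
        control = rng.normal(8.0, 1.0, size=(n_genes, 5))  # log2 expression, 5 arrays
        treated = rng.normal(8.0, 1.0, size=(n_genes, 5))
        treated[:10] += 2.0                                # spike in 10 true changes

        t, p = stats.ttest_ind(treated, control, axis=1)   # per-gene two-sample test
        hits = np.flatnonzero(p < 0.01)
        print(f"{hits.size} genes flagged at p < 0.01 (10 were truly shifted)")

    In practice the p-values would also be adjusted for multiple testing (for example by controlling the false discovery rate) before genes are reported.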

  17. Use of Cloud-Based Graphic Narrative Software in Medical Ethics Teaching

    Science.gov (United States)

    Weber, Alan S.

    2015-01-01

    Although used as a common pedagogical tool in K-12 education, online graphic narrative ("comics") software has not generally been incorporated into advanced professional or technical education. This contribution reports preliminary data from a study on the use of cloud-based graphics software Pixton.com to teach basic medical ethics…

  18. Computer graphic display of cardiac CT scans

    International Nuclear Information System (INIS)

    Palmer, R.; Carlsson, E.

    1982-01-01

    In order to improve spatial conception and quantitative assessment of the cardiac structures based on cardiac computed tomography, methods for computer graphic display were developed. Excised hearts and living dogs with myocardial infarctions were subjected to CT scanning. The data on the scanner tapes were processed to provide isodensity plots, linear section plots, time-weighted integrated isodensity plots as well as topographical density displays and three-dimensional spatial reconstructions of single and multi-layer scans. (orig.)
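
    An isodensity plot of a CT slice is, at bottom, a contour plot at fixed density levels. The sketch below contours a synthetic slice; matplotlib is assumed, and the density field is invented rather than scanner data.

        import numpy as np
        import matplotlib.pyplot as plt

        # Synthetic "slice": a bright blob on a slowly varying background.
        y, x = np.mgrid[-1:1:128j, -1:1:128j]
        density = 300 * np.exp(-((x - 0.2) ** 2 + y ** 2) / 0.1) - 50 * (x**2 + y**2)

        plt.contour(x, y, density, levels=[50, 100, 200], colors="black")
        plt.gca().set_aspect("equal")
        plt.title("Isodensity contours (synthetic slice)")
        plt.savefig("isodensity.png")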

  19. Involving Research Stakeholders in Developing Policy on Sharing Public Health Research Data in Kenya

    Science.gov (United States)

    Jao, Irene; Kombe, Francis; Mwalukore, Salim; Bull, Susan; Parker, Michael; Kamuya, Dorcas; Molyneux, Sassy

    2015-01-01

    Increased global sharing of public health research data has potential to advance scientific progress but may present challenges to the interests of research stakeholders, particularly in low-to-middle income countries. Policies for data sharing should be responsive to public views, but there is little evidence of the systematic study of these from low-income countries. This qualitative study explored views on fair data-sharing processes among 60 stakeholders in Kenya with varying research experience, using a deliberative approach. Stakeholders’ attitudes were informed by perceptions of benefit and concerns for research data sharing, including risks of stigmatization, loss of privacy, and undermining scientific careers and validity, reported in detail elsewhere. In this article, we discuss institutional trust-building processes seen as central to perceptions of fairness in sharing research data in this setting, including forms of community involvement, individual prior awareness and agreement to data sharing, independence and accountability of governance mechanisms, and operating under a national framework. PMID:26297748

  20. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.