WorldWideScience

Sample records for great interactive software

  1. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  2. Software Tools For Large Scale Interactive Hydrodynamic Modeling

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; van Dam, A; Jagers, B; van der Pijl, S.; Piasecki, M.

    2014-01-01

    Developing easy-to-use software that combines components for simultaneous visualization, simulation and interaction is a great challenge, mainly because it involves a number of disciplines, such as computational fluid dynamics, computer graphics, and high-performance computing. One of the main…

  3. IDES: Interactive Data Entry System: a generalized data acquisition software package

    International Nuclear Information System (INIS)

    Gasser, S.B.

    1980-04-01

    The Interactive Data Entry System (IDES) is a software package which greatly assists in designing and storing forms to be used for the directed acquisition of data. The objective of this package is to provide a viable man/machine interface to any comprehensive database. This report provides a technical description of the software and can be used as a user's manual.

  4. Educational interactive multimedia software: The impact of interactivity on learning

    Science.gov (United States)

    Reamon, Derek Trent

    This dissertation discusses the design, development, deployment and testing of two versions of educational interactive multimedia software. Both versions of the software are focused on teaching mechanical engineering undergraduates about the fundamentals of direct-current (DC) motor physics and selection. The two versions of the Motor Workshop software cover the same basic materials on motors, but differ in the level of interactivity between the students and the software. Here, the level of interactivity refers to the particular role of the computer in the interaction between the user and the software. In one version, the students navigate through information that is organized by topic, reading text and viewing embedded video clips; this is referred to as "low-level interactivity" software because the computer simply presents the content. In the other version, the students are given a task to accomplish: they must design a small motor-driven 'virtual' vehicle that competes against computer-generated opponents. The interaction is guided by the software, which offers advice from 'experts' and provides contextual information; we refer to this as "high-level interactivity" software because the computer is actively participating in the interaction. The software was used in two sets of experiments, where students using the low-level interactivity software served as the 'control group,' and students using the highly interactive software were the 'treatment group.' Data, including pre- and post-performance tests, questionnaire responses, learning style characterizations, activity tracking logs and videotapes, were collected for analysis. Statistical and observational research methods were applied to the various data to test the hypothesis that the level of interactivity affects the learning situation, with higher levels of interactivity being more effective for learning. The results show that both the low-level and high-level interactive versions of the software were effective

  5. Design Principles for Interactive Software

    DEFF Research Database (Denmark)

    The book addresses the crucial intersection of human-computer interaction (HCI) and software engineering by asking both what users require from interactive systems and what developers need to produce well-engineered software. Needs are expressed as...

  6. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    …within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes…

  7. Interactive learning software for electrical engineering subjects ...

    African Journals Online (AJOL)

    Interactive learning software for electrical engineering subjects using MATLAB and ... Keywords: electrical engineering; MATLAB; graphic user interface (GUI); educational software.

  8. CACTI: free, open-source software for the sequential coding of behavioral interactions.

    Science.gov (United States)

    Glynn, Lisa H; Hallgren, Kevin A; Houck, Jon M; Moyers, Theresa B

    2012-01-01

    The sequential analysis of client and clinician speech in psychotherapy sessions can help to identify and characterize potential mechanisms of treatment and behavior change. Previous studies required coding systems that were time-consuming, expensive, and error-prone. Existing software can be expensive and inflexible, and furthermore, no single package allows for pre-parsing, sequential coding, and assignment of global ratings. We developed a free, open-source, and adaptable program to meet these needs: The CASAA Application for Coding Treatment Interactions (CACTI). Without transcripts, CACTI facilitates the real-time sequential coding of behavioral interactions using WAV-format audio files. Most elements of the interface are user-modifiable through a simple XML file, and can be further adapted using Java through the terms of the GNU Public License. Coding with this software yields interrater reliabilities comparable to previous methods, but at greatly reduced time and expense. CACTI is a flexible research tool that can simplify psychotherapy process research, and has the potential to contribute to the improvement of treatment content and delivery.

  9. Teaching Radiology Physics Interactively with Scientific Notebook Software.

    Science.gov (United States)

    Richardson, Michael L; Amini, Behrang

    2018-06-01

    The goal of this study is to demonstrate how the teaching of radiology physics can be enhanced with the use of interactive scientific notebook software. We used the scientific notebook software known as Project Jupyter, which is free, open-source, and available for the Macintosh, Windows, and Linux operating systems. We have created a scientific notebook that demonstrates multiple interactive teaching modules we have written for our residents using the Jupyter notebook system. Scientific notebook software allows educators to create teaching modules in a form that combines text, graphics, images, data, interactive calculations, and image analysis within a single document. These notebooks can be used to build interactive teaching modules, which can help explain complex topics in imaging physics to residents. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
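
    The notebook modules described above combine explanatory text with interactive calculation. As a minimal sketch of that pattern (not code from the authors' notebooks; the attenuation coefficient, ranges and slider bounds are illustrative), a single Jupyter cell with ipywidgets can let a resident explore exponential beam attenuation:

```python
# Minimal sketch of an interactive physics cell (not from the authors' notebooks).
# Run inside a Jupyter notebook with ipywidgets and matplotlib installed.
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import interact, FloatSlider

def beam_attenuation(mu=0.2, thickness_max=20.0):
    """Plot I/I0 = exp(-mu * x) for an illustrative linear attenuation coefficient mu (1/cm)."""
    x = np.linspace(0.0, thickness_max, 200)
    plt.figure(figsize=(5, 3))
    plt.plot(x, np.exp(-mu * x))
    plt.xlabel("absorber thickness x (cm)")
    plt.ylabel("relative transmitted intensity I/I0")
    plt.title(f"Exponential attenuation, mu = {mu:.2f} 1/cm")
    plt.ylim(0, 1)
    plt.show()

# Sliders let the learner explore how transmission changes with mu and thickness.
interact(beam_attenuation,
         mu=FloatSlider(min=0.01, max=1.0, step=0.01, value=0.2),
         thickness_max=FloatSlider(min=5.0, max=40.0, step=1.0, value=20.0))
```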

  10. Multi-physics fluid-structure interaction modelling software

    CSIR Research Space (South Africa)

    Malan, AG

    2008-11-01

    …fluid-structure interaction modelling software AG MALAN AND O OXTOBY CSIR Defence, Peace, Safety and Security, PO Box 395, Pretoria, 0001 Email: amalan@csir.co.za – www.csir.co.za Internationally leading aerospace company Airbus sponsored key components... of the development of the CSIR fluid-structure interaction (FSI) software. Below are extracts from their evaluation of the developed technology: “The field of FSI covers a massive range of engineering problems, each with their own multi-parameter, individual...

  11. Effective UI The Art of Building Great User Experience in Software

    CERN Document Server

    Anderson, Jonathan; Wilson, Robb

    2010-01-01

    People expect effortless, engaging interaction with desktop and web applications, but producing software that generates enjoyable user experiences is much harder than many companies anticipate. With Effective UI, you'll learn proven user-experience strategies that will satisfy your clients and customers, drive business value, and increase brand strength. This book shows you how to capture the collaborative and cooperative spirit among designers, engineers, and management required for building engaging software. You'll also learn valuable methods for maintaining focus throughout the process -

  12. Network Interactions in the Great Altai Region

    Directory of Open Access Journals (Sweden)

    Lev Aleksandrovich Korshunov

    2017-12-01

    To improve the efficiency and competitiveness of the regional economy, effective interaction between educational institutions in the Great Altai region is needed. Innovation growth can enhance this interaction. The article explores the state of network structures in the economy and higher education in the border territories of the countries of Great Altai. The authors propose an updated approach to the three-level classification of network interaction. We analyze the growing influence of the countries with emerging economies. We define the factors that impede the more stable and multifaceted regional development of these countries. Further, the authors determine indicators of the higher education systems and cooperation systems at the university level between the Shanghai Cooperation Organization (SCO) and BRICS countries, showing the international rankings of the universities in these countries. The teaching language is important for overcoming the obstacles to interregional cooperation. The authors specify the problems of the development of the universities of the SCO and BRICS countries as global educational networks. The research applies basic scientific logical methods of analysis and synthesis, induction and deduction, as well as the SWOT analysis method. We have identified and analyzed the existing economic and educational relations. To promote the innovative economic development of the border territories of the Great Altai, we propose a model of a regional network university. Modern universities function in a new economic environment; thus, to a great extent, they form the technological and social aspects of this environment. Innovative network structures contribute to the formation of a new network institutional environment of the regional economy, which impacts the macro- and microeconomic performance of the region as a whole. The results of the research can help to optimize the regional economies of the border…

  13. Using Interactive Software to Teach Foundational Mathematical Skills

    Directory of Open Access Journals (Sweden)

    Larysa V Lysenko

    2016-01-01

    The pilot research presented here explores the classroom use of Emerging Literacy in Mathematics (ELM) software, a research-based bilingual interactive multimedia instructional tool, and its potential to develop emerging numeracy skills. At the time of the study, a central theme of early mathematics curricula, Number Concept, was fully developed. It was broken down into five mathematical concepts: counting, comparing, adding, subtracting and decomposing. Each of these was further subdivided, yielding 22 online activities, each building in a level of complexity and abstraction. In total, 234 grade one students from 12 classes participated in the two-group post-test study that lasted about seven weeks and for which students in the experimental group used ELM for about 30 minutes weekly. The results for the final sample of 186 students showed that ELM students scored higher on the standardized math test (Canadian Achievement Test, 2008) and reported less boredom and lower anxiety as measured on the Academic Emotions Questionnaire than their peers in the control group. This short-duration pilot study of one ELM theme holds great promise for ELM’s continued development.

  14. The Case for Open Source Software: The Interactional Discourse Lab

    Science.gov (United States)

    Choi, Seongsook

    2016-01-01

    Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…

  15. Interaction between systems and software engineering in safety-critical systems

    International Nuclear Information System (INIS)

    Knight, J.

    1994-01-01

    There are three areas of concern: when is software to be considered safe; what, exactly, is the role of the software engineer; and how do systems (or sometimes applications) engineers and software engineers interact with each other. The author presents his perspective on these questions, which he feels differs from those of many in the field. He argues for a clear definition of safety in the software arena, so the engineer knows what he is engineering toward. Software must be viewed as part of the entire system, since it does not function on its own, or in isolation. He argues for the establishment of clear specifications in this area.

  16. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  17. Models of the atomic nucleus. With interactive software

    International Nuclear Information System (INIS)

    Cook, N.D.

    2006-01-01

    This book-and-CD-software package supplies users with an interactive experience for nuclear visualization via a computer-graphical interface, similar in principle to the molecular visualizations already available in chemistry. Models of the Atomic Nucleus, a largely non-technical introduction to nuclear theory, explains the nucleus in a way that makes nuclear physics as comprehensible as chemistry or cell biology. The book/software supplements virtually any of the current textbooks in nuclear physics by providing a means for 3D visual display of the diverse models of nuclear structure. For the first time, easy-to-master software for scientific visualization of the nucleus makes this notoriously 'non-visual' field immediately 'visible.' After a review of the basics, the book explores and compares the competing models, and addresses how the lattice model best resolves remaining controversies. The appendix explains how to obtain the most from the software provided on the accompanying CD. (orig.)

  18. Student Learning from Interactive Software

    Science.gov (United States)

    Lee, Kevin M.; Collaboration of Astronomy Teaching Scholars CATS

    2011-01-01

    For several years at the University of Nebraska we have been developing interactive software to teach introductory astronomy. This software includes the simulations of the Nebraska Astronomy Applet Project, the computer database of visual Think-Pair-Share questions and resources for feedback known as ClassAction, and a library of animated ranking and sorting tasks. All of these projects are publicly available for use over the web or download at http://astro.unl.edu. This presentation will highlight examples of research into student learning using these materials. Results from a multi-institution study of ClassAction using the Light and Spectra Concept Inventory in a pre/post format will be shown. Results from a second study on student learning gains, practices, and attitudes from use of animated ranking tasks focusing on lunar phases will also be included. This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

  19. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  20. Combinations of Methods for Collaborative Evaluation of the Usability of Interactive Software Systems

    Directory of Open Access Journals (Sweden)

    Andrés Solano

    2016-01-01

    Usability is a fundamental quality characteristic for the success of an interactive system. It is a concept that includes a set of metrics and methods in order to obtain easy-to-learn and easy-to-use systems. Usability Evaluation Methods, UEM, are quite diverse; their application depends on variables such as costs, time availability, and human resources. A large number of UEM can be employed to assess interactive software systems, but questions arise when deciding which method and/or combination of methods gives more (relevant) information. We propose Collaborative Usability Evaluation Methods, CUEM, following the principles defined by Collaboration Engineering. This paper analyzes a set of CUEM conducted on different interactive software systems. It proposes combinations of CUEM that provide more complete and comprehensive information about the usability of interactive software systems than those evaluation methods conducted independently.

  1. Usability Testing for Developing Effective Interactive Multimedia Software: Concepts, Dimensions, and Procedures

    Directory of Open Access Journals (Sweden)

    Sung Heum Lee

    1999-04-01

    Usability testing is a dynamic process that can be used throughout the process of developing interactive multimedia software. The purpose of usability testing is to find problems and make recommendations to improve the utility of a product during its design and development. For developing effective interactive multimedia software, dimensions of usability testing were classified into the general categories of: learnability; performance effectiveness; flexibility; error tolerance and system integrity; and user satisfaction. In the process of usability testing, evaluation experts consider the nature of users and tasks, tradeoffs supported by the iterative design paradigm, and real-world constraints to effectively evaluate and improve interactive multimedia software. Different methods address different purposes and involve a combination of user and usability testing; however, usability practitioners follow the seven general procedures of usability testing for effective multimedia development. As the knowledge about usability testing grows, evaluation experts will be able to choose more effective and efficient methods and techniques that are appropriate to their goals.

  2. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    Science.gov (United States)

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to layout and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.
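
    The abstract describes overlaying expression profiles and annotations on an interaction network and querying the result. The sketch below illustrates that integration idea in Python with networkx as a stand-in for Cytoscape's Java-based Core; the gene names, interaction types and expression values are invented for the example.

```python
# Illustration of attaching molecular states to an interaction network,
# using networkx as a stand-in for Cytoscape's Java-based Core (hypothetical data).
import networkx as nx

# Protein-protein and protein-DNA interactions (edge attribute records the type).
g = nx.Graph()
g.add_edge("GAL4", "GAL80", kind="protein-protein")
g.add_edge("GAL4", "GAL1", kind="protein-DNA")
g.add_edge("GAL4", "GAL7", kind="protein-DNA")

# Overlay an expression profile and a functional annotation on each node.
expression = {"GAL4": 0.1, "GAL80": -1.2, "GAL1": 2.3, "GAL7": 1.8}
annotation = {"GAL4": "transcription factor", "GAL80": "repressor",
              "GAL1": "galactokinase", "GAL7": "transferase"}
nx.set_node_attributes(g, expression, "log2_fold_change")
nx.set_node_attributes(g, annotation, "functional_annotation")

# Query the integrated network: neighbours of GAL4 that are strongly induced.
induced = [n for n in g.neighbors("GAL4")
           if g.nodes[n]["log2_fold_change"] > 1.0]
print("Induced GAL4 neighbours:", induced)
```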

  3. Software engineering aspects of real-time programming concepts

    Science.gov (United States)

    Schoitsch, Erwin

    1986-08-01

    Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, and interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real-time systems. Software quality and software engineering aspects are considered throughout the paper.
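
    The synchronization constructs the paper compares can be illustrated with a toy bounded-buffer producer/consumer. The sketch below uses Python threads and counting semaphores purely as an illustration of the general pattern, not the Concurrent PASCAL, MODULA, CHILL or ADA constructs the paper actually discusses.

```python
# Toy producer/consumer synchronization with counting semaphores (illustrative only).
import threading
import collections
import time

BUFFER_SIZE = 4
buffer = collections.deque()
empty_slots = threading.Semaphore(BUFFER_SIZE)  # counts free buffer slots
full_slots = threading.Semaphore(0)             # counts filled buffer slots
mutex = threading.Lock()                        # protects the buffer itself

def producer():
    for item in range(8):
        empty_slots.acquire()          # block if the buffer is full
        with mutex:
            buffer.append(item)
        full_slots.release()           # signal a consumer
        time.sleep(0.01)

def consumer():
    for _ in range(8):
        full_slots.acquire()           # block until an item is available
        with mutex:
            item = buffer.popleft()
        empty_slots.release()          # free a slot for the producer
        print("consumed", item)

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```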

  4. Interactive Multimedia Software on Fundamental Particles and Forces. Final Technical Report

    International Nuclear Information System (INIS)

    Jack Sculley

    1999-01-01

    Research in the SBIR Phase 2 grant number 95 ER 81944 centered on creating interactive multimedia software for teaching basic concepts in particle physics concerning fundamental particles and forces. The work was undertaken from February 1997 through July 1998. Overall the project has produced some very encouraging results in terms of product development, interest from the general public and interest from potential Phase 3 funders. Although the original Phase 3 publisher, McGraw Hill Home Interactive, was dissolved by its parent company, and changes in the CD-ROM industry forced a shift in focus from CD-ROM to the Internet, there has been substantial interest from software publishers and online content providers in the content developed in the course of the Phase 2 research. Results are summarized.

  5. Numerical Analyses of Subsoil-structure Interaction in Original Non-commercial Software based on FEM

    Science.gov (United States)

    Cajka, R.; Vaskova, J.; Vasek, J.

    2018-04-01

    For decades, attention has been paid to the interaction of foundation structures and subsoil and to the development of interaction models. Given that analytical solutions of subsoil-structure interaction can be deduced only for some simple shapes of load, analytical solutions are increasingly being replaced by numerical solutions (e.g. FEM, the finite element method). Numerical analysis provides greater possibilities for taking into account the real factors involved in the subsoil-structure interaction and was also used in this article. This makes it possible to design foundation structures more efficiently and still reliably and securely. Currently there are several software packages that can deal with the interaction of foundations and subsoil. It has been demonstrated that the non-commercial software called MKPINTER (created by Cajka) provides results appropriately close to actual measured values. In the MKPINTER software, a stress-strain analysis of the elastic half-space is performed by means of Gauss numerical integration and the Jacobian of the transformation. Input data for the numerical analysis were obtained from an experimental loading test of a concrete slab. The loading was performed using unique experimental equipment constructed at the Faculty of Civil Engineering, VŠB-TU Ostrava. The purpose of this paper is to compare the resulting deformation of the slab with values observed during the experimental loading test.
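
    The abstract names Gauss numerical integration and the Jacobian of the transformation as the numerical core of MKPINTER. The sketch below shows only that generic ingredient, 2x2 Gauss-Legendre integration over a bilinear quadrilateral element, with an arbitrary smooth integrand standing in for the actual elastic half-space kernel, which is not given in the abstract.

```python
# Generic 2x2 Gauss-Legendre integration over a bilinear quadrilateral element.
# The integrand here is arbitrary; MKPINTER's actual half-space kernel is not shown.
import numpy as np

def shape_functions(xi, eta):
    """Bilinear shape functions and their derivatives on the reference square [-1,1]^2."""
    n = 0.25 * np.array([(1 - xi) * (1 - eta), (1 + xi) * (1 - eta),
                         (1 + xi) * (1 + eta), (1 - xi) * (1 + eta)])
    dn_dxi = 0.25 * np.array([-(1 - eta), (1 - eta), (1 + eta), -(1 + eta)])
    dn_deta = 0.25 * np.array([-(1 - xi), -(1 + xi), (1 + xi), (1 - xi)])
    return n, dn_dxi, dn_deta

def integrate_over_quad(nodes, f):
    """Integrate f(x, y) over the quadrilateral defined by 4 corner nodes (4x2 array)."""
    gp = 1.0 / np.sqrt(3.0)                      # 2-point Gauss abscissae
    total = 0.0
    for xi in (-gp, gp):
        for eta in (-gp, gp):                    # all weights equal 1 for 2x2 Gauss
            n, dn_dxi, dn_deta = shape_functions(xi, eta)
            jac = np.array([dn_dxi @ nodes, dn_deta @ nodes])  # 2x2 Jacobian of the mapping
            det_j = np.linalg.det(jac)           # area scaling at this Gauss point
            x, y = n @ nodes                     # physical coordinates of the Gauss point
            total += f(x, y) * det_j
    return total

# Example: integrate a smooth pressure-like function over a distorted quad.
corners = np.array([[0.0, 0.0], [2.0, 0.1], [2.2, 1.9], [0.1, 2.0]])
print(integrate_over_quad(corners, lambda x, y: 1.0 + 0.2 * x * y))
```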

  6. Evaluating the Release, Delivery, and Deployment Processes of Eight Large Product Software Vendors applying the Customer Configuration Update Model

    NARCIS (Netherlands)

    Jansen, S.R.L.; Brinkkemper, S.

    2006-01-01

    For software vendors the processes of release, delivery, and deployment to customers are inherently complex. However, software vendors could greatly improve their product quality and quality of service by applying a model that focuses on customer interaction, if such a model were available. This…

  7. Using Tablet PCs and Interactive Software in IC Design Courses to Improve Learning

    Science.gov (United States)

    Simoni, M.

    2011-01-01

    This paper describes an initial study of using tablet PCs and interactive course software in integrated circuit (IC) design courses. A rapidly growing community is demonstrating how this technology can improve learning and retention of material by facilitating interaction between faculty and students via cognitive exercises during lectures. While…

  8. Create, share and learn. Experiences with free software, free culture and collaboration in formal and non formal education in Colombia

    OpenAIRE

    Medina Cardona, Luis Fernando

    2012-01-01

    New media technologies, especially software, have had a great impact on modern society. The combination of computers, software and networks as creative machines is present in everyday life. This article focuses on these interactions, especially from the perspective of free and open source software (FOSS). In doing so, the influence of its values is traced from the original hacker culture ethic to the four freedoms of free software, showing a culture explained through software, as in the software studies discipline...

  9. The Use of Flexible, Interactive, Situation-Focused Software for the E-Learning of Mathematics.

    Science.gov (United States)

    Farnsworth, Ralph Edward

    This paper discusses the classroom, home, and distance use of new, flexible, interactive, application-oriented software known as Active Learning Suite. The actual use of the software, not just a controlled experiment, is reported on. Designed for the e-learning of university mathematics, the program was developed by a joint U.S.-Russia team and…

  10. Nonfiction Reading Comprehension in Middle School: Exploring an Interactive Software Approach

    Science.gov (United States)

    Wolff, Evelyn S.; Isecke, Harriet; Rhoads, Christopher; Madura, John P.

    2013-01-01

    The struggles of students in the United States to comprehend non-fiction science text are well documented. Middle school students, in particular, have minimal instruction in comprehending nonfiction and flounder on assessments. This article describes the development process of the Readorium software, an interactive web-based program being…

  11. CASSys: an integrated software-system for the interactive analysis of ChIP-seq data

    Directory of Open Access Journals (Sweden)

    Alawi Malik

    2011-06-01

    The mapping of DNA-protein interactions is crucial for a full understanding of transcriptional regulation. Chromatin immunoprecipitation followed by massively parallel sequencing (ChIP-seq) has become the standard technique for analyzing these interactions on a genome-wide scale. We have developed a software system called CASSys (ChIP-seq data Analysis Software System) spanning all steps of ChIP-seq data analysis. It supersedes the laborious application of several single command line tools. CASSys provides functionality ranging from quality assessment and control of short reads, through the mapping of reads against a reference genome (read mapping) and the detection of enriched regions (peak detection), to various follow-up analyses. The latter are accessible via a state-of-the-art web interface and can be performed interactively by the user. The follow-up analyses allow for flexible user-defined association of putative interaction sites with genes, visualization of their genomic context with an integrated genome browser, the detection of putative binding motifs, the identification of over-represented Gene Ontology terms, pathway analysis and the visualization of interaction networks. The system is client-server based, accessible via a web browser and does not require any software installation on the client side. To demonstrate CASSys’s functionality we used the system for the complete data analysis of a publicly available ChIP-seq study that investigated the role of the transcription factor estrogen receptor-α in breast cancer cells.
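
    To make the peak-detection step less abstract, here is a toy threshold-based caller for enriched regions over a per-base coverage track. It is not CASSys code, and real ChIP-seq peak callers model background, duplicates and replicates far more carefully; the coverage data below are simulated.

```python
# Toy enriched-region ("peak") caller over a per-base read-coverage track.
import numpy as np

def call_enriched_regions(coverage, fold=3.0, min_len=5):
    """Return (start, end) intervals where coverage exceeds fold x the track-wide mean."""
    threshold = fold * coverage.mean()
    above = coverage > threshold
    regions, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                       # open a candidate region
        elif not flag and start is not None:
            if i - start >= min_len:        # keep it only if long enough
                regions.append((start, i))
            start = None
    if start is not None and len(above) - start >= min_len:
        regions.append((start, len(above)))
    return regions

# Synthetic coverage with two enriched islands.
rng = np.random.default_rng(0)
coverage = rng.poisson(2, size=200).astype(float)
coverage[40:60] += 15
coverage[120:135] += 20
print(call_enriched_regions(coverage))
```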

  12. Developer Initiation and Social Interactions in OSS: A Case Study of the Apache Software Foundation

    Science.gov (United States)

    2014-08-01


  13. P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets

    Energy Technology Data Exchange (ETDEWEB)

    Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.; Kobold, Markus A.; Stratton, Kelly G.; White, Amanda M.; Rodland, Karin D.

    2017-10-31

    P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web-service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/) and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).

  14. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    Science.gov (United States)

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

    Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface, for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http:/melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and LINUX. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
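
    Since the tool visualizes the DP scoring matrix, gap penalties and the resulting optimal alignment, a compact Needleman-Wunsch global alignment with a linear gap penalty is sketched below. The scores are illustrative defaults; the tool itself also supports other algorithm variants and user-defined similarity matrices.

```python
# Compact Needleman-Wunsch global alignment with a linear gap penalty (illustrative scores).
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    # DP scoring matrix, (n+1) x (m+1), first row/column initialised with gap penalties.
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            score[i][j] = max(score[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                              score[i - 1][j] + gap,     # gap in b
                              score[i][j - 1] + gap)     # gap in a
    # Traceback from the bottom-right corner to recover one optimal alignment.
    ai, bi, i, j = [], [], n, m
    while i > 0 or j > 0:
        s = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + s:
            ai.append(a[i - 1]); bi.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            ai.append(a[i - 1]); bi.append("-"); i -= 1
        else:
            ai.append("-"); bi.append(b[j - 1]); j -= 1
    return score[n][m], "".join(reversed(ai)), "".join(reversed(bi))

print(needleman_wunsch("GATTACA", "GCATGCU"))
```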

  15. Mvox: Interactive 2-4D medical image and graphics visualization software

    DEFF Research Database (Denmark)

    Bro-Nielsen, Morten

    1996-01-01

    Mvox is a new tool for visualization, segmentation and manipulation of a wide range of 2-4D grey level and colour images, and 3D surface graphics, which has been developed at the Department of Mathematical Modelling, Technical University of Denmark. The principal idea behind the software has been to provide a flexible tool that is able to handle all the kinds of data that are typically used in a research environment for medical imaging and visualization. At the same time the software should be easy to use and have a consistent interface, providing locally only the functions relevant to the context. This has been achieved by using Unix standards such as X/Motif/OpenGL and conforming to modern standards of interactive windowed programs...

  16. Forecasting of interaction between bee propolis and protective antigenic domain in anthrax using the software and bioinformatics web servers

    Directory of Open Access Journals (Sweden)

    Elmira Mohammadi

    2017-01-01

    Background: The protective antigen of anthrax toxin, after binding to cell receptors, plays an important role in the pathogenesis of the toxin. The purpose of this study was to investigate the interaction of the anthrax toxin protective antigen with four major components of propolis, namely caffeic acid, benzyl caffeate, cinnamic acid and kaempferol, using software and bioinformatics web servers. Methods: The three-dimensional structure of the protective antigen (receptor) was obtained from the Protein Data Bank (PDB). Four of the main components of propolis were selected as ligands and their 3D structures were obtained from the ChemSpider and ZINC compound databases. The interaction of each ligand with the receptor was assessed by the SwissDock server (http://www.swissdock.ch/) and the BSP-SLIM server (http://zhanglab.ccmb.med.umich.edu/BSP-SLIM). Docking results are reported as FullFitness values (in kcal/mol). Identification of the amino acids involved in the ligand-receptor interaction was performed using the UCSF Chimera program (http://www.cgl.ucsf.edu/). Results: The interaction between the propolis components and the protective antigen assessed by the BSP-SLIM server showed that the strongest interactions were with benzyl caffeate, caffeic acid, kaempferol and cinnamic acid, respectively. Results for ligand interaction with the protective antigen using the SwissDock server showed that caffeic acid had a ΔG of -9.10 kcal/mol and a FullFitness of -993.16 kcal/mol. The analysis of the interaction between the ligands and the amino acids of the protective antigen indicated that the interaction of caffeic acid with glutamic acid 117 had an energy of -15.5429 kcal/mol. Conclusion: Finding strong and safe inhibitors of anthrax toxin is a very useful approach for inhibiting its toxicity to cells. In this study the binding ability of four flavonoids to the protective antigen was studied. Glutamic acid 117 is very effective…

  17. Interactive graphics for the Macintosh: software review of FlexiGraphs.

    Science.gov (United States)

    Antonak, R F

    1990-01-01

    While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs, if only to obtain its easy-to-use curve-fitting and line-smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.

  18. Celeris: A GPU-accelerated open source software with a Boussinesq-type wave solver for real-time interactive simulation and visualization

    Science.gov (United States)

    Tavakkol, Sasan; Lynett, Patrick

    2017-08-01

    In this paper, we introduce an interactive coastal wave simulation and visualization software, called Celeris. Celeris is an open source software which needs minimum preparation to run on a Windows machine. The software solves the extended Boussinesq equations using a hybrid finite volume-finite difference method and supports moving shoreline boundaries. The simulation and visualization are performed on the GPU using Direct3D libraries, which enables the software to run faster than real-time. Celeris provides a first-of-its-kind interactive modeling platform for coastal wave applications and it supports simultaneous visualization with both photorealistic and colormapped rendering capabilities. We validate our software through comparison with three standard benchmarks for non-breaking and breaking waves.
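
    Celeris solves the extended Boussinesq equations with a hybrid finite volume-finite difference scheme on the GPU. As a much simpler, CPU-only illustration of the finite-volume idea (not Celeris code, and without dispersion, breaking or moving shorelines), the sketch below advances the 1D shallow-water equations with a first-order Lax-Friedrichs step.

```python
# Much simpler stand-in for Celeris: 1D non-dispersive shallow-water equations
# advanced with a first-order Lax-Friedrichs finite-volume step (CPU, numpy only).
import numpy as np

g = 9.81                      # gravitational acceleration (m/s^2)

def flux(h, hu):
    """Shallow-water flux vector F(U) for U = (h, hu)."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h * h])

def lax_friedrichs_step(h, hu, dx, dt):
    """One Lax-Friedrichs update of (h, hu) with simple reflective walls."""
    U = np.array([h, hu])
    F = flux(h, hu)
    Unew = U.copy()
    Unew[:, 1:-1] = 0.5 * (U[:, :-2] + U[:, 2:]) - dt / (2 * dx) * (F[:, 2:] - F[:, :-2])
    Unew[:, 0] = Unew[:, 1]            # reflective boundaries: copy depth,
    Unew[:, -1] = Unew[:, -2]
    Unew[1, 0] = -Unew[1, 1]           # ...and negate momentum at the walls
    Unew[1, -1] = -Unew[1, -2]
    return Unew[0], Unew[1]

# Dam-break style initial condition: a step of water that collapses and radiates waves.
nx, dx = 400, 0.5
h = np.where(np.arange(nx) * dx < 100.0, 2.0, 1.0)
hu = np.zeros(nx)
dt = 0.4 * dx / np.sqrt(g * h.max())   # CFL-limited time step
for _ in range(500):
    h, hu = lax_friedrichs_step(h, hu, dx, dt)
print("max/min depth after 500 steps:", h.max(), h.min())
```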

  19. Free software to analyse the clinical relevance of drug interactions with antiretroviral agents (SIMARV®) in patients with HIV/AIDS.

    Science.gov (United States)

    Giraldo, N A; Amariles, P; Monsalve, M; Faus, M J

    Highly active antiretroviral therapy has extended the expected lifespan of patients with HIV/AIDS. However, the therapeutic benefits of some drugs used simultaneously with highly active antiretroviral therapy may be adversely affected by drug interactions. The goal was to design and develop a free software to facilitate analysis, assessment, and clinical decision making according to the clinical relevance of drug interactions in patients with HIV/AIDS. A comprehensive Medline/PubMed database search of drug interactions was performed. Articles that recognized any drug interactions in HIV disease were selected. The publications accessed were limited to human studies in English or Spanish, with full texts retrieved. Drug interactions were analyzed, assessed, and grouped into four levels of clinical relevance according to gravity and probability. Software to systematize the information regarding drug interactions and their clinical relevance was designed and developed. Overall, 952 different references were retrieved and 446 selected; in addition, 67 articles were selected from the citation lists of identified articles. A total of 2119 pairs of drug interactions were identified; of this group, 2006 (94.7%) were drug-drug interactions, 1982 (93.5%) had an identified pharmacokinetic mechanism, and 1409 (66.5%) were mediated by enzyme inhibition. In terms of clinical relevance, 1285 (60.6%) drug interactions were clinically significant in patients with HIV (levels 1 and 2). With this information, a software program that facilitates identification and assessment of the clinical relevance of antiretroviral drug interactions (SIMARV ® ) was developed. A free software package with information on 2119 pairs of antiretroviral drug interactions was designed and developed that could facilitate analysis, assessment, and clinical decision making according to the clinical relevance of drug interactions in patients with HIV/AIDS. Copyright © 2016 Elsevier Inc. All rights reserved.
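
    The abstract states that interactions were grouped into four levels of clinical relevance according to gravity and probability, but does not give the cut-offs. The mapping below is therefore a hypothetical illustration of how such a two-axis classification could be encoded; the category labels, scores and example pairs are invented.

```python
# Hypothetical two-axis (gravity x probability) classification into four relevance
# levels; the actual SIMARV cut-offs are not given in the abstract.
GRAVITY = {"mild": 1, "moderate": 2, "severe": 3}
PROBABILITY = {"doubtful": 1, "possible": 2, "probable": 3}

def relevance_level(gravity, probability):
    """Map a (gravity, probability) pair to level 1 (most relevant) .. 4 (least relevant)."""
    score = GRAVITY[gravity] * PROBABILITY[probability]   # combined score in 1..9
    if score >= 7:
        return 1
    if score >= 5:
        return 2
    if score >= 3:
        return 3
    return 4

# Invented placeholder drug pairs, for illustration only.
pairs = [
    ("antiretroviral_A", "comedication_X", "severe", "probable"),
    ("antiretroviral_B", "comedication_Y", "moderate", "possible"),
]
for drug_a, drug_b, grav, prob in pairs:
    print(drug_a, "+", drug_b, "-> level", relevance_level(grav, prob))
```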

  20. Software engineer's pocket book

    CERN Document Server

    Tooley, Michael

    2013-01-01

    Software Engineer's Pocket Book provides a concise discussion on various aspects of software engineering. The book is comprised of six chapters that tackle various areas of concerns in software engineering. Chapter 1 discusses software development, and Chapter 2 covers programming languages. Chapter 3 deals with operating systems. The book also tackles discrete mathematics and numerical computation. Data structures and algorithms are also explained. The text will be of great use to individuals involved in the specification, design, development, implementation, testing, maintenance, and qualit

  1. Large scale continuous integration and delivery : Making great software better and faster

    NARCIS (Netherlands)

    Stahl, Daniel

    2017-01-01

    Since the inception of continuous integration, and later continuous delivery, the methods of producing software in the industry have changed dramatically over the last two decades. Automated, rapid and frequent compilation, integration, testing, analysis, packaging and delivery of new software

  2. An interactive audio-visual installation using ubiquitous hardware and web-based software deployment

    Directory of Open Access Journals (Sweden)

    Tiago Fernandes Tavares

    2015-05-01

    This paper describes an interactive audio-visual musical installation, namely MOTUS, that aims at being deployed using low-cost hardware and software. This was achieved by writing the software as a web application and using only hardware pieces that are built into most modern personal computers. This scenario implies specific technical restrictions, which lead to solutions combining both technical and artistic aspects of the installation. The resulting system is versatile and can be freely used from any computer with Internet access. Spontaneous feedback from the audience has shown that the provided experience is interesting and engaging, regardless of the use of minimal hardware.

  3. Making embedded systems design patterns for great software

    CERN Document Server

    White, Elecia

    2011-01-01

    Interested in developing embedded systems? Since they don't tolerate inefficiency, these systems require a disciplined approach to programming. This easy-to-read guide helps you cultivate a host of good development practices, based on classic software design patterns and new patterns unique to embedded programming. Learn how to build system architecture for processors, not operating systems, and discover specific techniques for dealing with hardware difficulties and manufacturing requirements. Written by an expert who's created embedded systems ranging from urban surveillance and DNA scanner

  4. Virtual pools for interactive analysis and software development through an integrated Cloud environment

    International Nuclear Information System (INIS)

    Grandi, C; Italiano, A; Salomoni, D; Melcarne, A K Calabrese

    2011-01-01

    WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need to have interactive and local access to a number of systems. WNoDeS can dynamically select these computers instantiating Virtual Machines, according to the requirements (computing, storage and network resources) of users through either the Open Cloud Computing Interface API, or through a web console. An interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In some other instances the activity concerns development and testing of services and thus implies the modification of the system configuration (and, therefore, root-access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.

  5. «Great Eurasia»: interests and possibilities of Russia at interaction with China

    Directory of Open Access Journals (Sweden)

    Elena M. Kuzmina

    2017-01-01

    The author analyzes the regulatory framework proposed by President Putin's strategy of «Great Eurasia» within Russian official and economic policy frameworks and development programmes, and examines internal and external factors affecting the Eurasian Union as a base for building relationships within «Great Eurasia». The author emphasizes the EAEU's interest in developing economic relations with the countries of «Great Eurasia» through the Union. He also analyzes the dynamics of the negotiation process for the conclusion of trade and economic agreements between the EEC and China, and the challenges and opportunities of member countries in economic relations with China. The level of compatibility between the Eurasian Union and the «One Belt and One Road» initiative is also considered. A separate part of the article is devoted to the economic cooperation of Russia and China and their possible interaction in the construction of Great Eurasia.

  6. GRAVE: An Interactive Geometry Construction and Visualization Software System for the TORT Nuclear Radiation Transport Code

    International Nuclear Information System (INIS)

    Blakeman, E.D.

    2000-01-01

    A software system, GRAVE (Geometry Rendering and Visual Editor), has been developed at the Oak Ridge National Laboratory (ORNL) to perform interactive visualization and development of models used as input to the TORT three-dimensional discrete ordinates radiation transport code. Three-dimensional and two-dimensional visualization displays are included. Display capabilities include image rotation, zoom, translation, wire-frame and translucent display, geometry cuts and slices, and display of individual component bodies and material zones. The geometry can be interactively edited and saved in TORT input file format. This system is an advancement over the current, non-interactive, two-dimensional display software. GRAVE is programmed in the Java programming language and can be implemented on a variety of computer platforms. Three-dimensional visualization is enabled through the Visualization Toolkit (VTK), a freeware C++ software library developed for geometric and data visual display. Future plans include an extension of the system to read inputs using binary zone maps and combinatorial geometry models containing curved surfaces, such as those used for Monte Carlo code inputs. Also, GRAVE will be extended to geometry visualization/editing for the DORT two-dimensional transport code and will be integrated into a single GUI-based system for all of the ORNL discrete ordinates transport codes.

  7. Software Design for Interactive Graphic Radiation Treatment Simulation Systems*

    Science.gov (United States)

    Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan

    1990-01-01

    We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.

  8. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at the source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interface and mathematical function libraries. Both development environments are multiplatform oriented and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language which supports designing and developing performant and interactive frameworks for general-purpose software solutions, through the Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.
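
    The paper embeds R inside Java; the snippet below illustrates the same embedding pattern from Python via rpy2, a deliberate swap of host language used only to show the idea of pushing data into an embedded R session, calling its statistical functions and evaluating arbitrary R code.

```python
# Embedding R in a host language: the paper does this from Java; here the same
# pattern is illustrated from Python via rpy2 (a deliberate host-language swap).
import rpy2.robjects as ro

# Push data from the host language into the embedded R session.
heights = ro.FloatVector([1.62, 1.75, 1.80, 1.68, 1.90])

# Call R's statistical functions directly and pull scalar results back out.
mean_r = ro.r["mean"](heights)[0]
sd_r = ro.r["sd"](heights)[0]
print("mean =", mean_r, "sd =", sd_r)

# Arbitrary R code (e.g. fitting a model on R's built-in 'cars' data) can also
# be evaluated as a string.
ro.r("fit <- lm(dist ~ speed, data = cars)")
print(ro.r("coef(fit)"))
```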

  9. GNSS Software Receiver for UAVs

    DEFF Research Database (Denmark)

    Olesen, Daniel Madelung; Jakobsen, Jakob; von Benzon, Hans-Henrik

    2016-01-01

    This paper describes the current activities of GPS/GNSS software receiver development at DTU Space. GNSS software receivers have received a great deal of attention in the last two decades and numerous implementations have already been presented. DTU Space has recently started development of its own GNSS software receiver targeted for mini UAV applications, and we will in this paper present our current progress and briefly discuss the benefits of software receivers in relation to our research interests.

  10. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

    Background: Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) also have hundreds of cores and have recently been used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings: Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes: (1) the interaction of SNPs within it in parallel, and (2) the interaction between the SNPs of the current fragment and other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode run. Conclusions: GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
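
    As a rough, CPU-only sketch of the pairwise step that GENIE parallelizes (not GENIE's actual statistic or data layout), the code below scores each SNP pair with a chi-square test on the joint-genotype counts in cases versus controls, spreading the pairs over processor cores with multiprocessing; the genotype and phenotype data are simulated.

```python
# Rough CPU-only sketch of pairwise SNP-SNP interaction screening for a binary trait.
# Not GENIE's actual statistic: each pair is scored with a chi-square test on the
# 9 joint-genotype counts in cases versus controls, with pairs spread over CPU cores.
from itertools import combinations
from multiprocessing import Pool
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(1)
N_SNPS, N_SUBJECTS = 50, 400
genotypes = rng.integers(0, 3, size=(N_SUBJECTS, N_SNPS))   # 0/1/2 minor-allele counts
phenotype = rng.integers(0, 2, size=N_SUBJECTS)             # 0 = control, 1 = case

def pair_statistic(pair):
    i, j = pair
    joint = 3 * genotypes[:, i] + genotypes[:, j]            # 9 joint genotype classes
    table = np.zeros((2, 9))
    for status in (0, 1):
        counts = np.bincount(joint[phenotype == status], minlength=9)
        table[status] = counts + 0.5                         # small pseudo-count
    chi2, p, dof, expected = chi2_contingency(table)
    return i, j, p

if __name__ == "__main__":
    pairs = list(combinations(range(N_SNPS), 2))
    with Pool() as pool:                                     # one worker per CPU core
        results = pool.map(pair_statistic, pairs)
    results.sort(key=lambda r: r[2])
    print("most significant pairs:", results[:3])
```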

  11. Activities Contributing a Great Deal to the Students' Interactive Skills in Foreign Language Classes

    Science.gov (United States)

    Asatryan, Susanna

    2016-01-01

    While teaching speaking, it is desirable to provide a rich classroom environment in which meaningful communication can take place. With this aim, various speaking activities can contribute a great deal to students in developing the interactive skills necessary for life. These activities make students active in the learning process and at the same time…

  12. Data Analysis Tools and Methods for Improving the Interaction Design in E-Learning

    Science.gov (United States)

    Popescu, Paul Stefan

    2015-01-01

    In this digital era, learning from data gathered from different software systems may have a great impact on the quality of the interaction experience. There are two main directions that enhance this emerging research domain: Intelligent Data Analysis (IDA) and Human-Computer Interaction (HCI). HCI-specific research methodologies can be…

  13. Redrawing the map of Great Britain from a network of human interactions.

    Science.gov (United States)

    Ratti, Carlo; Sobolevsky, Stanislav; Calabrese, Francesco; Andris, Clio; Reades, Jonathan; Martino, Mauro; Claxton, Rob; Strogatz, Steven H

    2010-12-08

    Do regional boundaries defined by governments respect the more natural ways that people interact across space? This paper proposes a novel, fine-grained approach to regional delineation, based on analyzing networks of billions of individual human transactions. Given a geographical area and some measure of the strength of links between its inhabitants, we show how to partition the area into smaller, non-overlapping regions while minimizing the disruption to each person's links. We tested our method on the largest non-Internet human network, inferred from a large telecommunications database in Great Britain. Our partitioning algorithm yields geographically cohesive regions that correspond remarkably well with administrative regions, while unveiling unexpected spatial structures that had previously only been hypothesized in the literature. We also quantify the effects of partitioning, showing for instance that the effects of a possible secession of Wales from Great Britain would be twice as disruptive for the human network than that of Scotland.
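
    A small stand-in for the partitioning idea, cutting a network into regions while disrupting each node's links as little as possible, is modularity-based community detection. The sketch below applies networkx's greedy modularity algorithm to a tiny synthetic weighted graph; it is not the authors' algorithm or telecommunications data, and the place names and weights are invented.

```python
# Small stand-in for network-based regional partitioning: modularity communities
# on a synthetic weighted graph (not the authors' algorithm or telecom data).
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Two loosely connected clusters of "places"; edge weights mimic interaction strength.
g = nx.Graph()
west = ["Cardiff", "Swansea", "Newport"]
east = ["London", "Reading", "Oxford"]
for group in (west, east):
    for a in group:
        for b in group:
            if a < b:
                g.add_edge(a, b, weight=10.0)     # strong within-cluster ties
g.add_edge("Newport", "Reading", weight=1.0)      # weak cross-cluster tie

regions = greedy_modularity_communities(g, weight="weight")
for k, region in enumerate(regions):
    print(f"region {k}:", sorted(region))
```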

  14. Redrawing the map of Great Britain from a network of human interactions.

    Directory of Open Access Journals (Sweden)

    Carlo Ratti

    2010-12-01

    Do regional boundaries defined by governments respect the more natural ways that people interact across space? This paper proposes a novel, fine-grained approach to regional delineation, based on analyzing networks of billions of individual human transactions. Given a geographical area and some measure of the strength of links between its inhabitants, we show how to partition the area into smaller, non-overlapping regions while minimizing the disruption to each person's links. We tested our method on the largest non-Internet human network, inferred from a large telecommunications database in Great Britain. Our partitioning algorithm yields geographically cohesive regions that correspond remarkably well with administrative regions, while unveiling unexpected spatial structures that had previously only been hypothesized in the literature. We also quantify the effects of partitioning, showing for instance that the effects of a possible secession of Wales from Great Britain would be twice as disruptive for the human network as that of Scotland.

  15. Interactive software automates personalized radiation safety plans for Na131I therapy.

    Science.gov (United States)

    Friedman, Marvin I; Ghesani, Munir

    2002-11-01

    NRC regulations have liberalized the criteria for release from control of patients administered radioactive materials but require written radiation safety instruction if another individual is expected to receive more than 1 mSv. This necessitates calculation of expected doses, even when the calculated maximum likely dose is well below the 5 mSv release criterion. NRC interpretations of the regulation provide the biokinetic model to be used to evaluate the release criterion for patients administered Na131I, but do not provide guidance as to either the specifics of minimizing the dose of others or the length of time restrictions should remain in effect. Interactive software has been developed to facilitate creation of radiation safety plans tailored to patients' expected interactions. Day-by-day and cumulative effective exposures at several separation distances, including sleeping, are presented in grid format in a graphic interface. In an interview session, the patient proposes daily contacts, which are entered separately for each individual by point-and-click operation. Total dose estimates are accumulated and modified while negotiating contact schedules, guided by suggested age-specific limits. The software produces printed radiation safety recommendations specific to the clinical, dosing, and social situations and reflective of the patient's choice of combinations of close contact with others. It has been used in treating more than 100 patients and has been found to be very useful and well received.
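
    A minimal sketch of the day-by-day dose bookkeeping such software automates is given below. It assumes a point source, inverse-square geometry and a single effective half-life; the dose-rate constant and half-life used here are illustrative placeholders, not the NRC biokinetic model the abstract refers to.

```python
# Simplified sketch of day-by-day external dose bookkeeping for one contact of
# a Na-131I therapy patient. Point source, inverse-square geometry and a single
# effective half-life are illustrative assumptions, not the NRC model.
import math

GAMMA = 0.06        # assumed dose-rate constant, mSv*m^2/(GBq*h)
T_EFF_DAYS = 5.0    # assumed effective half-life (physical decay + biological clearance)

def contact_dose_schedule(activity_gbq, distance_m, hours_per_day, n_days):
    """Day-by-day and cumulative external dose (mSv) for one contact pattern."""
    lam = math.log(2) / T_EFF_DAYS
    cumulative, rows = 0.0, []
    for day in range(n_days):
        activity = activity_gbq * math.exp(-lam * day)          # activity at start of day
        daily = GAMMA * activity / distance_m ** 2 * hours_per_day
        cumulative += daily
        rows.append((day + 1, daily, cumulative))
    return rows

# Example: 3.7 GBq administered, a caregiver at 1 m for 2 h/day over 14 days.
rows = contact_dose_schedule(3.7, 1.0, 2.0, 14)
for day, daily, total in rows:
    print(f"day {day:2d}: {daily:5.3f} mSv today, {total:5.3f} mSv cumulative")
if rows[-1][2] > 1.0:
    print("written radiation safety instructions would be required (dose > 1 mSv)")
```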

  16. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  17. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal independent and is even to a great extent multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build a faster, but (of course) less flexible code from these table definitions. By separating the AIDA software in a source and a run-time version, one is able to write

  18. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  19. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of nuclear Instrumentation and Control (I and C) system requires high reliability of not only hardware but also software. Verification and Validation (V and V) process is recommended for software reliability. But a more quantitative method is necessary such as software testing. Most of software in the nuclear I and C system is safety-critical embedded software. Safety-critical embedded software is specified, verified and developed according to V and V process. Hence two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code. Second, after code-based software testing, software testing affected by hardware is required to reveal the interaction fault that may cause unexpected results. We call the testing of hardware's influence on software, an interaction testing. In case of safety-critical embedded software, it is also important to consider the interaction between hardware and software. Even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study, and show the effectiveness of it. (author)
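
    As a concrete illustration of the interaction testing the authors describe, the sketch below exercises the same piece of safety logic against a healthy hardware stub and against a stub that injects a hardware-level fault. All class and function names are hypothetical; the point is only that the fault surfaces when hardware behaviour and software logic are tested in combination.

```python
# Illustrative sketch of "interaction testing": the same software logic is run
# once against a healthy hardware stub and once against a stub that injects a
# hardware-level fault, exposing failures that neither hardware testing nor
# code-based software testing alone would reveal. Names are hypothetical.
class HealthySensor:
    def read_temperature(self):
        return 293.0                       # kelvin

class StuckSensor(HealthySensor):
    def read_temperature(self):
        return float("nan")                # hardware fault: stuck/invalid reading

def trip_logic(sensor, limit=350.0):
    """Safety software under test: should fail safe (trip) on any bad reading."""
    value = sensor.read_temperature()
    if value != value:                     # NaN check: invalid hardware data
        return True                        # fail-safe trip
    return value > limit

def test_software_alone():
    assert trip_logic(HealthySensor()) is False

def test_hardware_software_interaction():
    # Would expose the interaction fault if the NaN guard were missing.
    assert trip_logic(StuckSensor()) is True

if __name__ == "__main__":
    test_software_alone()
    test_hardware_software_interaction()
    print("both code-based and interaction tests passed")
```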

  20. Calculation of Rydberg interaction potentials

    International Nuclear Information System (INIS)

    Weber, Sebastian; Büchler, Hans Peter; Tresp, Christoph; Urvoy, Alban; Hofferberth, Sebastian; Menke, Henri; Firstenberg, Ofer

    2017-01-01

    The strong interaction between individual Rydberg atoms provides a powerful tool exploited in an ever-growing range of applications in quantum information science, quantum simulation and ultracold chemistry. One hallmark of the Rydberg interaction is that both its strength and angular dependence can be fine-tuned with great flexibility by choosing appropriate Rydberg states and applying external electric and magnetic fields. More and more experiments are probing this interaction at short atomic distances or with such high precision that perturbative calculations as well as restrictions to the leading dipole–dipole interaction term are no longer sufficient. In this tutorial, we review all relevant aspects of the full calculation of Rydberg interaction potentials. We discuss the derivation of the interaction Hamiltonian from the electrostatic multipole expansion, numerical and analytical methods for calculating the required electric multipole moments and the inclusion of electromagnetic fields with arbitrary direction. We focus specifically on symmetry arguments and selection rules, which greatly reduce the size of the Hamiltonian matrix, enabling the direct diagonalization of the Hamiltonian up to higher multipole orders on a desktop computer. Finally, we present example calculations showing the relevance of the full interaction calculation to current experiments. Our software for calculating Rydberg potentials including all features discussed in this tutorial is available as open source. (tutorial)
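
    For orientation, the leading term of the multipole expansion discussed in the tutorial is the dipole-dipole interaction between the two atoms; in the non-resonant, well-separated regime it reduces to the familiar van der Waals form:

```latex
V_{dd}(\mathbf{R}) \;=\; \frac{1}{4\pi\varepsilon_0 R^{3}}
  \left[\, \mathbf{d}_1 \cdot \mathbf{d}_2
        \;-\; 3\,(\mathbf{d}_1 \cdot \hat{\mathbf{n}})(\mathbf{d}_2 \cdot \hat{\mathbf{n}}) \right],
\qquad
V(R) \;\approx\; -\,\frac{C_6}{R^{6}},
```

    where \(\mathbf{d}_{1,2}\) are the dipole operators of the two atoms, \(\hat{\mathbf{n}}\) is the unit vector along the interatomic axis, and \(C_6\) is the van der Waals coefficient obtained from second-order perturbation theory. The tutorial's full treatment goes beyond this leading term to higher multipole orders.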

  1. The SCEC/UseIT Intern Program: Creating Open-Source Visualization Software Using Diverse Resources

    Science.gov (United States)

    Francoeur, H.; Callaghan, S.; Perry, S.; Jordan, T.

    2004-12-01

    The Southern California Earthquake Center undergraduate IT intern program (SCEC UseIT) conducts IT research to benefit collaborative earth science research. Through this program, interns have developed real-time, interactive, 3D visualization software using open-source tools. Dubbed LA3D, a distribution of this software is now in use by the seismic community. LA3D enables the user to interactively view Southern California datasets and models of importance to earthquake scientists, such as faults, earthquakes, fault blocks, digital elevation models, and seismic hazard maps. LA3D is now being extended to support visualizations anywhere on the planet. The new software, called SCEC-VIDEO (Virtual Interactive Display of Earth Objects), makes use of a modular, plugin-based software architecture which supports easy development and integration of new data sets. Currently SCEC-VIDEO is in beta testing, with a full open-source release slated for the future. Both LA3D and SCEC-VIDEO were developed using a wide variety of software technologies. These, which included relational databases, web services, software management technologies, and 3-D graphics in Java, were necessary to integrate the heterogeneous array of data sources which comprise our software. Currently the interns are working to integrate new technologies and larger data sets to increase software functionality and value. In addition, both LA3D and SCEC-VIDEO allow the user to script and create movies. Thus program interns with computer science backgrounds have been writing software while interns with other interests, such as cinema, geology, and education, have been making movies that have proved of great use in scientific talks, media interviews, and education. Thus, SCEC UseIT incorporates a wide variety of scientific and human resources to create products of value to the scientific and outreach communities. The program plans to continue with its interdisciplinary approach, increasing the relevance of the

  2. FACET: A simulation software framework for modeling complex societal processes and interactions

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-02

    FACET, the Framework for Addressing Cooperative Extended Transactions, was developed at Argonne National Laboratory to address the need for a simulation software architecture in the style of an agent-based approach, but with sufficient robustness, expressiveness, and flexibility to be able to deal with the levels of complexity seen in real-world social situations. FACET is an object-oriented software framework for building models of complex, cooperative behaviors of agents. It can be used to implement simulation models of societal processes such as the complex interplay of participating individuals and organizations engaged in multiple concurrent transactions in pursuit of their various goals. These transactions can be patterned on, for example, clinical guidelines and procedures, business practices, government and corporate policies, etc. FACET can also address other complex behaviors such as biological life cycles or manufacturing processes. To date, for example, FACET has been applied to such areas as land management, health care delivery, avian social behavior, and interactions between natural and social processes in ancient Mesopotamia.

  3. Frameworks for user - developer interactions in a software ...

    African Journals Online (AJOL)

    The dependence of today's society on Information and Communications technology has necessitated the need for software project managers to strive for continuous process improvement. A major challenge faced by most software project managers especially in developing countries however centers on effective ...

  4. Learning Photogrammetry with Interactive Software Tool PhoX

    Directory of Open Access Journals (Sweden)

    T. Luhmann

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program in order to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  5. Learning Photogrammetry with Interactive Software Tool PhoX

    Science.gov (United States)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology where high quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that generate nice results easily, e.g. by structure-from-motion approaches. Within this context, the classical approach of teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals with new topics and provide them with more information behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features and 3D objects. It allows for almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program in order to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and the direct view of the resulting effect in image or object space.

  6. A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH

    Science.gov (United States)

    Sadasivam, Rajani S.; Tanik, Murat M.

    2013-01-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436

  7. A meta-composite software development approach for translational research.

    Science.gov (United States)

    Sadasivam, Rajani S; Tanik, Murat M

    2013-06-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.

  8. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

    The diversity of application domains calls for a sustainable classification scheme for the ever-growing body of software repositories. Atomic, reusable software components are articulated to improve software component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve software reusability. We analyse the semantics of elements in the Periodic Table used in chemistry to design our classification approach, present it as a tree-based classification to curtail the search space of the software repository, and further refine it with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing the functions and related components, and exploited the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Our approach is inspired by the chemical periodic table. We propose a software periodic table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to program their software by customizing the ingredients of their software requirements. The classified repository of software ingredients helps users convey their requirements to software engineers and enables requirements engineers to develop large-scale prototypes rapidly. Furthermore, we predict the usability of the categorized repository based on user feedback. The proposed repository will be continuously fine-tuned based on utilization, and the SPT will be gradually optimized by ant colony optimization techniques, ultimately helping to automate the software development process.
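
    The sketch below gives one hypothetical reading of the repository structure described above: components indexed by a GUID and filed in a classification tree that can be searched by functionality. It is an illustration of the idea, not the authors' implementation, and every name in it is invented.

```python
# Illustrative sketch of a GUID-indexed, tree-classified component repository
# in the spirit of the "software periodic table" described above; all names
# and the classification scheme here are assumptions made for illustration.
import uuid
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str
    functionality: str                      # the "radical functionality" it provides
    guid: str = field(default_factory=lambda: str(uuid.uuid4()))

@dataclass
class CategoryNode:
    label: str
    children: dict = field(default_factory=dict)    # label -> CategoryNode
    components: list = field(default_factory=list)

    def add(self, path, component):
        """Insert a component under a category path, e.g. ('io', 'parsing')."""
        node = self
        for label in path:
            node = node.children.setdefault(label, CategoryNode(label))
        node.components.append(component)

    def search(self, keyword):
        """Walk the tree and return components whose functionality mentions keyword."""
        hits = [c for c in self.components if keyword in c.functionality]
        for child in self.children.values():
            hits.extend(child.search(keyword))
        return hits

repo = CategoryNode("root")
repo.add(("io", "parsing"), Component("CsvReader", "parsing csv files"))
repo.add(("math", "stats"), Component("Mean", "computing arithmetic mean"))
print([(c.name, c.guid) for c in repo.search("csv")])
```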

  9. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  10. Exploiting Software Tool Towards Easier Use And Higher Efficiency

    Science.gov (United States)

    Lin, G. H.; Su, J. T.; Deng, Y. Y.

    2006-08-01

    In developing countries, making maximum use of data from instruments built in-house is very important: it maximizes both the science return on earlier investment -- the deep accumulation of expertise in every aspect -- and the scientific output itself. Based on this idea, we are developing a software package called THDP (Tool of Huairou Data Processing), which handles a series of tasks that routinely arise in data processing. This paper discusses its design goals, functions, methods and special features. The primary vehicle for general data interpretation is a set of data visualization and interactive techniques. The software employs an object-oriented approach, which suits this vehicle; it is imperative that the approach provide not only functionality but do so as conveniently as possible. As a result, the software makes data processing easier to learn for beginners and easier to extend for experienced users, and it greatly increases efficiency in every phase, including analysis, parameter adjustment and result display. Within the virtual observatory framework, developing countries should study more new related technologies that can advance the ability and efficiency of scientific research, such as the software we are developing.

  11. Energy-Aware Software Engineering

    DEFF Research Database (Denmark)

    Eder, Kerstin; Gallagher, John Patrick

    2017-01-01

    A great deal of energy in Information and Communication Technology (ICT) systems can be wasted by software, regardless of how energy-efficient the underlying hardware is. To avoid such waste, programmers need to understand the energy consumption of programs during the development process rather......, the chapter discusses how energy analysis and modelling techniques can be incorporated in software engineering tools, including existing compilers, to assist the energy-aware programmer to optimise the energy consumption of code....

  12. Simulation software support (S3) system: a software testing and debugging tool

    International Nuclear Information System (INIS)

    Burgess, D.C.; Mahjouri, F.S.

    1990-01-01

    The largest percentage of technical effort in the software development process is accounted for by debugging and testing. It is not unusual for a software development organization to spend over 50% of the total project effort on testing. In the extreme, testing of human-rated software (e.g., nuclear reactor monitoring, training simulator) can cost three to five times as much as all other software engineering steps combined. The Simulation Software Support (S3) System, developed by the Link-Miles Simulation Corporation, is ideally suited for real-time simulation applications which involve a large database with models programmed in FORTRAN. This paper focuses on the testing elements of the S3 system. System support software utilities are provided which enable the loading and execution of modules in the development environment. These elements include the Linking/Loader (LLD) for dynamically linking program modules and loading them into memory and the Interactive Executive (IEXEC) for controlling the execution of the modules. Features of the Interactive Symbolic Debugger (SD) and the Real Time Executive (RTEXEC) to support unit and integrated testing are also explored.

  13. 350 Years of Fire-Climate-Human Interactions in a Great Lakes Sandy Outwash Plain

    Directory of Open Access Journals (Sweden)

    Richard P. Guyette

    2016-08-01

    Throughout much of eastern North America, quantitative records of historical fire regimes and interactions with humans are absent. Annual resolution fire scar histories provide data on fire frequency, extent, and severity, but also can be used to understand fire-climate-human interactions. This study used tree-ring dated fire scars from red pines (Pinus resinosa) at four sites in the Northern Sands Ecological Landscapes of Wisconsin to quantify the interactions among fire occurrence and seasonality, drought, and humans. New methods for assessing the influence of human ignitions on fire regimes were developed. A temporal and spatial index of wildland fire was significantly correlated (r = 0.48) with drought indices (Palmer Drought Severity Index, PDSI). Fire intervals varied through time with human activities that included early French Jesuit missions, European trade (fur), diseases, war, and land use. Comparisons of historical fire records suggest that annual climate in this region has a broad influence on the occurrence of fire years in the Great Lakes region.

  14. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta vision of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  15. Exploring classical Greek construction problems with interactive geometry software

    CERN Document Server

    Meskens, Ad

    2017-01-01

    In this book the classical Greek construction problems are explored in a didactical, enquiry based fashion using Interactive Geometry Software. The book traces the history of these problems, stating them in modern terminology. By focusing on constructions and the use of GeoGebra the reader is confronted with the same problems that ancient mathematicians once faced. The reader can step into the footsteps of Euclid, Viète and Cusanus amongst others and then by experimenting and discovering geometric relationships far exceed their accomplishments. Exploring these problems with the neusis-method lets him discover a class of interesting curves. By experimenting he will gain a deeper understanding of how mathematics is created. More than 100 exercises guide him through methods which were developed to try and solve the problems. The exercises are at the level of undergraduate students and only require knowledge of elementary Euclidean geometry and pre-calculus algebra. It is especially well-suited for those student...

  16. Software Development as Music Education Research

    Science.gov (United States)

    Brown, Andrew R.

    2007-01-01

    This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…

  17. Software Products - Naval Oceanography Portal

    Science.gov (United States)

    USNO Astronomical Applications Software Products page, listing MICA, the Multiyear Interactive Computer Almanac (available on CD-ROM).

  18. The development of WIPPVENT, a windows based interactive mine ventilation simulation software program at the Waste Isolation Pilot Plant

    International Nuclear Information System (INIS)

    McDaniel, K.H.

    1995-01-01

    An interactive mine ventilation simulation software program (WIPPVENT) was developed at the Waste Isolation Pilot Plant (WIPP). The WIPP is a US Department of Energy (DOE) research and development project located near Carlsbad, New Mexico. The facility is designed to provide permanent, safe underground disposal of US defense-generated transuranic waste in bedded salt. In addition to its regular functions, the underground ventilation system is engineered to prevent the uncontrolled spread of radioactive materials in the unlikely event of a release. To enhance the operability of the system, Westinghouse Electric Corporation has developed an interactive mine ventilation simulation software program (WIPPVENT). While WIPPVENT includes most of the functions of the commercially available simulation program VNETPC (copyright 1991 Mine Ventilation Services, Inc.), the user interface has been completely rewritten as a Windows® application and screen graphics have been added. WIPPVENT is designed to interact with the WIPP ventilation monitoring systems through the site-wide Central Monitoring System.

  19. Laptop Use, Interactive Science Software, and Science Learning Among At-Risk Students

    Science.gov (United States)

    Zheng, Binbin; Warschauer, Mark; Hwang, Jin Kyoung; Collins, Penelope

    2014-08-01

    This year-long, quasi-experimental study investigated the impact of the use of netbook computers and interactive science software on fifth-grade students' science learning processes, academic achievement, and interest in further science, technology, engineering, and mathematics (STEM) study within a linguistically diverse school district in California. Analysis of students' state standardized science test scores indicated that the program helped close gaps in scientific achievement between at-risk learners (i.e., English learners, Hispanics, and free/reduced-lunch recipients) and their counterparts. Teacher and student interviews and classroom observations suggested that computer-supported visual representations and interactions supported diverse learners' scientific understanding and inquiry and enabled more individualized and differentiated instruction. Finally, interviews revealed that the program had a positive impact on students' motivation in science and on their interest in pursuing science-related careers. This study suggests that technology-facilitated science instruction is beneficial for improving at-risk students' science achievement, scaffolding students' scientific understanding, and strengthening students' motivation to pursue STEM-related careers.

  20. Incorporating a Human-Computer Interaction Course into Software Development Curriculums

    Science.gov (United States)

    Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph

    2015-01-01

    Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…

  1. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  2. Calculation of Rydberg interaction potentials

    DEFF Research Database (Denmark)

    Weber, Sebastian; Tresp, Christoph; Menke, Henri

    2017-01-01

    for calculating the required electric multipole moments and the inclusion of electromagnetic fields with arbitrary direction. We focus specifically on symmetry arguments and selection rules, which greatly reduce the size of the Hamiltonian matrix, enabling the direct diagonalization of the Hamiltonian up...... to higher multipole orders on a desktop computer. Finally, we present example calculations showing the relevance of the full interaction calculation to current experiments. Our software for calculating Rydberg potentials including all features discussed in this tutorial is available as open source....

  3. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  4. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for automation of a small scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of that development is to increase experiment automation easiness, thus significantly reducing experimental setup automation efforts. In particular, minimal programming skills are required and supervisors have no reviewing troubles. Interactions between scripts and equipment are managed automatically, thus allowing to run multiple scripts simultaneously. Unlike well-known data acquisition commercial software systems, the control is performed by an imperative scripting language. This approach eases complex control and data acquisition algorithms implementation. A modular interface library performs interaction with external interfaces. While most widely used interfaces are already implemented, a simple framework is developed for fast implementations of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of a helium-3 cryostat control and data acquisition. The software is open source and distributed under Gnu Public License.

  5. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for automation of a small scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of that development is to increase experiment automation easiness, thus significantly reducing experimental setup automation efforts. In particular, minimal programming skills are required and supervisors have no reviewing troubles. Interactions between scripts and equipment are managed automatically, thus allowing to run multiple scripts simultaneously. Unlike well-known data acquisition commercial software systems, the control is performed by an imperative scripting language. This approach eases complex control and data acquisition algorithms implementation. A modular interface library performs interaction with external interfaces. While most widely used interfaces are already implemented, a simple framework is developed for fast implementations of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of a helium-3 cryostat control and data acquisition. The software is open source and distributed under Gnu Public License.

  6. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrate way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recent published or available GEOtop applications (Cordano and Rigon, 2013, WRR, Kollet et al, 2016, WRR) provides the baseline code and a certain number of referenced results as benchmark. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of any single change the package is published on its own github repository geotopmodel.github.io/geotop/ under GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on master and main development branches. The usage of CMake configuration tool

  7. Non-intrusive Instance Level Software Composition

    NARCIS (Netherlands)

    Hatun, Kardelen

    2014-01-01

    A software system is comprised of parts, which interact through shared interfaces. Certain qualities of integration, such as loose-coupling, requiring minimal changes to the software and fine-grained localisation of dependencies, have impact on the overall software quality. Current general-purpose

  8. Development and Evaluation of Multimedia Software for the Communicative Arabic Implementation of the the J-QAF Programme in Primary Schools

    Directory of Open Access Journals (Sweden)

    Maimun Aqsha Lubis

    2013-12-01

    Multimedia can be used to overcome weaknesses in the process of teaching and learning. However, multimedia elements embedded in the interactive environment of software sometimes fail to make a presentation interesting and motivating for pupils. This study aimed at developing and evaluating interactive multimedia software that can serve as a tutorial for the Communicative Arabic Implementation Expansion Module of the j-QAF programme. The multimedia software was developed based on the Year One j-QAF curriculum issued by the Ministry of Education, Malaysia. A further objective of this research was to strengthen the emphasis on the use of teaching aids. The software was also evaluated for its effectiveness for the Communicative Arabic Implementation Expansion Module on the basis of usability and suitability for Year One j-QAF pupils in national primary schools. In addition, the software was designed to help teachers teach and pupils learn effectively in the classroom and to support self-learning outside the classroom. Finally, the software should prove useful to teachers as an alternative teaching aid, since it combines various media and considerable effort is required to give new technology a meaningful and beneficial place in education.

  9. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
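
    The core automated step described above (assigning detected peaks to catalogued transition frequencies) can be sketched in a few lines. The code below is not SPECdata itself: the peak finder is deliberately naive, the tolerance is arbitrary, and the tiny catalogue stands in for Pickett .cat files or Splatalogue query results.

```python
# Minimal sketch of automated line assignment: match detected spectral peaks
# against catalogued transition frequencies within a frequency tolerance.
import numpy as np

catalog = {                       # molecule -> transition frequencies (MHz), illustrative values
    "HC3N": [9098.3, 18196.3, 27294.3],
    "OCS":  [12162.9, 24325.9],
}

def find_peaks(freq, intensity, threshold):
    """Indices of simple local maxima above an intensity threshold."""
    idx = np.where((intensity[1:-1] > intensity[:-2]) &
                   (intensity[1:-1] > intensity[2:]) &
                   (intensity[1:-1] > threshold))[0] + 1
    return freq[idx]

def assign(peaks, catalog, tol_mhz=0.5):
    """Return (peak, molecule, catalogue frequency) for peaks within tolerance."""
    assignments = []
    for p in peaks:
        for mol, lines in catalog.items():
            for line in lines:
                if abs(p - line) <= tol_mhz:
                    assignments.append((p, mol, line))
    return assignments

# Toy spectrum: flat noisy baseline with two injected Gaussian lines.
freq = np.linspace(9000, 13000, 40001)
spec = np.random.default_rng(1).normal(0, 0.01, freq.size)
for f0 in (9098.3, 12162.9):
    spec += np.exp(-0.5 * ((freq - f0) / 0.2) ** 2)

print(assign(find_peaks(freq, spec, 0.5), catalog))
```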

  10. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--a high-quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  11. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems to control a nuclear installation demands a high level of operational safety, both for the operation of the installation and for the protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) will have tools that permit checks to be carried out throughout the life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and can easily be ported to other computers [fr]

  12. Software-type Wave-Particle Interaction Analyzer on board the Arase satellite

    Science.gov (United States)

    Katoh, Yuto; Kojima, Hirotsugu; Hikishima, Mitsuru; Takashima, Takeshi; Asamura, Kazushi; Miyoshi, Yoshizumi; Kasahara, Yoshiya; Kasahara, Satoshi; Mitani, Takefumi; Higashio, Nana; Matsuoka, Ayako; Ozaki, Mitsunori; Yagitani, Satoshi; Yokota, Shoichiro; Matsuda, Shoya; Kitahara, Masahiro; Shinohara, Iku

    2018-01-01

    We describe the principles of the Wave-Particle Interaction Analyzer (WPIA) and the implementation of the Software-type WPIA (S-WPIA) on the Arase satellite. The WPIA is a new type of instrument for the direct and quantitative measurement of wave-particle interactions. The S-WPIA is installed on the Arase satellite as a software function running on the mission data processor. The S-WPIA on board the Arase satellite uses an electromagnetic field waveform that is measured by the waveform capture receiver of the plasma wave experiment (PWE), and the velocity vectors of electrons detected by the medium-energy particle experiment-electron analyzer (MEP-e), the high-energy electron experiment (HEP), and the extremely high-energy electron experiment (XEP). The prime objective of the S-WPIA is to measure the energy exchange between whistler-mode chorus emissions and energetic electrons in the inner magnetosphere. It is essential for the S-WPIA to synchronize instruments to a relative time accuracy better than the time period of the plasma wave oscillations. Since the typical frequency of chorus emissions in the inner magnetosphere is a few kHz, a relative time accuracy of better than 10 μs is required in order to measure the relative phase angle between the wave and velocity vectors. In the Arase satellite, a dedicated system has been developed to realize the time resolution required for inter-instrument communication. Here, both the time index distributed over all instruments through the satellite system and an S-WPIA clock signal are used, that are distributed from the PWE to the MEP-e, HEP, and XEP through a direct line, for the synchronization of instruments within a relative time accuracy of a few μs. We also estimate the number of particles required to obtain statistically significant results with the S-WPIA and the expected accumulation time by referring to the specifications of the MEP-e and assuming a count rate for each detector.
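
    The quantity the S-WPIA accumulates is, in essence, the inner product of the wave electric field and the electron velocity summed over detected particles, compared against its statistical spread. The sketch below reproduces that accumulation on synthetic data; the field amplitude, velocities, phase correlation and particle count are invented, and the real instrument works on measured PWE waveforms and MEP-e/HEP/XEP velocity vectors.

```python
# Sketch of the WPIA accumulation principle: sum the per-particle quantity
# q E(t_i) . v_i over detected electrons to estimate the net energy exchange
# between the wave and the particles, and compare it with its statistical spread.
import numpy as np

Q_E = -1.602e-19                 # electron charge [C]
rng = np.random.default_rng(2)

N = 100_000
t = np.sort(rng.uniform(0.0, 1.0, N))             # detection times [s]
f_wave = 2000.0                                    # chorus-like wave frequency [Hz]

# Wave electric field (two transverse components) at each detection time, V/m.
E = 1e-3 * np.stack([np.cos(2 * np.pi * f_wave * t),
                     np.sin(2 * np.pi * f_wave * t)], axis=1)

# Electron velocity components, given a slight phase correlation with the wave
# so that the accumulated value is non-zero (i.e. a net energy transfer exists).
v_perp = 1e7
phase = 2 * np.pi * f_wave * t + rng.normal(0.0, 1.0, N)
v = v_perp * np.stack([np.cos(phase), np.sin(phase)], axis=1)

w_i = Q_E * np.einsum("ij,ij->i", E, v)            # per-particle q E . v (energy transfer rate)
W_int = w_i.sum()
sigma = w_i.std() * np.sqrt(N)                     # statistical uncertainty of the sum

print(f"W_int = {W_int:.3e} +/- {sigma:.3e} (sign convention arbitrary here)")
print("significant" if abs(W_int) > 3 * sigma else "not significant at 3 sigma")
```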

  13. CLIMLAB: a Python-based software toolkit for interactive, process-oriented climate modeling

    Science.gov (United States)

    Rose, B. E. J.

    2015-12-01

    Global climate is a complex emergent property of the rich interactions between simpler components of the climate system. We build scientific understanding of this system by breaking it down into component process models (e.g. radiation, large-scale dynamics, boundary layer turbulence), understanding each components, and putting them back together. Hands-on experience and freedom to tinker with climate models (whether simple or complex) is invaluable for building physical understanding. CLIMLAB is an open-ended software engine for interactive, process-oriented climate modeling. With CLIMLAB you can interactively mix and match model components, or combine simpler process models together into a more comprehensive model. It was created primarily to support classroom activities, using hands-on modeling to teach fundamentals of climate science at both undergraduate and graduate levels. CLIMLAB is written in Python and ties in with the rich ecosystem of open-source scientific Python tools for numerics and graphics. The IPython notebook format provides an elegant medium for distributing interactive example code. I will give an overview of the current capabilities of CLIMLAB, the curriculum we have developed thus far, and plans for the future. Using CLIMLAB requires some basic Python coding skills. We consider this an educational asset, as we are targeting upper-level undergraduates and Python is an increasingly important language in STEM fields. However CLIMLAB is well suited to be deployed as a computational back-end for a graphical gaming environment based on earth-system modeling.
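
    The process-oriented composition idea is easy to sketch without the library itself: each process exposes a tendency on a shared state, and a model is just the sum of the tendencies of whatever processes are plugged in. The code below is a generic illustration in that spirit, not the climlab API; class names, parameter values and the zero-dimensional energy balance setup are all assumptions of the example.

```python
# Generic sketch of process-oriented climate-model composition: each process
# computes a temperature tendency on a shared state; the model sums tendencies
# and steps forward, so processes can be mixed and matched freely.
SIGMA = 5.67e-8     # Stefan-Boltzmann constant, W m^-2 K^-4

class AbsorbedShortwave:
    def __init__(self, solar=341.3, albedo=0.3, heat_capacity=4e8):
        self.flux = solar * (1 - albedo)
        self.c = heat_capacity
    def tendency(self, state):
        return self.flux / self.c                      # K s^-1

class OutgoingLongwave:
    def __init__(self, emissivity=0.61, heat_capacity=4e8):
        self.eps, self.c = emissivity, heat_capacity
    def tendency(self, state):
        return -self.eps * SIGMA * state["T"] ** 4 / self.c

class EnergyBalanceModel:
    def __init__(self, processes, T0=288.0):
        self.processes, self.state = processes, {"T": T0}
    def step(self, dt=86400.0):
        dTdt = sum(p.tendency(self.state) for p in self.processes)
        self.state["T"] += dTdt * dt

model = EnergyBalanceModel([AbsorbedShortwave(), OutgoingLongwave()])
for _ in range(365 * 30):       # integrate thirty "years" of daily steps
    model.step()
print(f"near-equilibrium temperature: {model.state['T']:.1f} K")
```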

  14. BMRF-Net: a software tool for identification of protein interaction subnetworks by a bagging Markov random field-based method.

    Science.gov (United States)

    Shi, Xu; Barnes, Robert O; Chen, Li; Shajahan-Haq, Ayesha N; Hilakivi-Clarke, Leena; Clarke, Robert; Wang, Yue; Xuan, Jianhua

    2015-07-15

    Identification of protein interaction subnetworks is an important step to help us understand complex molecular mechanisms in cancer. In this paper, we develop a BMRF-Net package, implemented in Java and C++, to identify protein interaction subnetworks based on a bagging Markov random field (BMRF) framework. By integrating gene expression data and protein-protein interaction data, this software tool can be used to identify biologically meaningful subnetworks. A user friendly graphic user interface is developed as a Cytoscape plugin for the BMRF-Net software to deal with the input/output interface. The detailed structure of the identified networks can be visualized in Cytoscape conveniently. The BMRF-Net package has been applied to breast cancer data to identify significant subnetworks related to breast cancer recurrence. The BMRF-Net package is available at http://sourceforge.net/projects/bmrfcjava/. The package is tested under Ubuntu 12.04 (64-bit), Java 7, glibc 2.15 and Cytoscape 3.1.0. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  15. Space Telescope Pointing Control System software

    Science.gov (United States)

    Dougherty, H.; Rodoni, C.; Rossini, R.; Tompetrini, K.; Nakashima, A.; Bradley, A.

    1982-01-01

    The Space Telescope Pointing Control System software is in the advanced development stage, having been tested on both the airbearing and the static simulator. The overall structure of the software is discussed, along with timing and sizing evaluations. The interaction between the controls analysts and software designer is described.

  16. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability

    Energy Technology Data Exchange (ETDEWEB)

    Rengier, Fabian, E-mail: fabian.rengier@web.de [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany); Häfner, Matthias F. [University Hospital Heidelberg, Department of Radiation Oncology, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Unterhinninghofen, Roland [Karlsruhe Institute of Technology (KIT), Institute for Anthropomatics, Department of Informatics, Adenauerring 2, 76131 Karlsruhe (Germany); Nawrotzki, Ralph; Kirsch, Joachim [University of Heidelberg, Institute of Anatomy and Cell Biology, Im Neuenheimer Feld 307, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [University Hospital Heidelberg, Department of Diagnostic and Interventional Radiology, Im Neuenheimer Feld 110, 69120 Heidelberg (Germany); Giesel, Frederik L. [University of Heidelberg, Institute of Anatomy and Cell Biology, Im Neuenheimer Feld 307, 69120 Heidelberg (Germany); University Hospital Heidelberg, Department of Nuclear Medicine, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany)

    2013-08-15

    Purpose: Integrating interactive three-dimensional post-processing software into undergraduate radiology teaching might be a promising approach to synergistically improve both visual-spatial ability and radiological skills, thereby reducing students’ deficiencies in image interpretation. The purpose of this study was to test our hypothesis that a hands-on radiology course for medical students using interactive three-dimensional image post-processing software improves radiological knowledge, diagnostic skills and visual-spatial ability. Materials and methods: A hands-on radiology course was developed using interactive three-dimensional image post-processing software. The course consisted of seven seminars held on a weekly basis. The 25 participating fourth- and fifth-year medical students learnt to systematically analyse cross-sectional imaging data and correlated the two-dimensional images with three-dimensional reconstructions. They were instructed by experienced radiologists and collegiate tutors. The improvement in radiological knowledge, diagnostic skills and visual-spatial ability was assessed immediately before and after the course by multiple-choice tests comprising 64 questions each. Wilcoxon signed rank test for paired samples was applied. Results: The total number of correctly answered questions improved from 36.9 ± 4.8 to 49.5 ± 5.4 (p < 0.001), which corresponded to a mean improvement of 12.6 (95% confidence interval 9.9–15.3) or 19.8%. Radiological knowledge improved by 36.0% (p < 0.001), diagnostic skills for cross-sectional imaging by 38.7% (p < 0.001), diagnostic skills for other imaging modalities – which were not included in the course – by 14.0% (p = 0.001), and visual-spatial ability by 11.3% (p < 0.001). Conclusion: The integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves radiological reasoning, diagnostic skills and visual-spatial ability, and thereby even diagnostic skills for imaging modalities not included in the course.
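    For readers who want to reproduce this kind of paired pre/post comparison, the analysis boils down to a Wilcoxon signed-rank test; the scores below are placeholders, not the study data:

        from scipy.stats import wilcoxon

        pre  = [35, 40, 32, 38, 41, 36, 33, 39, 37, 34]   # placeholder pre-course scores
        post = [48, 52, 45, 50, 55, 47, 46, 51, 49, 44]   # placeholder post-course scores

        stat, p = wilcoxon(pre, post)                     # paired, non-parametric test
        print(f"Wilcoxon statistic = {stat}, p = {p:.4f}")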

  17. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability

    International Nuclear Information System (INIS)

    Rengier, Fabian; Häfner, Matthias F.; Unterhinninghofen, Roland; Nawrotzki, Ralph; Kirsch, Joachim; Kauczor, Hans-Ulrich; Giesel, Frederik L.

    2013-01-01

    Purpose: Integrating interactive three-dimensional post-processing software into undergraduate radiology teaching might be a promising approach to synergistically improve both visual-spatial ability and radiological skills, thereby reducing students’ deficiencies in image interpretation. The purpose of this study was to test our hypothesis that a hands-on radiology course for medical students using interactive three-dimensional image post-processing software improves radiological knowledge, diagnostic skills and visual-spatial ability. Materials and methods: A hands-on radiology course was developed using interactive three-dimensional image post-processing software. The course consisted of seven seminars held on a weekly basis. The 25 participating fourth- and fifth-year medical students learnt to systematically analyse cross-sectional imaging data and correlated the two-dimensional images with three-dimensional reconstructions. They were instructed by experienced radiologists and collegiate tutors. The improvement in radiological knowledge, diagnostic skills and visual-spatial ability was assessed immediately before and after the course by multiple-choice tests comprising 64 questions each. Wilcoxon signed rank test for paired samples was applied. Results: The total number of correctly answered questions improved from 36.9 ± 4.8 to 49.5 ± 5.4 (p < 0.001), which corresponded to a mean improvement of 12.6 (95% confidence interval 9.9–15.3) or 19.8%. Radiological knowledge improved by 36.0% (p < 0.001), diagnostic skills for cross-sectional imaging by 38.7% (p < 0.001), diagnostic skills for other imaging modalities – which were not included in the course – by 14.0% (p = 0.001), and visual-spatial ability by 11.3% (p < 0.001). Conclusion: The integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves radiological reasoning, diagnostic skills and visual-spatial ability, and thereby even diagnostic skills for imaging modalities not included in the course.

  18. Development of a specific geological mapping software under MAPGIS

    International Nuclear Information System (INIS)

    Zhang Wenkai

    2010-01-01

    The most commonly used mapping software in geological exploration is the MAPGIS system, and the related standards are based on it. The software offers flexible functions but has several shortcomings: many parameters to select, a steep learning curve, different users choosing different parameters, and low efficiency. A dedicated geological mapping application was therefore developed with VC++ on the MAPGIS platform. Following the standards, toolbars were built for strata, rock, geographic information, materials, etc. Pressing a toolbar button selects the corresponding parameters, the toolbar menus can be adapted to the parameters of each working area, and legends can be sorted automatically. The drafting speed is thus greatly improved and the parameters are kept consistent. The software can convert between Gauss coordinates and longitude-latitude coordinates, and can draw points, map frames by longitude and latitude, the responsibility table, plan views and profiles, etc. The software also improves the clipping, topology-building and node-snapping methods. Application of the software shows that it greatly speeds up geological mapping and improves the standardization of the final maps. (authors)
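    The Gauss/longitude-latitude conversion mentioned above can be illustrated outside MAPGIS with pyproj; the 3-degree Gauss-Krüger style zone and central meridian below are assumptions chosen only for the example:

        from pyproj import Transformer

        # Transverse-Mercator (Gauss-Krueger style) definition; parameters are illustrative.
        gauss_kruger = ("+proj=tmerc +lat_0=0 +lon_0=117 +k=1 "
                        "+x_0=500000 +y_0=0 +ellps=GRS80 +units=m")
        to_gk   = Transformer.from_crs("EPSG:4326", gauss_kruger, always_xy=True)
        to_geog = Transformer.from_crs(gauss_kruger, "EPSG:4326", always_xy=True)

        x, y = to_gk.transform(116.5, 39.9)   # longitude, latitude -> easting, northing
        lon, lat = to_geog.transform(x, y)    # and back
        print(x, y, lon, lat)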

  19. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Towards the development of a complete and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style and the actor model of computation. The result is a new resources-based framework which, after the first cases of use, appears useful and worthy of further research.

  20. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  1. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into Git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...

  2. Evaluation of drug interaction microcomputer software: Dambro's Drug Interactions.

    Science.gov (United States)

    Poirier, T I; Giudici, R A

    1990-01-01

    Dambro's Drug Interactions was evaluated using general and specific criteria. The installation process, ease of learning and use were rated excellent. The user documentation and quality of the technical support were good. The scope of coverage, clinical documentation, frequency of updates, and overall clinical performance were fair. The primary advantages of the program are the quick searching and detection of drug interactions, and the attempt to provide useful interaction data, i.e., significance and reference. The disadvantages are the lack of current drug interaction information, outdated references, lack of evaluative drug interaction information, and the inability to save or print patient profiles. The program is not a good value for the pharmacist but has limited use as a quick screening tool.

  3. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). Incorporates more than 1,000 engaging problems with answers; includes more than 300 solved examples; uses varied problem-solving methods.

  4. State-of-the-art Hydrology Education: Development of Windows-based and Web-based Interactive Teaching-Learning Software

    Science.gov (United States)

    Chu, X.

    2011-12-01

    This study, funded by the NSF CAREER program, focuses on developing new methods to quantify microtopography-controlled overland flow processes and integrating the cutting-edge hydrologic research with all-level education and outreach activities. To achieve the educational goal, an interactive teaching-learning software package has been developed. This software, with enhanced visualization capabilities, integrates the new modeling techniques, computer-guided learning processes, and education-oriented tools in a user-friendly interface. Both Windows-based and web-based versions have been developed. The software is specially designed for three major user levels: elementary level (Level 1: K-12 and outreach education), medium level (Level 2: undergraduate education), and advanced level (Level 3: graduate education). Depending on the levels, users are guided to different educational systems. Each system consists of a series of mini "libraries" featured with movies, pictures, and documentation that cover fundamental theories, varying scale experiments, and computer modeling of overland flow generation, surface runoff, and infiltration processes. Testing and practical use of this educational software in undergraduate and graduate teaching demonstrate its effectiveness to promote students' learning and interest in hydrologic sciences. This educational software also has been used as a hydrologic demonstration tool for K-12 students and Native American students through the Nurturing American Tribal Undergraduate Research Education (NATURE) program and Science, Technology, Engineering and Mathematics (STEM) outreach activities.
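    The infiltration process featured in the software can be sketched with the textbook Green-Ampt model (a generic illustration with made-up parameters, not the package's own code); cumulative infiltration F satisfies F = K t + psi*dtheta*ln(1 + F/(psi*dtheta)) and is solved here by fixed-point iteration:

        import math

        def green_ampt_F(t_hr, K=1.0, psi=11.0, d_theta=0.3, tol=1e-8):
            """Cumulative infiltration F [cm] after t_hr hours (illustrative parameters).
            K: hydraulic conductivity [cm/h], psi: wetting-front suction head [cm],
            d_theta: soil moisture deficit [-]."""
            F = K * t_hr                      # initial guess
            for _ in range(100):
                F_new = K * t_hr + psi * d_theta * math.log(1 + F / (psi * d_theta))
                if abs(F_new - F) < tol:
                    break
                F = F_new
            return F

        for t in (0.5, 1.0, 2.0):
            F = green_ampt_F(t)
            f = 1.0 * (1 + 11.0 * 0.3 / F)    # infiltration rate f = K(1 + psi*d_theta/F)
            print(f"t = {t} h   F = {F:.2f} cm   f = {f:.2f} cm/h")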

  5. Development of interactive software for fuel management analysis

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency

  6. Conceptualizing a Genomics Software Institute (GSI).

    Science.gov (United States)

    Gilbert, Jack A; Catlett, Charlie; Desai, Narayan; Knight, Rob; White, Owen; Robbins, Robert; Sankaran, Rajesh; Sansone, Susanna-Assunta; Field, Dawn; Meyer, Folker

    2012-03-19

    Microbial ecology has been enhanced greatly by the ongoing 'omics revolution, bringing half the world's biomass and most of its biodiversity into analytical view for the first time; indeed, it feels almost like the invention of the microscope and the discovery of the new world at the same time. With major microbial ecology research efforts accumulating prodigious quantities of sequence, protein, and metabolite data, we are now poised to address environmental microbial research at macro scales, and to begin to characterize and understand the dimensions of microbial biodiversity on the planet. What is currently impeding progress is the need for a framework within which the research community can develop, exchange and discuss predictive ecosystem models that describe the biodiversity and functional interactions. Such a framework must encompass data and metadata transparency and interoperation; data and results validation, curation, and search; application programming interfaces for modeling and analysis tools; and human and technical processes and services necessary to ensure broad adoption. Here we discuss the need for focused community interaction to augment and deepen established community efforts, beginning with the Genomic Standards Consortium (GSC), to create a science-driven strategic plan for a Genomic Software Institute (GSI).

  7. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    Science.gov (United States)

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 AACR . ©2017 American Association for Cancer Research.

  8. Defining decision making strategies in software ecosystem governance

    DEFF Research Database (Denmark)

    Manikas, Konstantinos; Wnuk, Krzysztof; Shollo, Arisa

    Making the right decisions is an essential part of software ecosystem governance. Decisions related to the governance of a software ecosystem can influence the health of the ecosystem and can foster its success or greatly contribute to its failure. However, very few studies touch upon the decision making of software ecosystem governance. In this paper, we propose decomposing software ecosystem governance into three activities: input or data collection, decision making, and applying actions. We focus on the decision making activity of software ecosystem governance and review related literature consisting of software ecosystem governance, organizational decision making, and IT governance. Based on the identified studies, we propose a framework for defining the decision making strategies in the governance of software ecosystems. We identify five decision areas...

  9. Interactive multicentre teleconferences using open source software in a team of thoracic surgeons.

    Science.gov (United States)

    Ito, Kazuhiro; Shimada, Junichi; Katoh, Daishiro; Nishimura, Motohiro; Yanada, Masashi; Okada, Satoru; Ishihara, Shunta; Ichise, Kaori

    2012-12-01

    Real-time consultation between a team of thoracic surgeons is important for the management of difficult cases. We established a system for interactive teleconsultation between multiple sites, based on open-source software. The graphical desktop-sharing system VNC (virtual network computing) was used for remotely controlling another computer. An image-processing package (OsiriX) was installed on the server to share the medical images. We set up a voice communication system using Voice Chatter, a free, cross-platform voice communication application. Four hospitals participated in the trials. One was connected by gigabit ethernet, one by WiMAX and one by ADSL. Surgeons at three of the sites found that it was comfortable to view images and consult with each other using the teleconferencing system. However, it was not comfortable using the client that connected via WiMAX, because of dropped frames. Apart from the WiMAX connection, the VNC-based screen-sharing system transferred the clinical images efficiently and in real time. We found the screen-sharing software VNC to be a good application for medical image interpretation, especially for a team of thoracic surgeons using multislice CT scans.

  10. Software Technology for E-Commerce Era

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The rapid growth of Internet usage and electronic commerce (e-commerce) applications will push traditional industries to transform their business models and to re-engineer their information systems. This direction will give the software industry either great opportunities for its business growth or crucial challenges to its existence. This article describes two essential challenges the software industry will face and presents relevant new technologies that will be helpful for overcoming those challenges.

  11. Incremental Interactive Verification of the Correctness of Object-Oriented Software

    DEFF Research Database (Denmark)

    Mehnert, Hannes

    Development of correct object-oriented software is difficult, in particular if a formalised proof of its correctness is demanded. A lot of current software is developed using the object-oriented programming paradigm. This paradigm compensated for safety and security issues with imperative … structure. For efficiency, our implementation uses copy-on-write and shared mutable data, not observable by a client. I further use this data structure to verify the correctness of a solution to the point location problem. The results demonstrate that I am able to verify the correctness of object-oriented … programming, such as manual memory management. Popularly used integrated development environments (IDEs) provide features such as debugging and unit testing to facilitate development of robust software, but hardly any development environment supports the development of provably correct software. A tight...

  12. Biana: a software framework for compiling biological interactions and analyzing networks.

    Science.gov (United States)

    Garcia-Garcia, Javier; Guney, Emre; Aragues, Ramon; Planas-Iglesias, Joan; Oliva, Baldo

    2010-01-27

    The analysis and usage of biological data is hindered by the spread of information across multiple repositories and the difficulties posed by different nomenclature systems and storage formats. In particular, there is an important need for data unification in the study and use of protein-protein interactions. Without good integration strategies, it is difficult to analyze the whole set of available data and its properties. We introduce BIANA (Biologic Interactions and Network Analysis), a tool for biological information integration and network management. BIANA is a Python framework designed to achieve two major goals: i) the integration of multiple sources of biological information, including biological entities and their relationships, and ii) the management of biological information as a network where entities are nodes and relationships are edges. Moreover, BIANA uses properties of proteins and genes to infer latent biomolecular relationships by transferring edges to entities sharing similar properties. BIANA is also provided as a plugin for Cytoscape, which allows users to visualize and interactively manage the data. A web interface to BIANA providing basic functionalities is also available. The software can be downloaded under GNU GPL license from http://sbi.imim.es/web/BIANA.php. BIANA's approach to data unification solves many of the nomenclature issues common to systems dealing with biological data. BIANA can easily be extended to handle new specific data repositories and new specific data types. The unification protocol allows BIANA to be a flexible tool suitable for different user requirements: non-expert users can use a suggested unification protocol while expert users can define their own specific unification rules.

  13. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements were developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process

  14. Interactive scalable condensation of reverse engineered UML class diagrams for software comprehension

    NARCIS (Netherlands)

    Osman, Mohd Hafeez Bin

    2015-01-01

    Software design documentation is a valuable aid in software comprehension. However, keeping the software design up-to-date with evolving source code is challenging and time-consuming. Reverse engineering is one of the options for recovering software architecture from the implementation code.

  15. Music Software and Emerging Technology.

    Science.gov (United States)

    Peters, G. David

    1992-01-01

    Traces the history of instructional computing in music education. Describes the development of music software and hardware. Discusses potential benefits of using the newly developed software in the classroom. Suggests that educators and musicians interact with the publishing community to help define their needs in music education. (DK)

  16. Implementation Of Carlson Survey Software2009 In Survey Works And Comparison With CDS Software

    Directory of Open Access Journals (Sweden)

    Mohamed Faraj EL Megrahi

    2017-02-01

    The automation of surveying is one of the most influential changes the surveying concept and profession have had to go through. It has taken effect in two major areas: hardware (the instrumentation used in data collection and presentation) and software (the applications used in data processing and manipulation). Automation is largely computer based and, like all such systems, is subject to frequent improvement; this is manifested in new instrumentation models every few years, such as total stations, and in newer versions of software. One piece of software with the potential to strongly affect survey automation is Carlson Surveying Software. When coupled with a total station for data collection and processing, it is capable of greatly improving productivity while reducing the time and cost required in the long run. However, it is only natural for users to want competent software and to be able to choose from what is available on the market based on guided research and credible information from previous studies. Such studies not only help in the choice of software but are also useful for testing approaches and recommending improvements to the manufacturers, based on advantages and disadvantages, to help advance the software industry for better and more comfortable use. The expected outcome of the research is a successful implementation of Carlson Survey 2009 software in survey work and a comparison with other existing software such as Civil Design Software (CDS), highlighting its advantages and disadvantages.

  17. Improving the Agency's Software Acquisition Capability

    Science.gov (United States)

    Hankinson, Allen

    2003-01-01

    External development of software has often led to unsatisfactory results and great frustration for the assurance community. Contracts frequently omit critical assurance processes or the right to oversee software development activities. At a time when NASA depends more and more on software to implement critical system functions, a combination of three factors exacerbates this problem: 1) the ever-increasing trend to acquire rather than develop software in-house, 2) the trend toward performance-based contracts, and 3) acquisition vehicles that only state software requirements while leaving development standards and assurance methodologies up to the contractor. We propose to identify specific methods and tools that NASA projects can use to mitigate the adverse effects of the three problems. Two broad classes of methods and tools will be explored. The first will be those that provide NASA projects with insight and oversight into contractors' activities. The second will be those that help projects objectively assess, and thus improve, their software acquisition capability. Of particular interest is the Software Engineering Institute's (SEI) Software Acquisition Capability Maturity Model (SA-CMM).

  18. Integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves diagnostic skills and visual-spatial ability.

    Science.gov (United States)

    Rengier, Fabian; Häfner, Matthias F; Unterhinninghofen, Roland; Nawrotzki, Ralph; Kirsch, Joachim; Kauczor, Hans-Ulrich; Giesel, Frederik L

    2013-08-01

    Integrating interactive three-dimensional post-processing software into undergraduate radiology teaching might be a promising approach to synergistically improve both visual-spatial ability and radiological skills, thereby reducing students' deficiencies in image interpretation. The purpose of this study was to test our hypothesis that a hands-on radiology course for medical students using interactive three-dimensional image post-processing software improves radiological knowledge, diagnostic skills and visual-spatial ability. A hands-on radiology course was developed using interactive three-dimensional image post-processing software. The course consisted of seven seminars held on a weekly basis. The 25 participating fourth- and fifth-year medical students learnt to systematically analyse cross-sectional imaging data and correlated the two-dimensional images with three-dimensional reconstructions. They were instructed by experienced radiologists and collegiate tutors. The improvement in radiological knowledge, diagnostic skills and visual-spatial ability was assessed immediately before and after the course by multiple-choice tests comprising 64 questions each. Wilcoxon signed rank test for paired samples was applied. The total number of correctly answered questions improved from 36.9±4.8 to 49.5±5.4 (p < 0.001), which corresponded to a mean improvement of 12.6 (95% confidence interval 9.9–15.3) or 19.8%. Radiological knowledge improved by 36.0% (p < 0.001), diagnostic skills for cross-sectional imaging by 38.7% (p < 0.001), diagnostic skills for other imaging modalities – which were not included in the course – by 14.0% (p = 0.001), and visual-spatial ability by 11.3% (p < 0.001). The integration of interactive three-dimensional image post-processing software into undergraduate radiology education effectively improves radiological reasoning, diagnostic skills and visual-spatial ability, and thereby even diagnostic skills for imaging modalities not included in the course. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Simulation of pellet-cladding interaction with the Pleiades fuel performance software environment

    International Nuclear Information System (INIS)

    Michel, B.; Nonon, C.; Sercombe, J.; Michel, F.; Marelle, V.

    2013-01-01

    This paper focuses on the PLEIADES fuel performance software environment and its application to the modeling of pellet-cladding interaction (PCI). The PLEIADES platform has been under development for 10 yr; a unified software environment, including the multidimensional finite element solver CAST3M, has been used to develop eight computation schemes now under operation. Among the latter, the ALCYONE application is devoted to pressurized water reactor fuel rod behavior. This application provides a three-dimensional (3-D) model for a detailed analysis of fuel element behavior and enables validation through comparing simulation and post-irradiation examination results (cladding residual diameter and ridges, dishing filling, pellet cracking, etc.). In recent years, the 3-D computation scheme of the ALCYONE application has been enriched with a complete set of physical models to take into account thermomechanical and chemical-physical behavior of the fuel element under irradiation. These models have been validated through the ALCYONE application on a large experimental database composed of approximately 400 study cases. The strong point of the ALCYONE application concerns the local approach of stress-corrosion-cracking rupture under PCI, which can be computed with the 3-D finite element solver. Further developments for PCI modeling in the PLEIADES platform are devoted to a new mesh refinement method for assessing stress-and-strain concentration (multigrid technique) and a new component for assessing fission product chemical recombination. (authors)

  20. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Gaycken, Goetz; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from SVN. We then proceeded with a migration of its code base from SVN to git. As the SVN repository had been structured to manage each package more or less independently there was no simple mapping that could be used to manage the migration into git. Instead a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repository and the policy of onl...

  1. A Reconfigurable Simulation-Based Test System for Automatically Assessing Software Operating Skills

    Science.gov (United States)

    Su, Jun-Ming; Lin, Huan-Yu

    2015-01-01

    In recent years, software operating skills, the ability in computer literacy to solve problems using specific software, has become much more important. A great deal of research has also proven that students' software operating skills can be efficiently improved by practicing customized virtual and simulated examinations. However, constructing…

  2. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts, the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware. In this sense the system support software forms the kernel of the software at TSTA. The kernel software performs several functions. It gathers data from CAMAC modules and makes that data available for subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and provides for a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI). The MMI allows the operators a window into the physical hardware and subsystem process state. Finally the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. Such kernel software as developed and implemented at TSTA is described

  3. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software.

    Directory of Open Access Journals (Sweden)

    Jan Egger

    In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. A MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow to generate an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. The user can then perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D-printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic, and encourages us not to confine ourselves to available commercial software but to look for other options that might improve the workflow.
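    The mirror-smooth-triangulate pipeline described above can also be prototyped with generic scientific Python tools; the sketch below uses placeholder points and scipy rather than the MeVisLab modules themselves:

        import numpy as np
        from scipy.spatial import Delaunay, cKDTree

        pts = np.random.rand(200, 3)                   # placeholder surface points of the defect region
        mirrored = pts * np.array([-1.0, 1.0, 1.0])    # mirror across the mid-sagittal (x = 0) plane

        # One pass of simple Laplacian smoothing: move each point half-way towards
        # the centroid of its 5 nearest neighbours.
        tree = cKDTree(mirrored)
        _, idx = tree.query(mirrored, k=6)             # k=6 includes the point itself
        smoothed = 0.5 * mirrored + 0.5 * mirrored[idx[:, 1:]].mean(axis=1)

        tri = Delaunay(smoothed[:, :2])                # triangulate the (x, y) projection
        print(tri.simplices.shape)                     # triangles of the implant surface mesh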

  4. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software

    Science.gov (United States)

    Egger, Jan; Gall, Markus; Tax, Alois; Ücal, Muammer; Zefferer, Ulrike; Li, Xing; von Campe, Gord; Schäfer, Ute; Schmalstieg, Dieter; Chen, Xiaojun

    2017-01-01

    In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. A MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow to generate an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. The user can then perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D-printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic, and encourages us not to confine ourselves to available commercial software but to look for other options that might improve the workflow. PMID:28264062

  5. Intelligent Data-Driven Reverse Engineering of Software Design Patterns

    OpenAIRE

    Alhusain, Sultan

    2016-01-01

    Recognising implemented instances of Design Patterns (DPs) in software design discloses and recovers a wealth of information about the intention of the original designers and the rationale for their design decisions. Because it is often the case that the documentation available for software systems, if any, is poor and/or obsolete, recovering such information can be of great help and importance for maintenance tasks. However, since DPs are abstractly and vaguely defined, a set of software cla...

  6. Formal synthesis of application and platform behaviors of embedded software systems

    DEFF Research Database (Denmark)

    Kim, Jin Hyun; Kang, Inhye; Choi, Jin-Young

    2015-01-01

    Two main embedded software components, application software and platform software, i.e., the real-time operating system (RTOS), interact with each other in order to achieve the functionality of the system. However, they are so different in behaviors that one behavior modeling language is not sufficient to model both styles of behaviors and to reason about the characteristics of their individual behaviors as well as their parallel behavior and interaction properties. In this paper, we present a formal approach to the synthesis of the application software and the RTOS behavior models...

  7. GREAT: a web portal for Genome Regulatory Architecture Tools.

    Science.gov (United States)

    Bouyioukos, Costas; Bucchini, François; Elati, Mohamed; Képès, François

    2016-07-08

    GREAT (Genome REgulatory Architecture Tools) is a novel web portal for tools designed to generate user-friendly and biologically useful analysis of genome architecture and regulation. The online tools of GREAT are freely accessible and compatible with essentially any operating system which runs a modern browser. GREAT is based on the analysis of genome layout -defined as the respective positioning of co-functional genes- and its relation with chromosome architecture and gene expression. GREAT tools allow users to systematically detect regular patterns along co-functional genomic features in an automatic way consisting of three individual steps and respective interactive visualizations. In addition to the complete analysis of regularities, GREAT tools enable the use of periodicity and position information for improving the prediction of transcription factor binding sites using a multi-view machine learning approach. The outcome of this integrative approach features a multivariate analysis of the interplay between the location of a gene and its regulatory sequence. GREAT results are plotted in web interactive graphs and are available for download either as individual plots, self-contained interactive pages or as machine readable tables for downstream analysis. The GREAT portal can be reached at the following URL https://absynth.issb.genopole.fr/GREAT and each individual GREAT tool is available for downloading. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Applying CASE Tools for On-Board Software Development

    Science.gov (United States)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  9. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple examples of application software. Effective software integration is therefore a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and developing software integration architecture.

  10. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
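    As a toy illustration of the feedback idea (not the actual SEPS equations), the loop below drains a backlog of tasks while a fixed fraction of finished work flows back as rework:

        # Toy system-dynamics loop: errors feed completed work back into the backlog.
        remaining, completed = 1000.0, 0.0      # tasks
        staff, nominal_rate = 10, 1.0           # people, tasks per person-day
        error_fraction = 0.15                   # share of output that needs rework
        day = 0
        while remaining > 1 and day < 2000:
            done = staff * nominal_rate
            rework = done * error_fraction      # feedback path
            remaining += rework - done
            completed += done - rework
            day += 1
        print(f"project finishes after ~{day} simulated days")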

  11. MARS software package status

    International Nuclear Information System (INIS)

    Azhgirej, I.L.; Talanov, V.V.

    2000-01-01

    The MARS software package is intended for simulating nuclear-electromagnetic cascades and the transport of secondary neutrons and muons in heterogeneous media of arbitrary complexity in the presence of magnetic fields. The package implements an inclusive approach to describing particle production in nuclear and electromagnetic interactions and in the decay of unstable particles. The MARS software package has been actively applied to various radiation physics problems.

  12. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  13. SACA: Software Assisted Call Analysis--an interactive tool supporting content exploration, online guidance and quality improvement of counseling dialogues.

    Science.gov (United States)

    Trinkaus, Hans L; Gaisser, Andrea E

    2010-09-01

    Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
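    The keyword-spotting step on the caller's side can be pictured with a few lines of Python; the keyword list here is hypothetical, and the real system works on recognised speech rather than text:

        KEYWORDS = {"chemotherapy", "radiation", "biopsy", "metastasis", "prognosis"}

        def spot_keywords(transcript: str) -> dict:
            """Count occurrences of domain keywords in a caller's word stream."""
            hits = {}
            for token in transcript.lower().replace(",", " ").replace(".", " ").split():
                if token in KEYWORDS:
                    hits[token] = hits.get(token, 0) + 1
            return hits

        print(spot_keywords("My mother starts chemotherapy next week after the biopsy"))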

  14. Software life after in-service

    International Nuclear Information System (INIS)

    Tseng, M.; Eng, P.

    1993-01-01

    Software engineers and designers tend to conclude a software project at the in-service milestone of the software life cycle. But the reality is that the 'life after in-service' is significantly longer than other phases of the life cycle, typically 20 years or more depending on the maintainability of the hardware platform and the designed life of the plant. During this period, the software asset (as with other physical assets in the plant) continues to be upgraded to correct deficiencies, meet new requirements, cope with obsolescence of equipment and so on. The software life cycle ends with a migration of the software to a different platform. It is typical in a software development project to put a great deal of emphasis on design methodologies, techniques, tools, development environment, standard procedures, and project management to ensure a quality product is delivered on schedule and within budget. More often than not, disproportionately little emphasis is placed on the issues and needs of the in-service phase. Once the software is in-service, the designers move on to other projects, while the maintenance and support staff must manage the software. This paper examines the issues in three steps. First, it presents a view of software from maintenance and support staff perspectives, including complexity of software, suitability of documentation, configuration management, training, difficulties and risks associated with making changes, and required skills and knowledge. Second, it identifies the concerns raised from these viewpoints, including costs of maintaining the software, ability to meet additional requirements, availability of support tools, length of time required to engineer and install changes, and a strategy for the migration of the software asset. Finally, it discusses some approaches to deal with the concerns. (Author) 5 refs., fig

  15. COSINE software development based on code generation technology

    International Nuclear Information System (INIS)

    Ren Hao; Mo Wentao; Liu Shuo; Zhao Guang

    2013-01-01

    Code generation technology can significantly improve the quality and productivity of software development and reduce software development risk. At present, code generators are usually based on UML model-driven technology, which cannot satisfy the development demands of nuclear power calculation software. The features of scientific computing programs were analyzed, and a FORTRAN code generator (FCG) based on C# was developed in this work. FCG can automatically generate FORTRAN code for module variable definitions according to input metadata. FCG can also generate memory allocation interfaces for dynamic variables as well as data access interfaces. FCG was applied to the development of the core and system integrated engine for design and analysis (COSINE) software. The result shows that FCG can greatly improve the development efficiency of nuclear power calculation software and reduce the defect rate of software development. (authors)
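    The idea of generating FORTRAN declarations from variable metadata can be sketched in a few lines of Python (the real FCG is written in C#; the names and metadata format below are made up for illustration):

        # Illustrative generator: metadata tuples -> Fortran module source text.
        FTYPES = {"int": "integer", "real8": "real(8)", "logical": "logical"}

        def gen_module(name, variables):
            """variables: list of (name, type key, optional dimension spec) tuples."""
            lines = [f"module {name}", "  implicit none"]
            for var, ftype, dim in variables:
                attr = f", dimension({dim}), allocatable" if dim else ""
                lines.append(f"  {FTYPES[ftype]}{attr} :: {var}")
            lines.append(f"end module {name}")
            return "\n".join(lines)

        print(gen_module("core_state", [("power", "real8", ":,:"), ("n_nodes", "int", None)]))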

  16. Nothing Great Is Easy

    OpenAIRE

    Stansbie, Lisa

    2014-01-01

    A solo exhibition of 13 pieces of art work. Nothing Great is Easy is an exhibition of sculpture, film, drawing and photography that proposes reconstructed narratives using the sport of swimming and in particular the collective interaction and identity of the channel swimmer. The work utilises the processes, rituals/rules, language and the apparatus of sport. “Nothing great is easy” are the words on the memorial to Captain Matthew Webb, who was the first man to swim the English ch...

  17. Zoneminder as ‘Software as a Service’ and Load Balancing of Video Surveillance Requests

    DEFF Research Database (Denmark)

    Deshmukh, Aaradhana A.; Mihovska, Albena D.; Prasad, Ramjee

    2012-01-01

    Cloud computing is evolving as a key computing platform for sharing resources that include infrastructures, software, applications, and business processes. Virtualization is a core technology for enabling cloud resource sharing. Software as a Service (SaaS) on the cloud platform provides software application vendors a Web-based delivery model to serve a large number of clients with a multi-tenancy-based infrastructure and application-sharing architecture, so as to benefit greatly from economies of scale. The emergence of the Software-as-a-Service (SaaS) business model has attracted great attention from both researchers and practitioners. SaaS vendors deliver on-demand information processing services to users, and thus offer computing utility rather than the standalone software itself. This paper proposes a deployment of an open source video surveillance application named Zoneminder...

  18. A SOFTWARE TO PROMOTE INTERACTIVE TEACHING OF WATER PROPERTIES IN BIOCHEMISTRY CLASSES

    Directory of Open Access Journals (Sweden)

    O.M.M. Lapouble

    2005-07-01

    Full Text Available The improvement and  development of new tools in design and  informatics  helped the  creation  of biochemistry teaching  material. Many molecules, metabolic  pathways, reactions,  and interactions are best  explained  and  understood when  shown  in three  dimensions  and  allowing interactivity.  Water is, usually,  the first topic to be presented  during  basic biochemistry courses.  Importance, properties, ionization,  pH, buffering and  titration curves,  are frequently  presented  subjects,  but  static graphics don´t show to  students the  interactions between water molecules,  interactions with  the  solutes  and buffer titration in a clear way.  In this  work, Flash  software  from Macromedia,  was used to produce the llustrations, animations, and ActionScript programming was used to simulate  the titration of some buffers and correlate  the molecular  concept  to the graphic  charts.With  this  work,  we are trying  to improve  the  quality  of biochemistry teaching  material, and  to show, in a clear way, subjects  that are difficult to explain by static  graphics limitation. This material could be used in regular classes, to be projected  or showed in computers  and could be used by students in self-guided study because it allows optional  visualization of texts.  An assisted navigation tool could suggest to students, a sequence of topics but still allowing the freedom of choice of any available topic.

  19. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and the state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom and that teachers have not yet found reasonable countermeasures. Against this background, introducing computer music software into music learning is a new approach that can not only cultivate students' initiative in music learning but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  20. PiCO QL: A software library for runtime interactive queries on program data

    Science.gov (United States)

    Fragkoulis, Marios; Spinellis, Diomidis; Louridas, Panos

    PiCO QL is an open source C/C++ software whose scientific scope is real-time interactive analysis of in-memory data through SQL queries. It exposes a relational view of a system's or application's data structures, which is queryable through SQL. While the application or system is executing, users can input queries through a web-based interface or issue web service requests. Queries execute on the live data structures through the respective relational views. PiCO QL makes a good candidate for ad-hoc data analysis in applications and for diagnostics in systems settings. Applications of PiCO QL include the Linux kernel, the Valgrind instrumentation framework, a GIS application, a virtual real-time observatory of stellar objects, and a source code analyser.
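
    Since PiCO QL accepts queries through a web-based interface or as web service requests, a client can drive it programmatically. The short Python sketch below illustrates that interaction pattern only; the endpoint URL, port, query parameter and the relational view and column names are assumptions made for this example, not PiCO QL's documented interface.

      # Hypothetical client sketch: sending an SQL query to a running PiCO QL
      # web interface. URL, parameter name and the Process_View schema are
      # assumed for illustration; consult the actual tool for its real API.
      import urllib.parse
      import urllib.request

      query = "SELECT name, state FROM Process_View WHERE state = 'RUNNING';"
      url = "http://localhost:8080/query?" + urllib.parse.urlencode({"q": query})

      with urllib.request.urlopen(url) as response:
          print(response.read().decode("utf-8"))   # result set computed on the live data structures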

  1. PiCO QL: A software library for runtime interactive queries on program data

    Directory of Open Access Journals (Sweden)

    Marios Fragkoulis

    2016-01-01

    Full Text Available PiCO QL is an open source C/C++ software whose scientific scope is real-time interactive analysis of in-memory data through SQL queries. It exposes a relational view of a system's or application's data structures, which is queryable through SQL. While the application or system is executing, users can input queries through a web-based interface or issue web service requests. Queries execute on the live data structures through the respective relational views. PiCO QL makes a good candidate for ad-hoc data analysis in applications and for diagnostics in systems settings. Applications of PiCO QL include the Linux kernel, the Valgrind instrumentation framework, a GIS application, a virtual real-time observatory of stellar objects, and a source code analyser.

  2. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management...... of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but it also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but it also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques into a framework for managing risks in distributed contexts. This framework was developed iteratively......

  3. E language based on MCNP modeling software for autonomous

    International Nuclear Information System (INIS)

    Li Fei; Ge Liangquan; Zhang Qingxian

    2010-01-01

    MCNP (Monte Carlo N-Particle Code) is a computer program that uses the Monte Carlo method to simulate the transport of neutrons, photons and other particles. Because of its powerful simulation capability and its flexible, universal features, it has been widely used in many fields, but operating it demands specialized expertise, which has greatly restricted its use and hindered its later development. In order to make MCNP modeling more autonomous, the E language was used to develop modeling software that addresses users who are not familiar with MCNP and cannot create object models, replaces the dull, red-tape 'notebook'-style input format, and builds a new MCNP modeling system. (authors)

  4. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    Full Text Available This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, tasks scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  5. Quality assurance of EDP software in practical application

    International Nuclear Information System (INIS)

    Winkler, H.

    1982-01-01

    Besides the specific properties of software, it is mainly aspects outside the traditional testing field that matter for its quality assurance. Measures for quality assurance must, in particular, start during development. This presupposes a development process for software that is oriented towards partial results. Because of the high qualitative demands, tools for testing and inspection are of great importance. The problems in software quality assurance are typical of a young technical field whose necessity is undisputed but which, owing to an insufficient scientific foundation, still has to operate on an empirical-pragmatic level. (orig.) [de

  6. A Reusable Software Architecture for Small Satellite AOCS Systems

    DEFF Research Database (Denmark)

    Alminde, Lars; Bendtsen, Jan Dimon; Laursen, Karl Kaas

    2006-01-01

    This paper concerns the software architecture called Sophy, which is an abbreviation for Simulation, Observation, and Planning in HYbrid systems. We present a framework that allows execution of hybrid dynamical systems in an on-line distributed computing environment, which includes interaction...... with both hardware and on-board software. Some of the key issues addressed by the framework are automatic translation of mathematical specifications of hybrid systems into executable software entities, management of execution of coupled models in a parallel distributed environment, as well as interaction...... with external components, hardware and/or software, through generic interfaces. Sophy is primarily intended as a tool for development of model based reusable software for the control and autonomous functions of satellites and/or satellite clusters....

  7. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

    Full Text Available Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients or linear regression models are not able to model sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which 2 genes' expression levels cluster in different locations in space under the control of a third gene's expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data based on an optimization procedure involving the usage of UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html .
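
    The 3-way interaction can be pictured by splitting samples according to the controlling gene and asking whether the other two genes' expression points occupy different regions in each stratum. The Python sketch below is a deliberately simplified illustration of that idea; it is not the xSyn algorithm, which relies on UPGMA clustering and a synergy score, and all data here are synthetic.

      # Simplified illustration of a 3-way interaction (not xSyn's method):
      # split samples by the controlling gene C and measure how far apart the
      # (A, B) expression centroids of the two strata lie.
      import numpy as np

      def stratified_separation(expr_a, expr_b, expr_c):
          """Centroid distance between low-C and high-C strata in the (A, B)
          plane, scaled by the pooled spread; larger values hint at interaction."""
          points = np.column_stack([expr_a, expr_b])
          high = expr_c >= np.median(expr_c)
          centroid_low = points[~high].mean(axis=0)
          centroid_high = points[high].mean(axis=0)
          pooled_spread = points.std(axis=0).mean() + 1e-12
          return np.linalg.norm(centroid_high - centroid_low) / pooled_spread

      rng = np.random.default_rng(0)
      c = rng.normal(size=200)
      a = np.where(c > 0, 3.0, 0.0) + rng.normal(size=200)    # A and B shift together
      b = np.where(c > 0, -3.0, 0.0) + rng.normal(size=200)   # when C is high
      print(f"separation score: {stratified_separation(a, b, c):.2f}")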

  8. Research of real-time communication software

    Science.gov (United States)

    Li, Maotang; Guo, Jingbo; Liu, Yuzhong; Li, Jiahong

    2003-11-01

    Real-time communication plays an increasingly important role in our work, in daily life and in ocean monitoring. With the rapid progress of computer and communication techniques and the miniaturization of communication systems, adaptable and reliable real-time communication software is needed in ocean monitoring systems. This paper addresses research on real-time communication software based on a point-to-point satellite intercommunication system. An object-oriented design method is adopted, and the software can transmit and receive video, audio and engineering data over the satellite channel. Several software modules are developed that realize point-to-point satellite intercommunication in the ocean monitoring system. The real-time communication software has three advantages. First, it increases the reliability of the point-to-point satellite intercommunication system. Second, optional parameters can be configured, which greatly increases the flexibility of system operation. Third, some hardware is replaced by the software, which not only reduces system cost and promotes the miniaturization of the communication system but also increases the system's agility.

  9. Desktop Publishing on the Macintosh: A Software Perspective.

    Science.gov (United States)

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  10. IEFIT - An Interactive Approach to High Temperature Fusion Plasma Magnetic Equilibrium Fitting

    International Nuclear Information System (INIS)

    Peng, Q.; Schachter, J.; Schissel, D.P.; Lao, L.L.

    1999-01-01

    An interactive IDL based wrapper, IEFIT, has been created for the magnetic equilibrium reconstruction code EFIT written in FORTRAN. It allows high temperature fusion physicists to rapidly optimize a plasma equilibrium reconstruction by eliminating the unnecessarily repeated initialization of the conventional approach and by immediately displaying the fitting results of each input variation. It uses a new IDL based graphics package, GaPlotObj, developed in cooperation with Fanning Software Consulting, that provides a unified interface with great flexibility in presenting and analyzing scientific data. The overall interactivity reduces the process from the usual hours to minutes
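
    The speed-up comes from a simple architectural idea: perform the expensive set-up once and repeat only the cheap fitting step for each input variation, showing the result immediately. The Python sketch below illustrates that workflow shape only; it is written in Python rather than IDL, and every class, method and value in it is hypothetical rather than part of IEFIT or EFIT.

      # Workflow sketch of the initialize-once, refit-many idea behind IEFIT.
      # All names and values here are hypothetical placeholders.
      class InteractiveFitter:
          def __init__(self, shot_data):
              # expensive, one-time initialization (grids, response tables, ...)
              self._setup = {"grid": "precomputed", "data": shot_data}

          def refit(self, **varied_inputs):
              # only the cheap fitting step repeats for each user tweak
              return {"inputs": varied_inputs, "chi2": 1.0}   # placeholder result

      fitter = InteractiveFitter(shot_data="shot_123456")
      for pressure_weight in (0.5, 1.0, 2.0):
          print(fitter.refit(pressure_weight=pressure_weight))   # results would be plotted immediately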

  11. Unblockable Compositions of Software Components

    DEFF Research Database (Denmark)

    Dong, Ruzhen; Faber, Johannes; Liu, Zhiming

    2012-01-01

    We present a new automata-based interface model describing the interaction behavior of software components. Contrary to earlier component- or interface-based approaches, the interface model we propose specifies all the non-blockable interaction behaviors of a component with any environment...... composition of interface models preserves unblockable sequences of provided services....

  12. Software ecosystems – a systematic literature review

    DEFF Research Database (Denmark)

    Manikas, Konstantinos; Hansen, Klaus Marius

    2013-01-01

    A software ecosystem is the interaction of a set of actors on top of a common technological platform that results in a number of software solutions or services. Arguably, software ecosystems are gaining importance with the advent of, e.g., the Google Android, Apache, and Salesforce.com ecosystems....... However, there exists no systematic overview of the research done on software ecosystems from a software engineering perspective. We performed a systematic literature review of software ecosystem research, analyzing 90 papers on the subject taken from a gross collection of 420. Our main conclusions...... are that while research on software ecosystems is increasing (a) there is little consensus on what constitutes a software ecosystem, (b) few analytical models of software ecosystems exist, and (c) little research is done in the context of real-world ecosystems. This work provides an overview of the field, while...

  13. Software-defined Quantum Networking Ecosystem

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
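
    The workflow described (build a graph of nodes and links with attributes, then step discrete-event simulators over it from a Python application script) can be sketched in a few lines. The toy below only mirrors that structure; the node names, link attributes and events are invented for illustration and do not use the software's actual library interfaces.

      # Toy sketch of the described workflow: a small network model plus a
      # discrete-event queue. All names here are hypothetical; the real
      # library's interfaces are not shown.
      import heapq

      nodes = {"Alice": {}, "Bob": {}}
      links = {("Alice", "Bob"): {"latency_ns": 500, "kind": "quantum+classical"}}

      events = []   # (time_ns, description) min-heap
      heapq.heappush(events, (0, "Alice emits an entangled pair"))
      heapq.heappush(events, (links[("Alice", "Bob")]["latency_ns"],
                              "Bob receives the qubit; classical sync message follows"))

      while events:
          time_ns, what = heapq.heappop(events)
          print(f"t={time_ns:>4} ns  {what}")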

  14. Ontologies in software engineering: approaching two great knowledge areas

    Directory of Open Access Journals (Sweden)

    Carlos Mario Zapata Jaramillo

    2010-01-01

    Full Text Available Ontology concepts have traditionally been linked to knowledge engineering, and software engineers have rarely applied them to solve problems in their own area. It is necessary for software engineers to adopt ontologies, since they provide a common vocabulary that can contribute to solving recurrent software engineering problems, such as the difficulty of communication between analysts and stakeholders when defining a system's requirements, the low reuse of components, and the scarce automatic generation of code, among others. This paper presents a first link between ontologies and software engineering through the compilation and analysis of the literature on the use of ontologies in the different phases of the software product life cycle.

  15. Using mathematical software to design power electronic converters

    Science.gov (United States)

    Hinov, Nikolay; Hranov, Tsveti

    2017-12-01

    The paper presents mathematical software that was used for the design of power electronic devices. Different examples applied to designing electronic converters are examined. In this way, it is possible to try different combinations of the circuit elements by simple means, thereby optimizing the design according to certain criteria and limitations. Free software with a simple and intuitive interface was selected; no special user training is required to work with it. The use of mathematical software greatly facilitates the design process and makes it attractive and accessible to a wider range of students and specialists in power electronics training.
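
    The "try different combinations of circuit elements" step can be illustrated with a textbook example. The Python sketch below sweeps candidate inductor values for a buck converter against the standard ripple-current relation dI = Vout*(1-D)/(L*fsw); the component values and the 30% ripple limit are illustrative assumptions, not figures from the paper, and the paper itself uses a dedicated mathematical package rather than Python.

      # Sweep candidate inductors for a buck converter using the textbook
      # ripple relation; all numbers are illustrative assumptions.
      V_IN, V_OUT, F_SW, I_OUT = 24.0, 5.0, 100e3, 2.0    # volts, hertz, amperes
      duty = V_OUT / V_IN

      for L_uH in (22, 47, 100):                          # candidate inductors
          L = L_uH * 1e-6
          ripple = V_OUT * (1 - duty) / (L * F_SW)        # inductor current ripple (A)
          verdict = "OK" if ripple <= 0.3 * I_OUT else "too high"
          print(f"L = {L_uH:>3} uH -> dI = {ripple:.2f} A ({verdict})")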

  16. Where Does the Time Go in Software DSMs?--Experiences with JIAJIA

    Institute of Scientific and Technical Information of China (English)

    SHI Weisong; HU Weiwu; TANG Zhimin

    1999-01-01

    The performance gap between software DSM systems and message passing platforms greatly prevents the prevalence of software DSM systems, though great efforts have been delivered in this area in the past decade. In this paper, we take up the challenge of finding where we should focus our efforts in future designs. The components of the total system overhead of software DSM systems are analyzed in detail first. Based on a state-of-the-art software DSM system, JIAJIA, we measure these components on the Dawning parallel system and draw five important conclusions which differ from some traditional viewpoints. (1) The performance of the JIAJIA software DSM system is acceptable. For four of eight applications, the parallel efficiency achieved by JIAJIA is about 80%, while for two others, 70% efficiency can be obtained. (2) 40.94% of interrupt service time is overlapped with waiting time. (3) Encoding and decoding diffs do not cost much time (<1%), so using hardware support to encode/decode diffs and send/receive messages is not worthwhile. (4) Great endeavours should be put into reducing the data miss penalty and optimizing synchronization operations, which occupy 11.75% and 13.65% of total execution time respectively. (5) Communication hardware overhead occupies 66.76% of the whole communication time in the experimental environment, and communication software overhead does not take as much time as expected. Moreover, by studying the effect of CPU speed on system overhead, we find that the common speedup formula for distributed memory systems does not work for software DSM systems. Therefore, we design a new speedup formula specific to software DSM systems, and point out that when the CPU speed increases the speedup can increase too even if the network speed is fixed, which is impossible in message passing systems. Finally, we argue that the JIAJIA system has the desired scalability.

  17. Evaluation of multiple intelligences in children aged 7 to 11 years through the implementation of an interactive software

    OpenAIRE

    Rebolledo Rodríguez, Rigel A.; Samaniego González, Euclides

    2017-01-01

    This document describes a project for evaluating multiple intelligences in children aged 7 to 11 years through the implementation of an interactive software application, developed on Android for tablets, allowing parents, teachers, tutors, psychologists or other responsible adults to identify the different types of intelligence that children have, in order to know them better and develop their skills and future potential. Similarly, research was developed to evaluate the effectiveness of the t...

  18. Usability Evaluation Method for Agile Software Development

    Directory of Open Access Journals (Sweden)

    Saad Masood Butt

    2015-02-01

    Full Text Available Agile methods are the best fit for the tremendously growing software industry due to their flexible and dynamic nature. But does software developed using agile methods meet usability standards? To answer this question, we note that the majority of agile software development projects currently involve interactive user interface designs, which are only possible by following User Centered Design (UCD) within agile methods. The question here is how to integrate UCD with agile models. Both agile models and UCD are iterative in nature, but agile models focus on coding and development of software, whereas UCD focuses on the user interface of the software. Similarly, both have testing features: the agile model involves automatically tested code, while UCD involves an expert or a user testing the user interface. In this paper, a new agile usability model is proposed, and an evaluation of the proposed model is presented by practically implementing it in three real-life projects. Key results from these projects clearly show that the proposed agile model incorporates usability evaluation methods, improves the working relationship between usability experts and agile software experts, and allows agile developers to incorporate results from UCD into subsequent iterations.

  19. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01


  20. Evaluation for nuclear safety-critical software reliability of DCS

    International Nuclear Information System (INIS)

    Liu Ying

    2015-01-01

    With the development of control and information technology at NPPs, software reliability is important because software failure is usually considered one form of common cause failure in Digital I and C Systems (DCS). The reliability analysis of DCS, particularly the qualitative and quantitative evaluation of nuclear safety-critical software reliability, is a great challenge. To address this problem, this paper builds not only a comprehensive evaluation model but also stage evaluation models, and provides prediction and sensitivity analyses for these models. This can serve as a basis for evaluating the reliability and safety of DCS. (author)

  1. Debris Examination Using Ballistic and Radar Integrated Software

    Science.gov (United States)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul; hide

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  2. Radiobiological modelling with MarCell software

    International Nuclear Information System (INIS)

    Hasan, J.S.; Jones, T.D.

    1996-01-01

    Jones introduced a bone marrow radiation cell kinetics model with great potential for application in the fields of health physics, radiation research, and medicine. However, until recently, only the model developers have been able to apply it because of the complex array of biological and physical assignments needed for evaluation of a particular radiation exposure protocol. The purpose of this article is to illustrate the use of MarCell (MARrow CELL Kinetics) software for MS-DOS, a user-friendly computer implementation of that mathematical model that allows almost anyone with an elementary knowledge of radiation physics and/or medical procedures to apply the model. A hands-on demonstration of the software will be given by guiding the user through evaluation of a medical total body irradiation protocol and a nuclear fallout scenario. A brief overview of the software is given in the Appendix

  3. CAX a software for automated spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
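
    The CSV convention mentioned (semicolon field separators with a comma as the decimal mark) is easy to reproduce. The Python sketch below shows only that output convention with made-up example rows; it is an illustration, not the actual DAT2CSV code, which is driven by Genie-2000 analysis results.

      # Minimal sketch of the described CSV convention: ';' as field separator
      # and ',' as decimal indicator. The rows are made-up example values.
      import csv

      rows = [("Cs-137", 661.66, 1523.4), ("Co-60", 1332.49, 987.1)]

      with open("report.csv", "w", newline="", encoding="utf-8") as handle:
          writer = csv.writer(handle, delimiter=";")
          writer.writerow(["nuclide", "energy_keV", "net_area"])
          for nuclide, energy, area in rows:
              writer.writerow([nuclide,
                               f"{energy:.2f}".replace(".", ","),
                               f"{area:.1f}".replace(".", ",")])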

  4. Data processing in Software-type Wave-Particle Interaction Analyzer onboard the Arase satellite

    Science.gov (United States)

    Hikishima, Mitsuru; Kojima, Hirotsugu; Katoh, Yuto; Kasahara, Yoshiya; Kasahara, Satoshi; Mitani, Takefumi; Higashio, Nana; Matsuoka, Ayako; Miyoshi, Yoshizumi; Asamura, Kazushi; Takashima, Takeshi; Yokota, Shoichiro; Kitahara, Masahiro; Matsuda, Shoya

    2018-05-01

    The software-type wave-particle interaction analyzer (S-WPIA) is an instrument package onboard the Arase satellite, which studies the magnetosphere. The S-WPIA represents a new method for directly observing wave-particle interactions onboard a spacecraft in a space plasma environment. The main objective of the S-WPIA is to quantitatively detect wave-particle interactions associated with whistler-mode chorus emissions and electrons over a wide energy range (from several keV to several MeV). The quantity of energy exchanges between waves and particles can be represented as the inner product of the wave electric-field vector and the particle velocity vector. The S-WPIA requires accurate measurement of the phase difference between wave and particle gyration. The leading edge of the S-WPIA system allows us to collect comprehensive information, including the detection time, energy, and incoming direction of individual particles and instantaneous-wave electric and magnetic fields, at a high sampling rate. All the collected particle and waveform data are stored in the onboard large-volume data storage. The S-WPIA executes calculations asynchronously using the collected electric and magnetic wave data, data acquired from multiple particle instruments, and ambient magnetic-field data. The S-WPIA has the role of handling large amounts of raw data that are dedicated to calculations of the S-WPIA. Then, the results are transferred to the ground station. This paper describes the design of the S-WPIA and its calculations in detail, as implemented onboard Arase.
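
    The central quantity, the inner product of the wave electric-field vector and the particle velocity vector, is straightforward to compute once both vectors are known for each detected particle. The Python sketch below evaluates W = qE·v for a couple of illustrative detections; the field and velocity values are invented, and the onboard S-WPIA implementation is of course not written in Python.

      # Sketch of the core S-WPIA quantity: per-particle energy-exchange rate
      # W = q E . v. All field and velocity values below are illustrative.
      import numpy as np

      Q_ELECTRON = -1.602e-19                        # coulombs
      E_wave = np.array([[1e-3, 2e-4, 0.0],          # wave E-field at each detection (V/m)
                         [8e-4, -1e-4, 5e-5]])
      v_particle = np.array([[1.0e7, 2.0e6, 3.0e6],  # particle velocities (m/s)
                             [9.0e6, 1.0e6, 4.0e6]])

      w = Q_ELECTRON * np.einsum("ij,ij->i", E_wave, v_particle)   # watts per particle
      print(w)        # the sign indicates the direction of energy flow
      print(w.sum())  # accumulating over many particles mirrors the onboard sum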

  5. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  6. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    International Nuclear Information System (INIS)

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: •A dynamic hazard analysis method is proposed for the safety-critical software. •The mechanism relies on Colored Petri Net. •Complex interactions between software and hardware are captured properly. •Common failure mode in software are identified effectively. -- Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the lifecycle of software development. The dynamic software hazard modeling and analysis method based on Colored Petri Net is proposed and applied to the safety-critical control software of the nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify the potential common cause failure in software properly and effectively. Moreover, the method can find the dominant software induced hazard to safety control actions, which aids in increasing software quality.

  7. A Padawan Programmer's Guide to Developing Software Libraries.

    Science.gov (United States)

    Yurkovich, James T; Yurkovich, Benjamin J; Dräger, Andreas; Palsson, Bernhard O; King, Zachary A

    2017-11-22

    With the rapid adoption of computational tools in the life sciences, scientists are taking on the challenge of developing their own software libraries and releasing them for public use. This trend is being accelerated by popular technologies and platforms, such as GitHub, Jupyter, R/Shiny, that make it easier to develop scientific software and by open-source licenses that make it easier to release software. But how do you build a software library that people will use? And what characteristics do the best libraries have that make them enduringly popular? Here, we provide a reference guide, based on our own experiences, for developing software libraries along with real-world examples to help provide context for scientists who are learning about these concepts for the first time. While we can only scratch the surface of these topics, we hope that this article will act as a guide for scientists who want to write great software that is built to last. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    Science.gov (United States)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    Primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing from an enterprise internal software development activity is a common means to start a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period of 2000-2003. On the basis of the empirical data, we present a model for evolution of a secondary software business, which makes explicit the industry dynamics. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  9. The Great Plains IDEA Gerontology Program: An Online, Interinstitutional Graduate Degree

    Science.gov (United States)

    Sanders, Gregory F.

    2011-01-01

    The Great-Plains IDEA Gerontology Program is a graduate program developed and implemented by the Great Plains Interactive Distance Education Alliance (Great Plains IDEA). The Great Plains IDEA (Alliance) originated as a consortium of Colleges of Human Sciences ranging across the central United States. This Alliance's accomplishments have included…

  10. Proceedings of the First International Workshop on Automotive Software Architecture (WASA'15, Montreal, Canada, May 4, 2015)

    NARCIS (Netherlands)

    Kruchten, P.; Dajsuren, Y.; Altinger, H.; Staron, M.

    2015-01-01

    It is our great pleasure to welcome you to the First International Workshop on Automotive Software Architectures -- WASA'15. More than a decade ago, the term automotive software engineering was officially introduced in the software community addressing research challenges and technical issues

  11. Introduction to adoption of lean canvas in software test architecture design

    Directory of Open Access Journals (Sweden)

    Padmaraj Nidagundi

    2017-01-01

    Full Text Available The growth of software-dependent businesses, as well as the use of electronic devices in daily life, brings new challenges that require software to work error-free all the time; to achieve this goal, software needs to be sufficiently and effectively tested during the various development phases. Most software development companies make great efforts in testing, yet reaching the error-free software goal remains difficult. Different software development methodologies (e.g. traditional waterfall, agile) brought a new dimension to both development and testing, introducing new technologies and tools. In software test automation, the test architecture design plays a key role in managing written test cases and executing them effectively. A more effective software test automation architecture design in the test process saves resources and effort and reduces technical debt. This paper presents the new dimension and the possibilities of using lean canvas in the design of the software test architecture.

  12. Categorical Regression and Benchmark Dose Software 3.0

    Science.gov (United States)

    The objective of this full-day course is to provide participants with interactive training on the use of the U.S. Environmental Protection Agency’s (EPA) Benchmark Dose software (BMDS, version 3.0, released fall 2018) and Categorical Regression software (CatReg, version 3.1...

  13. A Single Case Design Evaluation of a Software and Tutor Intervention Addressing Emotion Recognition and Social Interaction in Four Boys with ASD

    Science.gov (United States)

    Lacava, Paul G.; Rankin, Ana; Mahlios, Emily; Cook, Katie; Simpson, Richard L.

    2010-01-01

    Many students with Autism Spectrum Disorders (ASD) have delays learning to recognize emotions. Social behavior is also challenging, including initiating interactions, responding to others, developing peer relationships, and so forth. In this single case design study we investigated the relationship between use of computer software ("Mind Reading:…

  14. An interactive end-user software application for a deep-sea photographic database

    Digital Repository Service at National Institute of Oceanography (India)

    Jaisankar, S.; Sharma, R.

    . The software is the first of its kind in deep-sea applications and it also attempts to educate the user about deep-sea photography. The application software is developed by modifying established routines and by creating new routines to save the retrieved...

  15. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  16. Accuracy of Estimation of Graft Size for Living-Related Liver Transplantation: First Results of a Semi-Automated Interactive Software for CT-Volumetry

    Science.gov (United States)

    Mokry, Theresa; Bellemann, Nadine; Müller, Dirk; Lorenzo Bermejo, Justo; Klauß, Miriam; Stampfl, Ulrike; Radeleff, Boris; Schemmer, Peter; Kauczor, Hans-Ulrich; Sommer, Christof-Matthias

    2014-01-01

    Objectives: To evaluate the accuracy of estimated graft size for living-related liver transplantation using a semi-automated interactive software for CT-volumetry. Materials and Methods: Sixteen donors for living-related liver transplantation (11 male; mean age: 38.2±9.6 years) underwent contrast-enhanced CT prior to graft removal. CT-volumetry was performed using a semi-automated interactive software (P), and compared with a manual commercial software (TR). For P, liver volumes were provided either with or without vessels. For TR, liver volumes were always provided with vessels. Intraoperative weight served as reference standard. Major study goals included analyses of volumes using absolute numbers, linear regression analyses and inter-observer agreements. Minor study goals included the description of the software workflow: degree of manual correction, speed for completion, and overall intuitiveness using five-point Likert scales: 1–markedly lower/faster/higher for P compared with TR, 2–slightly lower/faster/higher for P compared with TR, 3–identical for P and TR, 4–slightly lower/faster/higher for TR compared with P, and 5–markedly lower/faster/higher for TR compared with P. Results: Liver segments II/III, II–IV and V–VIII served in 6, 3, and 7 donors as transplanted liver segments. Volumes were 642.9±368.8 ml for TR with vessels, 623.8±349.1 ml for P with vessels, and 605.2±345.8 ml for P without vessels (P... CT-volumetry performed with P can accurately predict graft size for living-related liver transplantation while improving workflow compared with TR. PMID:25330198

  17. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  18. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  19. Next-generation business intelligence software with Silverlight 3

    CERN Document Server

    Czernicki, Bart

    2010-01-01

    Business Intelligence (BI) software is the code and tools that allow you to view different components of a business using a single visual platform, making comprehending mountains of data easier. Applications that include reports, analytics, statistics, and historical and predictive modeling are all examples of BI applications. Currently, we are in the second generation of BI software, called BI 2.0. This generation is focused on writing BI software that is predictive, adaptive, simple, and interactive. As computers and software have evolved, more data can be presented to end users with increas

  20. Analyser Framework to Verify Software Components

    Directory of Open Access Journals (Sweden)

    Rolf Andreas Rasenack

    2009-01-01

    Full Text Available Today it is important for software companies to build software systems within a short time interval, to reduce costs and to secure a good market position. Therefore, well organized and systematic development approaches are required. Reusing software components that are well tested can be a good way to develop software applications effectively. The reuse of software components is less expensive and less time consuming than development from scratch. But it is dangerous to think that software components can be matched together without any problems. Software components themselves are well tested, of course, but problems still occur when they are composed together. Most problems stem from interaction and communication. To avoid such errors, a framework has to be developed for analysing software components; that framework determines the compatibility of the corresponding software components. The promising approach discussed here presents a novel technique for analysing software components by applying an Abstract Syntax Language Tree (ASLT). A supportive environment will be designed that checks the compatibility of black-box software components. This article is concerned with the question of how coupled software components can be verified using an analyser framework, and it determines the usage of the ASLT. Black-box software components and the Abstract Syntax Language Tree are the basis for developing the proposed framework and are discussed here to provide background knowledge. The practical implementation of this framework is discussed, and results are shown using a test environment.

  1. An overview of 3D software visualization.

    Science.gov (United States)

    Teyseyre, Alfredo R; Campo, Marcelo R

    2009-01-01

    Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. During many years, visualization in 2D space has been actively studied, but in the last decade, researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects like: visual representations, interaction issues, evaluation methods and development tools. We also perform a survey of some representative tools to support different tasks, i.e., software maintenance and comprehension, requirements validation and algorithm animation for educational purposes, among others. Finally, we conclude identifying future research directions.

  2. CAX a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)

  3. A new metric method-improved structural holes researches on software networks

    Science.gov (United States)

    Li, Bo; Zhao, Hai; Cai, Wei; Li, Dazhou; Li, Hui

    2013-03-01

    The scale of software systems increases quickly with the rapid development of software technologies. Hence, how to understand, measure, manage and control software structure is a great challenge for software engineering. There is also much research on software network metrics, such as C&K, MOOD and McCabe; the aim of this paper is to propose a new and better method for measuring software networks. The metric of structural holes is first introduced in this paper; it cannot be applied directly because of the modular characteristics of software networks. Hence, structural holes are redefined and improved in this paper, and the calculation process and results are described in detail. The results show that the new method better reflects the bridge role of vertices in a software network, and that there is a significant correlation between degree and the improved structural holes. Finally, a hydropower simulation system is taken as an example to show the validity of the new metric method.
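
    As background to the abstract's comparison between degree and structural holes, the classic (unmodified) structural-holes measure is Burt's constraint, which is available in networkx. The Python sketch below computes it for a tiny invented dependency graph; it shows the standard metric only, not the redefined and improved version proposed in the paper.

      # Burt's classic constraint on a tiny made-up dependency graph, printed
      # next to node degree; lower constraint suggests a stronger bridge role.
      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([("core", "ui"), ("core", "db"), ("core", "net"),
                        ("ui", "db"), ("net", "logger")])

      constraint = nx.constraint(G)
      for node in G.nodes:
          print(f"{node:<7} degree={G.degree[node]}  constraint={constraint[node]:.3f}")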

  4. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered as relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations...... and the software developers' resistance to accepting users’ opinions regarding the lack of usability in their software systems. The ‘cost obstacle’ refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some......

  5. Software for virtual accelerator designing

    International Nuclear Information System (INIS)

    Kulabukhova, N.; Ivanov, A.; Korkhov, V.; Lazarev, A.

    2012-01-01

    The article discusses appropriate technologies for software implementation of the Virtual Accelerator. The Virtual Accelerator is considered as a set of services and tools enabling transparent execution of computational software for modeling beam dynamics in accelerators on distributed computing resources. Distributed storage and information processing facilities utilized by the Virtual Accelerator make use of the Service-Oriented Architecture (SOA) according to a cloud computing paradigm. Control system tool-kits (such as EPICS, TANGO), computing modules (including high-performance computing), realization of the GUI with existing frameworks and visualization of the data are discussed in the paper. The presented research consists of software analysis for realization of interaction between all levels of the Virtual Accelerator and some samples of middle-ware implementation. A set of the servers and clusters at St.-Petersburg State University form the infrastructure of the computing environment for Virtual Accelerator design. Usage of component-oriented technology for realization of Virtual Accelerator levels interaction is proposed. The article concludes with an overview and substantiation of a choice of technologies that will be used for design and implementation of the Virtual Accelerator. (authors)

  6. Tool Use Within NASA Software Quality Assurance

    Science.gov (United States)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  7. Software approach to minimizing problems of student-lecturer ...

    African Journals Online (AJOL)

    Lecturer Interaction in Higher institutions of learning. The Software was developed using PHP and hosted in the University web server, and the interaction between students and their lecturers was compared using both the traditional approaches ...

  8. Manifesto for the Software Development Professionalization

    Directory of Open Access Journals (Sweden)

    Red Latinoamericana en Ingeniería de Software (RedLatinaIS

    2013-12-01

    Full Text Available One of the central problems of current economic, industrial, social and scientific development and competitiveness is the complexity of large, software-intensive systems and of the processes for their development and implementation. This complexity is defined by the amount and heterogeneity of the interactions between hardware and software components, their inter-relationships, their incorporation into technical and organizational environments, and their interfaces to humans. Mastering these systems requires hierarchical and systematic action and scientific thought; moreover, the success of products, services and organizations is increasingly determined by the availability of suitable software products. Therefore, highly qualified professionals are needed who are able to understand and master these systems, who are involved in the entire software engineering life cycle, and who adopt different roles during development. This is the reasoning that guides this Manifesto, whose aim is to achieve the professionalization of software development.

  9. The IceCube Data Acquisition Software: Lessons Learned during Distributed, Collaborative, Multi-Disciplined Software Development.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Keith S; Beattie, Keith; Day Ph.D., Christopher; Glowacki, Dave; Hanson Ph.D., Kael; Jacobsen Ph.D., John; McParland, Charles; Patton Ph.D., Simon

    2007-09-21

    In this experiential paper we report on lessons learned during the development of the data acquisition software for the IceCube project - specifically, how to effectively address the unique challenges presented by a distributed, collaborative, multi-institutional, multi-disciplined project such as this. While development progress in software projects is often described solely in terms of technical issues, our experience indicates that non- and quasi-technical interactions play a substantial role in the effectiveness of large software development efforts. These include: selection and management of multiple software development methodologies, the effective use of various collaborative communication tools, project management structure and roles, and the impact and apparent importance of these elements when viewed through the differing perspectives of hardware, software, scientific and project office roles. Even in areas clearly technical in nature, success is still influenced by non-technical issues that can escape close attention. In particular we describe our experiences on software requirements specification, development methodologies and communication tools. We make observations on what tools and techniques have and have not been effective in this geographically dispersed (including the South Pole) collaboration and offer suggestions on how similarly structured future projects may build upon our experiences.

  10. Generic Accounting and Reporting Software for Research Reactors

    International Nuclear Information System (INIS)

    Jaernry, C.

    2010-01-01

    This poster gives a general description of a generic accounting and reporting software for research reactors. The program has been designed to be a powerful working tool for the operator. The software is designed to be extremely easy to handle and at the same time give the operator the maximal power of searches, calculations and reports that a modern advanced system can produce. The system is designed to perform the relevant accountancy for the site and to report to the IAEA in Code 10. Generic software for accounting and reporting for research and other smaller facilities is something that has been asked for by many parties. Software like this could be distributed to facilities in need of a powerful accountancy and reporting system at a much lower cost than the systems that are now custom made for these facilities. This would greatly increase the quality of accountancy at these facilities and the quality of the reporting to the IAEA and other state offices. The software is based on our advanced custom-made software series 'STAR', which has been in successful use in many places for more than a decade. (author)

  11. Software Security and the "Building Security in Maturity" Model

    CERN Document Server

    CERN. Geneva

    2011-01-01

    Using the framework described in my book "Software Security: Building Security In" I will discuss and describe the state of the practice in software security. This talk is peppered with real data from the field, based on my work with several large companies as a Cigital consultant. As a discipline, software security has made great progress over the last decade. Of the sixty large-scale software security initiatives we are aware of, thirty-two---all household names---are currently included in the BSIMM study. Those companies among the thirty-two who graciously agreed to be identified include: Adobe, Aon, Bank of America, Capital One, The Depository Trust & Clearing Corporation (DTCC), EMC, Google, Intel, Intuit, McKesson, Microsoft, Nokia, QUALCOMM, Sallie Mae, Standard Life, SWIFT, Symantec, Telecom Italia, Thomson Reuters, VMware, and Wells Fargo. The BSIMM was created by observing and analyzing real-world data from thirty-two leading software security initiatives. The BSIMM can...

  12. RNA-Pareto: interactive analysis of Pareto-optimal RNA sequence-structure alignments.

    Science.gov (United States)

    Schnattinger, Thomas; Schöning, Uwe; Marchfelder, Anita; Kestler, Hans A

    2013-12-01

    Incorporating secondary structure information into the alignment process improves the quality of RNA sequence alignments. Instead of using fixed weighting parameters, sequence and structure components can be treated as different objectives and optimized simultaneously. The result is not a single, but a Pareto-set of equally optimal solutions, which all represent different possible weighting parameters. We now provide the interactive graphical software tool RNA-Pareto, which allows a direct inspection of all feasible results to the pairwise RNA sequence-structure alignment problem and greatly facilitates the exploration of the optimal solution set.
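
    The Pareto-set idea behind the tool can be shown with a few lines of code: among candidate alignments scored on sequence similarity and structure agreement, keep only those not dominated in both objectives. The Python sketch below illustrates that filtering step with invented scores; it is not RNA-Pareto's algorithm, which computes the set during the alignment itself.

      # Keep the Pareto-optimal candidates among (sequence, structure) scores;
      # higher is better in both objectives. Scores are made up for illustration.
      def pareto_front(candidates):
          front = []
          for name, s1, t1 in candidates:
              dominated = any(s2 >= s1 and t2 >= t1 and (s2 > s1 or t2 > t1)
                              for _, s2, t2 in candidates)
              if not dominated:
                  front.append((name, s1, t1))
          return front

      alignments = [("A", 0.90, 0.40), ("B", 0.70, 0.70),
                    ("C", 0.50, 0.85), ("D", 0.60, 0.60)]   # D is dominated by B
      print(pareto_front(alignments))                        # -> A, B and C remain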

  13. Web-based Interactive Simulator for Rotating Machinery.

    Science.gov (United States)

    Sirohi, Vijayalaxmi

    1999-01-01

    Baroma (Balance of Rotating Machinery), a Web-based interactive educational engineering software for teaching and learning, combines didactic and software-ergonomic approaches. The software, in tutorial form, simulates a problem using Visual Interactive Simulation in a graphic display, and animation is brought about through the graphical user interface…

  14. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has opened up a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These conceptions are not entirely correct, so the concept of open source software needs to be introduced, covering its history, its licenses and how to choose among them, and the considerations involved in selecting from the open source software that is available. Keywords: License, Open Source, HAKI

  15. ScienceSoft: Open software for open science

    CERN Document Server

    Di Meglio, Alberto

    2012-01-01

    Most of the software developed today by research institutes, universities, research projects, etc. is typically stored in local source and binary repositories and is available for the duration of a project's lifetime only. Finding software based on given functional characteristics is almost impossible, and binary packages are mostly available from local university or project repositories rather than open source community repositories like Fedora/EPEL or Debian. Furthermore, general information about who develops, contributes to and, most importantly, uses a given software program is very difficult to find out, and yet the widespread availability of such information would give more visibility and credibility to the software products. The creation of links or relationships not only among pieces of software, but equally among the people interacting with the software across and beyond specific projects and communities, would foster a more active community and create the conditions for sharing ideas and skills, a ...

  16. Object-oriented design of medical imaging software.

    Science.gov (United States)

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  17. Software Engineering Technology Infusion Within NASA

    Science.gov (United States)

    Zelkowitz, Marvin V.

    1996-01-01

    Technology transfer is of crucial concern to both government and industry today. In this paper, several software engineering technologies used within NASA are studied, and the mechanisms, schedules, and efforts at transferring these technologies are investigated. The goals of this study are: 1) to understand the difference between technology transfer (the adoption of a new method by large segments of an industry) as an industry-wide phenomenon and the adoption of a new technology by an individual organization (called technology infusion); and 2) to see if software engineering technology transfer differs from other engineering disciplines. While there is great interest today in developing technology transfer models for industry, it is the technology infusion process that actually causes changes in the current state of the practice.

  18. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  19. RT-Syn: A real-time software system generator

    Science.gov (United States)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. This proposal uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs not only by reducing development time, but more importantly facilitating maintenance, modifications, and extensions of complex mission-critical software systems, which are currently dominating life cycle costs.

  20. Development of Cross-Platform Software for Well Logging Data Visualization

    Science.gov (United States)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, displaying well log curves, etc.), but can also be run on different operating systems and devices. In the article, a subject field analysis and task formulation are performed, and the software design stage is considered. At the end of the work, the resulting software product's interface is described.
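
    As an illustration of the basic functionality mentioned above, reading curves from a .las file takes only a few lines; the sketch below assumes the third-party Python package lasio and uses a hypothetical file name and curve mnemonics, and it is not the interface described in the article:

        import lasio  # third-party LAS 1.2/2.0 reader (assumed to be installed)

        las = lasio.read("well_42.las")   # hypothetical input file
        print(las.keys())                 # curve mnemonics, e.g. ['DEPT', 'GR', ...]
        depth = las["DEPT"]               # depth reference curve (assumed present)
        gamma = las["GR"]                 # gamma-ray curve, same length as depth
        print(f"{len(depth)} samples from {depth[0]} to {depth[-1]}")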

  1. Dynamic Capabilities and Project Management in Small Software Companies

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob; Nielsen, Peter Axel; Persson, John Stouby

    2017-01-01

    A small software company depends on its capability to adapt to rapid technological and other changes in its environment—its dynamic capabilities. In this paper, we argue that to evolve and maintain its dynamic capabilities a small software company must pay attention to the interaction between dynamic capabilities at different levels of the company — particularly between the project management and the company levels. We present a case study of a small software company and show how successful dynamic capabilities at the company level can affect project management in small software companies...

  2. Qualification of software and hardware

    International Nuclear Information System (INIS)

    Gossner, S.; Schueller, H.; Gloee, G.

    1987-01-01

    The qualification of on-line process control equipment is subdivided into three areas: 1) materials and structural elements; 2) on-line process-control components and devices; 3) electrical systems (reactor protection and confinement system). Microprocessor-aided process-control equipment is difficult to verify for failure-free function owing to the complexity of the functional structures of the hardware and to the variety of software feasible for microprocessors. Hence, qualification will make great demands on the inspecting expert. (DG) [de

  3. A software radio platform based on ARM and FPGA

    Directory of Open Access Journals (Sweden)

    Yang Xin.

    2016-01-01

    Full Text Available The rapid rise in computational performance offered by computer systems has greatly increased the number of practical software radio applications. The scheme presented in this paper is a software radio platform based on an ARM and an FPGA. The FPGA works as a coprocessor together with the ARM, which serves as the core processor. The ARM is used for digital signal processing and real-time data transmission, and the FPGA is used for synchronous timing control and serial-parallel conversion. An SPI driver for real-time data transmission between the ARM and the FPGA under an ARM-Linux system is provided. By adopting a modular design, the software radio platform is capable of implementing wireless communication functions and satisfies the requirements of a real-time signal processing platform for high security and broad applicability.
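
    To make the ARM-FPGA data path concrete, the user-space side of such an SPI link can be exercised through Linux's spidev interface; the following sketch uses the Python spidev package with assumed bus/device numbers and an invented command byte, purely as an illustration of the transfer mechanism rather than the driver described in the paper:

        import spidev  # Python binding for the Linux spidev user-space SPI interface

        spi = spidev.SpiDev()
        spi.open(0, 0)                 # assumed bus 0, chip-select 0 wired to the FPGA
        spi.max_speed_hz = 1_000_000   # assumed 1 MHz SPI clock
        spi.mode = 0                   # SPI mode 0 (CPOL=0, CPHA=0)

        # Full-duplex transfer: send a hypothetical command byte plus two padding
        # bytes and read back whatever the FPGA clocks out at the same time.
        response = spi.xfer2([0xA5, 0x00, 0x00])
        print("FPGA replied:", [hex(b) for b in response])

        spi.close()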

  4. Application Reuse Library for Software, Requirements, and Guidelines

    Science.gov (United States)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  5. Revenue Management and Demand Fulfillment: Matching Applications, Models, and Software

    NARCIS (Netherlands)

    R. Quante (Rainer); H. Meyr (Herbert); M. Fleischmann (Moritz)

    2007-01-01

    textabstractRecent years have seen great successes of revenue management, notably in the airline, hotel, and car rental business. Currently, an increasing number of industries, including manufacturers and retailers, are exploring ways to adopt similar concepts. Software companies are taking an

  6. Great apes distinguish true from false beliefs in an interactive helping task.

    Directory of Open Access Journals (Sweden)

    David Buttelmann

    Full Text Available Understanding the behavior of others in a wide variety of circumstances requires an understanding of their psychological states. Humans' nearest primate relatives, the great apes, understand many psychological states of others, for example, perceptions, goals, and desires. However, so far there is little evidence that they possess the key marker of advanced human social cognition: an understanding of false beliefs. Here we demonstrate that in a nonverbal (implicit) false-belief test which is passed by human 1-year-old infants, great apes as a group, including chimpanzees (Pan troglodytes), bonobos (Pan paniscus), and orangutans (Pongo abelii), distinguish between true and false beliefs in their helping behavior. Great apes thus may possess at least some basic understanding that an agent's actions are based on her beliefs about reality. Hence, such understanding might not be the exclusive province of the human species.

  7. Deindividuation and Internet software piracy.

    Science.gov (United States)

    Hinduja, Sameer

    2008-08-01

    Computer crime has increased exponentially in recent years as hardware, software, and network resources become more affordable and available to individuals from all walks of life. Software piracy is one prevalent type of cybercrime and has detrimentally affected the economic health of the software industry. Moreover, piracy arguably represents a tear in the moral fabric associated with the respect of intellectual property and reduces the financial incentive of product creation and innovation. Deindividuation theory, originating from the field of social psychology, argues that individuals are extricated from responsibility for their actions simply because they no longer have an acute awareness of the identity of self and of others. That is, external and internal constraints that would typically regulate questionable behavior are rendered less effective via certain anonymizing and disinhibiting conditions of the social and environmental context. This exploratory piece seeks to establish the role of deindividuation in liberating individuals to commit software piracy by testing the hypothesis that persons who prefer the anonymity and pseudonymity associated with interaction on the Internet are more likely to pirate software. Through this research, it is hoped that the empirical identification of such a social psychological determinant will help further illuminate the phenomenon.

  8. Open Source Software Development with Your Mother Language : Intercultural Collaboration Experiment 2002

    DEFF Research Database (Denmark)

    Nomura, Saeko; Ishida, Saeko; Jensen, Mika Yasuoka

    2002-01-01

    ”Open Source Software Development with Your Mother Language: Intercultural Collaboration Experiment 2002,” 10th International Conference on Human–Computer Interaction (HCII2003), June 2003, Crete, Greece.

  9. Ragnarok: An Architecture Based Software Development Environment

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    of the development process. The main contributions presented in the thesis have evolved from work with two of the hypotheses, which address the problems of management of evolution, and of overview, comprehension and navigation, respectively. The first main contribution is the Architectural Software Configuration Management Model: a software configuration management model where the abstractions and hierarchy of the logical aspect of software architecture form the basis for version control and configuration management. The second main contribution is the Geographic Space Architecture Visualisation Model: a visualisation model where entities in a software architecture are organised geographically in a two-dimensional plane, their visual appearance determined by processing a subset of the data in the entities, and interaction with the project's underlying data performed by direct manipulation of the landscape...

  10. Free Open Source Software: Social Phenomenon, New Management, New Business Models

    Directory of Open Access Journals (Sweden)

    Žilvinas Jančoras

    2011-08-01

    Full Text Available In the paper, the assumptions behind free open source software existence, development, financing and competition models are presented. Free software is examined as a social phenomenon and open source software as an environment for technological and managerial innovation. The social and business interaction processes are analyzed. Article in Lithuanian

  11. Human-machine interface software package

    International Nuclear Information System (INIS)

    Liu, D.K.; Zhang, C.Z.

    1992-01-01

    The Man-Machine Interface Software Package (MMISP) is designed to configure the console software of the PLS 60 MeV LINAC control system. The control system of the PLS 60 MeV LINAC is a distributed control system which includes the main computer (Intel 310), four local stations, and two sets of industrial-level console computers. The MMISP provides the operator with a display page editor and various I/O configurations such as digital signals in/out, analog signals in/out, and waveform TV graphic display, and it interacts with the operator through graphic picture displays, voice explanations, and a touch panel. This paper describes its function and application. (author)

  12. An Introduction to Quantitative Measures for Software Maintenance of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jo, Hyun Jun; Seong, Poong Hyun

    2007-01-01

    The I and C systems of NPPs have changed from analog systems to digital-based systems using microcontrollers and software. Thus, software has become very important for NPP control systems. The software life cycle is divided largely into a development and a maintenance phase. Because poor software maintenance work introduces new errors and makes software much more complex, we have to consider effective maintenance methods for the reliability and maintainability of NPP software. Function Block Diagram (FBD) is a standard application programming language for the Programmable Logic Controller (PLC) and is currently being used in the development of a fully digitalized reactor protection system (RPS) under the KNICS project. Therefore, the maintenance work will be of great importance in a few years. This paper studies measures which give quantitative information to software maintainers and managers before and after modification. The remainder of this paper is organized as follows. Section 2 briefly describes software maintenance types and models. In Sections 3-5, we introduce quantitative measures for software maintenance and the characteristics of FBD programs. A conclusion is provided in Section 6

  13. DEVELOPING EVALUATION INSTRUMENT FOR MATHEMATICS EDUCATIONAL SOFTWARE

    Directory of Open Access Journals (Sweden)

    Wahyu Setyaningrum

    2012-02-01

    Full Text Available The rapid increase in and availability of mathematics software, whether for classroom or individual learning activities, presents a challenge for teachers. It has been argued that many products are limited in quality. Some of the more commonly used software products have been criticized for poor content, activities which fail to address some learning issues, poor graphics presentation, inadequate documentation, and other technical problems. The challenge for schools is to ensure that the educational software used in classrooms is appropriate and effective in supporting intended outcomes and goals. This paper aims to develop an instrument for evaluating mathematics educational software in order to help teachers select appropriate software. The instrument considers educational aspects including content, teaching and learning skills, interaction, and feedback and error correction; and technical aspects of educational software including design, clarity, assessment and documentation, cost, and hardware and software interdependence. The instrument uses a checklist approach, one of the easier and more effective methods of assessing the quality of educational software; the user simply puts a tick against each criterion that is met. The criteria in this instrument are adapted and extended from standard evaluation instruments in several references.   Keywords: mathematics educational software, educational aspect, technical aspect.
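
    A checklist instrument of this kind is straightforward to operationalize; the sketch below, with invented criterion names and an invented two-aspect grouping (not the instrument developed in the paper), shows one hypothetical way a teacher's ticks could be turned into per-aspect scores:

        # Hypothetical criteria grouped by the two aspects described above.
        CRITERIA = {
            "educational": ["content", "teaching_learning_skill", "interaction",
                            "feedback_error_correction"],
            "technical": ["design", "clarity", "assessment_documentation", "cost",
                          "hw_sw_interdependence"],
        }

        def aspect_scores(ticked):
            """Fraction of ticked criteria per aspect, given a set of ticked names."""
            return {
                aspect: sum(name in ticked for name in names) / len(names)
                for aspect, names in CRITERIA.items()
            }

        # Example evaluation of one software package.
        print(aspect_scores({"content", "interaction", "design", "clarity"}))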

  14. Software functions for safe operation - learning from Sizewell-B

    International Nuclear Information System (INIS)

    Welbourne, D.

    1996-01-01

    Future nuclear plants will use computer-based systems extensively. Regulatory acceptance must be planned for and not underestimated. Commercial software packages will simplify it, but costly analysis and demonstration may be needed. Multiplexed control needs preparation of extensive configuration data and careful checking. On-screen soft control will need consideration of the integrity of the control path. Display design should follow human factors analysis of the operators' needs, and display layout needs great care for clarity. Computer-based systems with planned quality will then bring great benefits in safe operation. (author) 1 fig., 3 refs

  15. Interactive Mathematics Textbooks

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1999-01-01

    We claim that important considerations have been overlooked in designing interactive mathematics educational software in the past. In particular, most previous work has concentrated on how to make use of pre-existing software in mathematics education, rather than first asking the more fundamental question of which requirements mathematics education puts on software, and then designing software to fulfil these requirements. We present a working prototype system which takes a script defining an interactive mathematical document and then provides a reader with an interactive realization of that document.

  16. Fostering Multirepresentational Levels of Chemical Concepts: A Framework to Develop Educational Software

    Science.gov (United States)

    Marson, Guilherme A.; Torres, Bayardo B.

    2011-01-01

    This work presents a convenient framework for developing interactive chemical education software to facilitate the integration of macroscopic, microscopic, and symbolic dimensions of chemical concepts--specifically, via the development of software for gel permeation chromatography. The instructional role of the software was evaluated in a study…

  17. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue for comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  18. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    textabstractWe developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which

  19. G3 : GENESIS software envrionment update

    OpenAIRE

    Castagné, Nicolas; Cadoz, Claude; Allaoui, Ali; Tache, Olivier Michel

    2009-01-01

    GENESIS3 is the new version of the GENESIS software environment for musical creation by means of mass-interaction physics network modeling. It was designed and developed from scratch, drawing on more than 10 years of working on and using the previous version. We take the opportunity of this birth to provide in this article (1) an analysis of the peculiarities of GENESIS, aiming at highlighting its core 'software paradigm'; and (2) an update on the features of the new version as compared to...

  20. Development of Agile Practices in Romanian Software Community

    Directory of Open Access Journals (Sweden)

    Eduard BUDACU

    2017-01-01

    Full Text Available Agile Software Development (ASD) promotes flexibility to change and emphasizes the importance of individuals and interactions in producing software. The study presents the development of agile practices in the Romanian software community. A literature review is conducted and the main agile methods are described. The characteristics of the Romanian ICT sector are presented in relation to agile methodology. Practices are identified by a survey, and an analysis of the interest groups formed on the Meetup website is performed. Future directions and the development of agile practices are evaluated.

  1. The software environment of RODOS

    International Nuclear Information System (INIS)

    Schuele, O.; Rafat, M.; Kossykh, V.

    1996-01-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorized in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system

  2. The software environment of RODOS

    International Nuclear Information System (INIS)

    Schuele, O.; Rafat, M.

    1998-01-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorised in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system. (orig.)

  3. The software environment of RODOS

    Energy Technology Data Exchange (ETDEWEB)

    Schuele, O; Rafat, M [Forschungszentrum Karlsruhe, Institut fuer Neutronenphysik und Reaktortechnik, Karlsruhe (Germany)]; Kossykh, V [Scientific Production Association 'TYPHOON', Emergency Centre, Obninsk (Russian Federation)]

    1996-07-01

    The Software Environment of RODOS provides tools for processing and managing a large variety of different types of information, including those which are categorized in terms of meteorology, radiology, economy, emergency actions and countermeasures, rules, preferences, facts, maps, statistics, catalogues, models and methods. The main tasks of the Operating Subsystem OSY, which is based on the Client-Server Model, are the control of system operation, data management, and the exchange of information among various modules as well as the interaction with users in distributed computer systems. The paper describes the software environment of RODOS, in particular, the individual modules of its Operating Subsystem OSY, its distributed database, the geographical information system RoGIS, the on-line connections to radiological and meteorological networks and the software environment for the integration of external programs into the RODOS system.

  4. Integrated conception of hardware/software mixed systems used in nuclear instrumentation

    International Nuclear Information System (INIS)

    Dias, Ailton F.; Sorel, Yves; Akil, Mohamed

    2002-01-01

    Hardware/software codesign carries out the design of systems composed of a hardware portion, with specific components, and a software portion, with a microprocessor-based architecture. This paper describes the Algorithm Architecture Adequation (AAA) design methodology - originally oriented to programmable multicomponent architectures - its extension to reconfigurable circuits, and its application to the design and development of nuclear instrumentation systems composed of programmable and configurable circuits. The AAA methodology uses a unified model, based on graph theory, to describe the algorithm, architecture and implementation. The great advantage of the AAA methodology is the use of the same model from specification to implementation of hardware/software systems, reducing complexity and design time. (author)

  5. A graphical simulation software for instruction in cardiovascular mechanics physiology

    Directory of Open Access Journals (Sweden)

    Wenger Roland H

    2011-01-01

    Full Text Available Abstract Background Computer-supported, interactive e-learning systems are widely used in the teaching of physiology. However, the currently available complementary software tools in the field of the physiology of cardiovascular mechanics have not yet been adapted to the latest systems software. Therefore, a simple-to-use replacement for undergraduate and graduate students' education was needed, including up-to-date graphical software that is validated and field-tested. Methods Software compatible with Windows, based on modified versions of existing mathematical algorithms, has been newly developed. Testing was performed during a full term of physiological lecturing to medical and biology students. Results The newly developed CLabUZH software models a reduced human cardiovascular loop containing all basic compartments: an isolated heart including an artificial electrical stimulator, main vessels and the peripheral resistive components. Students can alter several physiological parameters interactively. The resulting output variables are printed in x-y diagrams and in addition shown in an animated, graphical model. CLabUZH offers insight into the relations of volume, pressure and time dependency in the circulation and their correlation to the electrocardiogram (ECG). Established mechanisms such as the Frank-Starling Law or the Windkessel Effect are considered in this model. The CLabUZH software is self-contained with no extra installation required and runs on most of today's personal computer systems. Conclusions CLabUZH is a user-friendly interactive computer programme that has proved useful in teaching the basic physiological principles of heart mechanics.
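
    The Windkessel effect mentioned above is commonly captured by a two-element model in which arterial compliance C and peripheral resistance R relate inflow Q(t) to arterial pressure P(t) via C dP/dt = Q(t) - P/R. A minimal forward-Euler sketch of that relation, with invented parameter values and a crude ejection profile (not the algorithms used in CLabUZH), is:

        import math

        C = 1.5     # arterial compliance [ml/mmHg] (assumed value)
        R = 1.0     # peripheral resistance [mmHg*s/ml] (assumed value)
        dt = 0.001  # integration time step [s]
        P = 80.0    # initial arterial pressure [mmHg]

        def inflow(t, period=0.8, systole=0.3, peak=400.0):
            """Crude half-sine ejection during systole, zero flow in diastole."""
            phase = t % period
            return peak * math.sin(math.pi * phase / systole) if phase < systole else 0.0

        for step in range(int(5 * 0.8 / dt)):      # simulate five heartbeats
            t = step * dt
            dPdt = (inflow(t) - P / R) / C          # two-element Windkessel ODE
            P += dPdt * dt

        print(f"Pressure after five beats: {P:.1f} mmHg")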

  6. DYNA3D, INGRID, and TAURUS: an integrated, interactive software system for crashworthiness engineering

    International Nuclear Information System (INIS)

    Benson, D.J.; Hallquist, J.O.; Stillman, D.W.

    1985-04-01

    Crashworthiness engineering has always been a high priority at Lawrence Livermore National Laboratory because of its role in the safe transport of radioactive material for the nuclear power industry and military. As a result, the authors have developed an integrated, interactive set of finite element programs for crashworthiness analysis. The heart of the system is DYNA3D, an explicit, fully vectorized, large deformation structural dynamics code. DYNA3D has the following four capabilities that are critical for the efficient and accurate analysis of crashes: (1) fully nonlinear solid, shell, and beam elements for representing a structure, (2) a broad range of constitutive models for representing the materials, (3) sophisticated contact algorithms for the impact interactions, and (4) a rigid body capability to represent the bodies away from the impact zones at a greatly reduced cost without sacrificing any accuracy in the momentum calculations. To generate the large and complex data files for DYNA3D, INGRID, a general purpose mesh generator, is used. It runs on everything from IBM PCs to CRAYS, and can generate 1000 nodes/minute on a PC. With its efficient hidden line algorithms and many options for specifying geometry, INGRID also doubles as a geometric modeller. TAURUS, an interactive post processor, is used to display DYNA3D output. In addition to the standard monochrome hidden line display, time history plotting, and contouring, TAURUS generates interactive color displays on 8 color video screens by plotting color bands superimposed on the mesh which indicate the value of the state variables. For higher quality color output, graphic output files may be sent to the DICOMED film recorders. We have found that color is every bit as important as hidden line removal in aiding the analyst in understanding his results. In this paper the basic methodologies of the programs are presented along with several crashworthiness calculations
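
    The term "explicit" above refers to time integration schemes such as the central-difference method, in which nodal accelerations follow directly from the current force balance without solving a coupled system of equations. A generic one-degree-of-freedom sketch of the scheme, with assumed mass and stiffness values and not DYNA3D's actual implementation, illustrates the update:

        # Explicit central-difference (leapfrog) integration of m*u'' + k*u = 0
        # for a single mass on a linear spring; illustrative of the scheme only.
        m, k = 2.0, 800.0      # mass [kg] and stiffness [N/m] (assumed values)
        dt = 0.001             # time step, well below the stability limit 2*sqrt(m/k)
        u = 0.01               # initial displacement [m]
        v_half = 0.0           # velocity staggered at t - dt/2

        for n in range(1000):  # advance one second
            a = -k * u / m     # acceleration from the current force balance
            v_half += a * dt   # v(t + dt/2) = v(t - dt/2) + a(t)*dt
            u += v_half * dt   # u(t + dt)   = u(t) + v(t + dt/2)*dt

        print(f"Displacement after 1 s: {u:.4f} m")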

  7. Virtual immunology: software for teaching basic immunology.

    Science.gov (United States)

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available free of charge in Portuguese and English, which can be used by teachers and students in physiology, immunology, and cellular biology classes. We discuss the development of the initial two modules: "Organs and Lymphoid Tissues" and "Inflammation" and the use of interactive activities to provide microscopic and macroscopic understanding in immunology. Students, both graduate and undergraduate, were questioned along with university level professors about the quality of the software and intuitiveness of use, facility of navigation, and aesthetic organization using a Likert scale. An overwhelmingly satisfactory result was obtained with both students and immunology teachers. Programs such as "Virtual Immunology" are offering more interactive, multimedia approaches to complex scientific principles that increase student motivation, interest, and comprehension. © 2013 by The International Union of Biochemistry and Molecular Biology.

  8. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system

  9. EON: software for long time simulations of atomic scale systems

    Science.gov (United States)

    Chill, Samuel T.; Welborn, Matthew; Terrell, Rye; Zhang, Liang; Berthet, Jean-Claude; Pedersen, Andreas; Jónsson, Hannes; Henkelman, Graeme

    2014-07-01

    The EON software is designed for simulations of the state-to-state evolution of atomic scale systems over timescales greatly exceeding that of direct classical dynamics. States are defined as collections of atomic configurations from which a minimization of the potential energy gives the same inherent structure. The time evolution is assumed to be governed by rare events, where transitions between states are uncorrelated and infrequent compared with the timescale of atomic vibrations. Several methods for calculating the state-to-state evolution have been implemented in EON, including parallel replica dynamics, hyperdynamics and adaptive kinetic Monte Carlo. Global optimization methods, including simulated annealing, basin hopping and minima hopping are also implemented. The software has a client/server architecture where the computationally intensive evaluations of the interatomic interactions are calculated on the client-side and the state-to-state evolution is managed by the server. The client supports optimization for different computer architectures to maximize computational efficiency. The server is written in Python so that developers have access to the high-level functionality without delving into the computationally intensive components. Communication between the server and clients is abstracted so that calculations can be deployed on a single machine, clusters using a queuing system, large parallel computers using a message passing interface, or within a distributed computing environment. A generic interface to the evaluation of the interatomic interactions is defined so that empirical potentials, such as in LAMMPS, and density functional theory as implemented in VASP and GPAW can be used interchangeably. Examples are given to demonstrate the range of systems that can be modeled, including surface diffusion and island ripening of adsorbed atoms on metal surfaces, molecular diffusion on the surface of ice and global structural optimization of nanoparticles.
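
    The kinetic Monte Carlo approach mentioned above advances the system from state to state by drawing one of the known escape processes with probability proportional to its rate and incrementing time by an exponentially distributed residence time. A generic sketch of that core step (illustrative only, not EON's actual code) is:

        import math
        import random

        def kmc_step(rates, time):
            """One rejection-free kinetic Monte Carlo step.

            `rates` maps each known escape process to its rate constant [1/s].
            Returns the chosen process and the advanced simulation time.
            """
            total = sum(rates.values())
            r = random.uniform(0.0, total)   # pick a process with probability rate/total
            cumulative = 0.0
            for process, rate in rates.items():
                cumulative += rate
                if r <= cumulative:
                    chosen = process
                    break
            # Residence time in the current state is exponentially distributed.
            time += -math.log(1.0 - random.random()) / total
            return chosen, time

        # Example: two escape processes found for the current state (invented rates).
        print(kmc_step({"hop_left": 1.0e6, "hop_right": 2.5e5}, time=0.0))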

  10. Methodologic model to scheduling on service systems: a software engineering approach

    Directory of Open Access Journals (Sweden)

    Eduyn Ramiro Lopez-Santana

    2016-06-01

    Full Text Available This paper presents a software engineering approach to a research proposal for building an expert system for scheduling in service systems, using software development methodologies and processes. We use adaptive software development as the methodology for the software architecture, based on its description as a software metaprocess that characterizes the research process. We draw UML (Unified Modeling Language) diagrams to provide a visual model that describes the research methodology and identifies the actors, elements and interactions in the research process.

  11. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer-based instrumentation and control functions in nuclear facilities. The utilization of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants," and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations," for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future.

  12. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating system and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserving computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  13. Software design for resilient computer systems

    CERN Document Server

    Schagaev, Igor

    2016-01-01

    This book addresses the question of how system software should be designed to account for faults, and which fault tolerance features it should provide for highest reliability. The authors first show how the system software interacts with the hardware to tolerate faults. They analyze and further develop the theory of fault tolerance to understand the different ways to increase the reliability of a system, with special attention on the role of system software in this process. They further develop the general algorithm of fault tolerance (GAFT) with its three main processes: hardware checking, preparation for recovery, and the recovery procedure. For each of the three processes, they analyze the requirements and properties theoretically and give possible implementation scenarios and system software support required. Based on the theoretical results, the authors derive an Oberon-based programming language with direct support of the three processes of GAFT. In the last part of this book, they introduce a simulator...

  14. The SSCL framework software plans

    International Nuclear Information System (INIS)

    Frederiksen, S.

    1993-12-01

    In about ten years the Superconducting Super Collider Laboratory (SSCL) will be producing 40 TeV proton-proton interactions. The size and scale of the effort demand new approaches to designing and developing the software used by the experimental collaborations. The Physics Research Division Computing Department (PRCD) of the SSCL is developing, in collaboration with the Solenoidal Detector Collaboration (SDC) and the Gamma, Electron and Muon (GEM) collaboration, a support system which will be used to build and run the collaboration software. It will be used for simulating the events needed for detector development and for the analysis of these complicated events. The plans and status of this program will be discussed

  15. Software for physics of tau lepton decay in LHC experiments

    CERN Document Server

    Przedzinski, Tomasz

    2010-01-01

    Software development in high energy physics experiments offers unique experience with a rapidly changing environment and a variety of different standards and frameworks that software must be adapted to. As such, regular methods of software development are hard to use, as they do not take into account how greatly some of these changes influence the whole structure. The following thesis summarizes the development of the TAUOLA C++ Interface, which introduces tau decays to the new event record standard. Documentation of the program has already been published, so it is not recalled here again. We focus on the development cycle and methodology used in the project, starting from the definition of the expectations through planning and designing the abstract model and concluding with the implementation. In the last part of the paper we present the installation of the software within different experiments surrounding the Large Hadron Collider and the problems that emerged during this process.

  16. THE IMPACT OF MARKETING EXPERIMENTS ON THE RELATIONSHIP BETWEEN SOFTWARE PRODUCERS AND THEIR RETAILERS

    Directory of Open Access Journals (Sweden)

    HERȚANU ANDREEA

    2013-07-01

    Full Text Available This paper presents the results of a marketing experiment conducted on the Romanian software market. The main purpose of this research is to determine how the marketing campaigns of software manufacturers can influence the decisions of software retailers. Through this experimental marketing research, the impact that the marketing policies of software companies have on retailers from all over the country is evaluated and analysed. Three different marketing campaigns were proposed to three groups of software vendors from the most important cities of the country. The total number of software retailers included in this experiment is 45, and the marketing campaigns proposed by the authors in this experiment refer to the Microsoft brand. Promotion strategies such as sales promotion by encouraging the sales force and promotional pricing, or even a partner relationship management policy, have a great impact on three aspects regarding software retailers: loyalty, purchase and resale intention, and attitude towards a brand. The results of the experiment show a high interest in the strategy of promotional pricing. The representatives of the software vendors have a positive orientation towards sales promotion by encouraging the sales force. Regarding the influences of the manipulations used in the experiment, the strategy of promotional pricing has the greatest impact on the loyalty of the software vendors, while the policy of sales promotion by encouraging the sales force has the biggest impact on the purchase and resale intention of the software retailers. All three manipulations also have an impact on the vendors' attitude towards a brand, but the differences are too small to determine which of the proposed stimuli has a greater impact on this aspect. The results of the experiment may help manufacturers and could have a great influence on their future marketing decisions regarding the strategies and marketing policies used on the Romanian

  17. Carnegie Learning Curricula and Cognitive Tutor[R] Software. What Works Clearinghouse Intervention Report

    Science.gov (United States)

    What Works Clearinghouse, 2010

    2010-01-01

    The combination of "Carnegie Learning Curricula and Cognitive Tutor[R] Software" merges algebra textbooks with interactive software developed around an artificial intelligence model that identifies strengths and weaknesses in an individual student's mastery of mathematical concepts. The software customizes prompts to focus on areas in…

  18. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
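
    At the heart of such a geo-containment check is a point-in-polygon test. The sketch below shows a standard ray-casting version in Python with invented coordinates; it is an illustration of the underlying geometric check, not the formally verified MINERVA algorithm itself:

        def inside_polygon(x, y, polygon):
            """Ray-casting point-in-polygon test; `polygon` is a list of (x, y) vertices."""
            inside = False
            n = len(polygon)
            for i in range(n):
                x1, y1 = polygon[i]
                x2, y2 = polygon[(i + 1) % n]
                # Does edge (x1,y1)-(x2,y2) straddle the horizontal ray from (x, y)?
                if (y1 > y) != (y2 > y):
                    x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
                    if x < x_cross:
                        inside = not inside
            return inside

        # Hypothetical containment region and aircraft positions (local coordinates).
        region = [(0.0, 0.0), (10.0, 0.0), (10.0, 8.0), (0.0, 8.0)]
        print(inside_polygon(3.5, 2.0, region))   # True: still inside the region
        print(inside_polygon(12.0, 2.0, region))  # False: recovery maneuver needed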

  19. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    Science.gov (United States)

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study presents our method, developed through a series of prototypes, for detecting collisions and examining the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system include a basic molecular modeling environment, collision detection in the molecular models, and physical simulation of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and interpreted scripting languages, the functions required for accurate and meaningful molecular animation were implemented efficiently.
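
    Collision detection for coarse molecular models often reduces to overlap tests between bounding spheres around atoms or residues. The toy sketch below (plain Python with invented coordinates, not the authors' toolkit-based implementation) shows that elementary check:

        def spheres_collide(center_a, radius_a, center_b, radius_b):
            """True if two bounding spheres overlap (squared distance avoids the sqrt)."""
            dx, dy, dz = (a - b for a, b in zip(center_a, center_b))
            return dx * dx + dy * dy + dz * dz <= (radius_a + radius_b) ** 2

        # Two hypothetical residues approximated by bounding spheres (angstroms).
        print(spheres_collide((0.0, 0.0, 0.0), 3.0, (5.0, 1.0, 0.0), 2.5))  # True: overlap
        print(spheres_collide((0.0, 0.0, 0.0), 3.0, (9.0, 0.0, 0.0), 2.5))  # False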

  20. Software development for a switch-based data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Booth, A. (Superconducting Super Collider Lab., Dallas, TX (United States)); Black, D.; Walsh, D. (Fermi National Accelerator Lab., Batavia, IL (United States))

    1991-12-01

    We report on the software aspects of the development of a switch-based data acquisition system at Fermilab. This paper describes how, with the goal of providing an integrated "systems engineering" environment, several powerful software tools were put in place to facilitate extensive exploration of all aspects of the design. These tools include a simulation package, a graphics package and an Expert System shell, which have been integrated to provide an environment that encourages the close interaction of hardware and software engineers. This paper includes a description of the simulation, user interface, embedded software, remote procedure calls, and diagnostic software which together have enabled us to provide real-time control and monitoring of a working prototype switch-based data acquisition (DAQ) system.

  1. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ~10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to

  2. Web accessibility and open source software.

    Science.gov (United States)

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  3. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504, and Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within th...

  4. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  5. Real-time animation software for customized training to use motor prosthetic systems.

    Science.gov (United States)

    Davoodi, Rahman; Loeb, Gerald E

    2012-03-01

    Research on control of human movement and development of tools for restoration and rehabilitation of movement after spinal cord injury and amputation can benefit greatly from software tools for creating precisely timed animation sequences of human movement. Despite their ability to create sophisticated animation and high quality rendering, existing animation software are not adapted for application to neural prostheses and rehabilitation of human movement. We have developed a software tool known as MSMS (MusculoSkeletal Modeling Software) that can be used to develop models of human or prosthetic limbs and the objects with which they interact and to animate their movement using motion data from a variety of offline and online sources. The motion data can be read from a motion file containing synthesized motion data or recordings from a motion capture system. Alternatively, motion data can be streamed online from a real-time motion capture system, a physics-based simulation program, or any program that can produce real-time motion data. Further, animation sequences of daily life activities can be constructed using the intuitive user interface of Microsoft's PowerPoint software. The latter allows expert and nonexpert users alike to assemble primitive movements into a complex motion sequence with precise timing by simply arranging the order of the slides and editing their properties in PowerPoint. The resulting motion sequence can be played back in an open-loop manner for demonstration and training or in closed-loop virtual reality environments where the timing and speed of animation depends on user inputs. These versatile animation utilities can be used in any application that requires precisely timed animations but they are particularly suited for research and rehabilitation of movement disorders. MSMS's modeling and animation tools are routinely used in a number of research laboratories around the country to study the control of movement and to develop and test
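
    To illustrate the idea of motion data being "streamed online from any program that can produce real-time motion data", here is a minimal, hypothetical producer that sends timestamped joint-angle frames over UDP at a fixed rate. The packet format and joint names are invented for illustration and are not the MSMS streaming protocol.

```python
# Hypothetical sketch of a real-time motion producer: streams timestamped
# joint-angle frames over UDP so an animation engine (such as one built with
# MSMS) could consume them. The packet format here is invented for
# illustration and is not the actual MSMS streaming protocol.
import json
import math
import socket
import time

JOINTS = ["shoulder_flex", "elbow_flex", "wrist_flex"]

def synthesize_frame(t):
    """Return a dict of joint angles (degrees) for a simple cyclic motion."""
    return {name: 30.0 * math.sin(2 * math.pi * 0.5 * t + i)
            for i, name in enumerate(JOINTS)}

def stream(host="127.0.0.1", port=9100, rate_hz=60, duration_s=5.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    t0 = time.time()
    while (t := time.time() - t0) < duration_s:
        packet = {"t": t, "angles": synthesize_frame(t)}
        sock.sendto(json.dumps(packet).encode("utf-8"), (host, port))
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    stream()
```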

  6. [Development of a Software for Automatically Generated Contours in Eclipse TPS].

    Science.gov (United States)

    Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin

    2015-03-01

    The automatic generation of planning target and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop software for automatically generating contours in Eclipse TPS. This software is named Contour Auto Margin (CAM), and is composed of operational functions for contours, script-generated visualization, and script file operations. Ten cases of different cancers were selected; in Eclipse TPS 11.0, scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For the different cancers, there was no difference between automatically generated contours and manually created contours. CAM is a user-friendly and powerful piece of software that can automatically generate contours quickly in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists is improved.
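
    The margin operation at the heart of such contour post-processing can be illustrated conceptually (and only conceptually; the actual tool drives Eclipse TPS through AutoHotkey scripts) as an isotropic 2D expansion of a target contour, sketched here with the shapely geometry library:

```python
# Conceptual sketch of an automatic margin expansion: grow a 2D target contour
# by a fixed margin (in mm). This illustrates the geometric operation behind
# tools like CAM; it is not the AutoHotkey/Eclipse scripting itself.
from shapely.geometry import Polygon

def expand_contour(points_mm, margin_mm):
    """points_mm: list of (x, y) vertices of a closed contour."""
    target = Polygon(points_mm)
    expanded = target.buffer(margin_mm, join_style=1)  # round joins, isotropic margin
    return list(expanded.exterior.coords)

if __name__ == "__main__":
    gtv = [(0, 0), (40, 0), (40, 30), (0, 30)]   # toy rectangular target
    print(expand_contour(gtv, margin_mm=5.0)[:4])
```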

  7. Digital image processing software system using an array processor

    International Nuclear Information System (INIS)

    Sherwood, R.J.; Portnoff, M.R.; Journeay, C.H.; Twogood, R.E.

    1981-01-01

    A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table

  8. The December 2006 ATLAS Computing & Software Workshop

    CERN Multimedia

    Fred Luehring

    The 29th ATLAS Computing & Software Workshop was held on December 11-15 at CERN. With the rapidly approaching onset of data taking, the workshop participants had an air of urgency about them. There was considerable discussion on hot topics such as physics validation of the software, data analysis, actual software production on the GRID, and the schedule of work for 2007 including the Final Dress Rehearsal (FDR). However don't be fooled, the workshop was not all work - there were also two social events which were greatly enjoyed by the attendees. The workshop welcomed Wouter Verkerke as the new Physics Validation Coordinator (replacing Davide Costanzo). Most recent validation work has centered on the 12.0.X release series that will be used for the Computing System Commissioning (CSC) exercise. The validation is now a big job because it needs to be done over a variety of conditions (magnetic field on/off, aligned/misaligned geometry) for every candidate release. Luckily there have been a large number of pe...

  9. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  10. Software for precise tracking of cell proliferation

    International Nuclear Information System (INIS)

    Kurokawa, Hiroshi; Noda, Hisayori; Sugiyama, Mayu; Sakaue-Sawano, Asako; Fukami, Kiyoko; Miyawaki, Atsushi

    2012-01-01

    Highlights: ► We developed software for analyzing cultured cells that divide as well as migrate. ► The active contour model (Snakes) was used as the core algorithm. ► The time backward analysis was also used for efficient detection of cell division. ► With user-interactive correction functions, the software enables precise tracking. ► The software was successfully applied to cells with fluorescently-labeled nuclei. -- Abstract: We have developed a multi-target cell tracking program TADOR, which we applied to a series of fluorescence images. TADOR is based on an active contour model that is modified in order to be free of the problem of locally optimal solutions, and thus is resistant to signal fluctuation and morphological changes. Due to adoption of backward tracing and addition of user-interactive correction functions, TADOR is used in an off-line and semi-automated mode, but enables precise tracking of cell division. By applying TADOR to the analysis of cultured cells whose nuclei had been fluorescently labeled, we tracked cell division and cell-cycle progression on coverslips over an extended period of time.
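
    The core algorithm named in the abstract, a Snakes-style active contour, can be illustrated with the generic implementation in scikit-image; this is only a minimal sketch on synthetic blobs and does not reproduce TADOR's modified, fluctuation-resistant model or its backward tracing.

```python
# Minimal sketch of the core idea (a Snakes-style active contour fitted to a
# bright object) using scikit-image's generic implementation; TADOR's own
# modified contour model is not reproduced here.
import numpy as np
from skimage import data, filters
from skimage.segmentation import active_contour

image = data.binary_blobs(length=128, blob_size_fraction=0.25).astype(float)
smoothed = filters.gaussian(image, sigma=3)

# Initialize the snake as a circle around the image centre.
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([64 + 40 * np.sin(theta), 64 + 40 * np.cos(theta)])

snake = active_contour(smoothed, init, alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)  # (200, 2) fitted contour coordinates
```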

  11. Worldwide collaborative efforts in plasma control software development

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Walker, M.L.; Humphreys, D.A.; Leuer, J.A.; Piglowski, D.A.; Johnson, R.D.; Xiao, B.J.; Hahn, S.H.; Gates, D.A.

    2008-01-01

    This presentation will describe the DIII-D collaborations with various tokamak experiments throughout the world which have adapted custom versions of the DIII-D plasma control system (PCS) software for their own use. Originally developed by General Atomics for use on the DIII-D tokamak, the PCS has been successfully installed and used for the NSTX experiment in Princeton, the MAST experiment in Culham UK, the EAST experiment in China, and the Pegasus experiment in the University of Wisconsin. In addition to these sites, a version of the PCS is currently being developed for use by the KSTAR tokamak in Korea. A well-defined and robust PCS software infrastructure has been developed to provide a common foundation for implementing the real-time data acquisition and feedback control codes. The PCS infrastructure provides a flexible framework that has allowed the PCS to be easily adapted to fulfill the unique needs of each site. The software has also demonstrated great flexibility in allowing for different computing, data acquisition and real-time networking hardware to be used. A description of the current PCS software architecture will be given along with experiences in developing and supporting the various PCS installations throughout the world
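
    The acquire-compute-actuate cycle that such a real-time feedback control infrastructure must support can be sketched as below; this is purely illustrative Python with invented names (a toy PI controller on a scalar signal) and bears no relation to the actual PCS code or API.

```python
# Purely illustrative sketch of a real-time feedback cycle -- acquire a
# measurement, compute a control error, command an actuator -- the general
# structure a plasma control system infrastructure must support. None of the
# names below correspond to the actual DIII-D PCS API.
import time

class PIController:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def update(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

def control_loop(read_sensor, write_actuator, target, cycle_s=0.001, cycles=1000):
    controller = PIController(kp=0.8, ki=5.0, dt=cycle_s)
    for _ in range(cycles):
        measurement = read_sensor()                 # real-time data acquisition
        command = controller.update(target - measurement)
        write_actuator(command)                     # feedback actuation
        time.sleep(cycle_s)                         # stand-in for the RT scheduler

if __name__ == "__main__":
    state = {"z": 0.0}  # toy plant: the controlled value drifts toward the command
    control_loop(lambda: state["z"],
                 lambda u: state.__setitem__("z", state["z"] + 0.05 * u),
                 target=1.0, cycles=200, cycle_s=0.0)
```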

  12. The application of image processing software: Photoshop in environmental design

    Science.gov (United States)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has made significant progress. Many types of specialized design software for rendering environmental drawings and for artistic post-processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and comparing and contrasting traditional hand drawing with drawing using modern technology, this essay will further explore the way for computer technology to play a bigger role in environmental design.

  13. Application of software to development of reactor-safety codes

    International Nuclear Information System (INIS)

    Wilburn, N.P.; Niccoli, L.G.

    1980-09-01

    Over the past two-and-a-half decades, the application of new techniques has reduced hardware cost for digital computer systems and increased computational speed by several orders of magnitude. A corresponding cost reduction in business and scientific software development has not occurred. The same situation is seen for software developed to model the thermohydraulic behavior of nuclear systems under hypothetical accident situations. For all cases this is particularly noted when costs over the total software life cycle are considered. A solution to this dilemma for reactor safety code systems has been demonstrated by applying the software engineering techniques which have been developed over the course of the last few years in the aerospace and business communities. These techniques have been applied recently with a great deal of success in four major projects at the Hanford Engineering Development Laboratory (HEDL): 1) a rewrite of a major safety code (MELT); 2) development of a new code system (CONACS) for description of the response of LMFBR containment to hypothetical accidents, and 3) development of two new modules for reactor safety analysis

  14. Visualizer: 3D Gridded Data Visualization Software for Geoscience Education and Research

    Science.gov (United States)

    Harwood, C.; Billen, M. I.; Kreylos, O.; Jadamec, M.; Sumner, D. Y.; Kellogg, L. H.; Hamann, B.

    2008-12-01

    In both research and education learning is an interactive and iterative process of exploring and analyzing data or model results. However, visualization software often presents challenges on the path to learning because it assumes the user already knows the locations and types of features of interest, instead of enabling flexible and intuitive examination of results. We present examples of research and teaching using the software, Visualizer, specifically designed to create an effective and intuitive environment for interactive, scientific analysis of 3D gridded data. Visualizer runs in a range of 3D virtual reality environments (e.g., GeoWall, ImmersaDesk, or CAVE), but also provides a similar level of real-time interactivity on a desktop computer. When using Visualizer in a 3D-enabled environment, the software allows the user to interact with the data images as real objects, grabbing, rotating or walking around the data to gain insight and perspective. On the desktop, simple features, such as a set of cross-bars marking the plane of the screen, provide extra 3D spatial cues that allow the user to more quickly understand geometric relationships within the data. This platform portability allows the user to more easily integrate research results into classroom demonstrations and exercises, while the interactivity provides an engaging environment for self-directed and inquiry-based learning by students. Visualizer software is freely available for download (www.keckcaves.org) and runs on Mac OSX and Linux platforms.

  15. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  16. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of a software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests
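
    A failure-mode template keyed by function-block type, of the kind the abstract describes, might be organized as in the sketch below; the block types and failure modes listed are invented examples, not the templates from the cited analysis.

```python
# Hedged sketch of a failure-mode template keyed by function-block type, the
# kind of structure a software FMEA over FBD code might use. Block names and
# failure modes below are illustrative, not the templates from the cited work.
from dataclasses import dataclass

@dataclass
class FailureMode:
    block_type: str
    mode: str
    local_effect: str
    system_effect: str

TEMPLATE = {
    "AND": [FailureMode("AND", "output stuck high", "spurious trip signal",
                        "unnecessary actuation"),
            FailureMode("AND", "output stuck low", "trip signal suppressed",
                        "failure to actuate on demand")],
    "TON": [FailureMode("TON", "timer never elapses", "delayed output frozen",
                        "test sequence stalls")],
}

def fmea_rows(fbd_blocks):
    """Expand a list of (instance_name, block_type) pairs into FMEA rows."""
    for name, block_type in fbd_blocks:
        for fm in TEMPLATE.get(block_type, []):
            yield (name, fm.mode, fm.local_effect, fm.system_effect)

for row in fmea_rows([("trip_vote", "AND"), ("startup_delay", "TON")]):
    print(row)
```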

  17. Have You Heard This? Designing Mobile Social Software

    Directory of Open Access Journals (Sweden)

    Jørn Georg Sannes Knutsen

    2010-07-01

    Full Text Available ‘Desktop’ social networking services are migrating to mobile devices. Research into the design of mobile social software (MoSoSo), especially its communication design, is emerging. The case we present is from a collaborative, interdisciplinary research project into communicative design innovation concerning these technologies. In focus is the design of what we label the communicative prototype for an interaction and media centred view of social software development. This view is applied to an exploratory design research case that extends an established online social service to the iPhone/iPod platform. The conceptual design in the case is intended to enable the discovery of independent, non-commercial music. The projected service was developed in consultation with a national public service broadcaster. We frame the design and analysis within a sociocultural approach to mediated communication and research by design. We employ mixed methods both in design and in research. We argue that a communicative stance in early concept development offers valuable insights on the ongoing design of social software. The communication expertise of interaction designers is central to this.

  18. Mass market development strategies of software industries: Case study based research

    Directory of Open Access Journals (Sweden)

    Varun Gupta

    2016-09-01

    Full Text Available The success in competitive mass market software development depends on the quality of software development and the market segments targeted. Market segments are categorized by uncertainties contributed by “Newness” and “turbulences”, making software success stochastic in nature. Selecting good market segments and delivering high-quality software versions in less time than competitors result in increasing demand in markets and, ultimately, revenues. An enhanced customer base is beneficial for the current product as well as for the industry's future products, in the form of increased reputation and increased involvement of customers in future development. The case study was conducted with 13 representatives drawing on experiences from 14 mass market projects. According to the surveyed marketing departments, results indicate that software solutions are delivered either to a few investors or into highly competitive markets. The software organizations are reluctant to deliver relatively complex solutions in new markets unless and until they are strongly convinced of the probable success. A method for selecting market segments, belonging to new and existing markets, for undertaking software delivery is also proposed in this paper. The model will help the software industry decide on the market segments and high-level abstract features that could increase the probability of software success. Poor selection of markets, or targeting markets of “improper” size, affects the market share of the industry to a great extent.

  19. Mapping modern software process engineering techniques onto an HEP development environment

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504, and Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R and D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  20. Mapping modern software process engineering techniques onto an HEP development environment

    Science.gov (United States)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504, and Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  1. Software to model AXAF-I image quality

    Science.gov (United States)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.
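
    One elementary step such a grazing-incidence ray tracer performs is specular reflection of a ray direction about a local surface normal; the sketch below shows that textbook computation in Python and is not code from GRAZTRACE.

```python
# Minimal sketch of one step a grazing-incidence ray tracer performs: specular
# reflection of a ray direction about a local surface normal. This is textbook
# geometry, not code from GRAZTRACE.
import numpy as np

def reflect(direction, normal):
    """Return the specularly reflected unit direction: d' = d - 2 (d . n) n."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    return d - 2.0 * np.dot(d, n) * n

# A ray arriving at ~1 degree grazing incidence on a surface with normal +y.
incoming = np.array([np.cos(np.radians(1.0)), -np.sin(np.radians(1.0)), 0.0])
print(reflect(incoming, np.array([0.0, 1.0, 0.0])))
```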

  2. A user-friendly LabVIEW software platform for grating based X-ray phase-contrast imaging.

    Science.gov (United States)

    Wang, Shenghao; Han, Huajie; Gao, Kun; Wang, Zhili; Zhang, Can; Yang, Meng; Wu, Zhao; Wu, Ziyu

    2015-01-01

    X-ray phase-contrast imaging can provide greatly improved contrast over conventional absorption-based imaging for weakly absorbing samples, such as biological soft tissues and fibre composites. In this study, we introduced an easy and fast way to develop a user-friendly software platform dedicated to the new grating-based X-ray phase-contrast imaging setup at the National Synchrotron Radiation Laboratory of the University of Science and Technology of China. The control of 21 motorized stages, of a piezoelectric stage, and of an X-ray tube is achieved with this software; it also covers image acquisition with a flat panel detector for automatic phase-stepping scans. Moreover, a data post-processing module for signal retrieval and other custom features are in principle available. With a seamless integration of all the necessary functions in one software package, this platform greatly facilitates users' activities during experimental runs with this grating-based X-ray phase-contrast imaging setup.
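
    A common way to implement the signal-retrieval step of a phase-stepping scan (not necessarily the algorithm used in this LabVIEW platform) is a pixel-wise Fourier analysis over the phase steps, as sketched below: the zeroth component gives transmission, the phase of the first component gives the differential phase signal, and the visibility ratio gives the dark-field signal.

```python
# One common way to retrieve signals from a grating phase-stepping scan (not
# necessarily the algorithm used in the cited LabVIEW platform): take the FFT
# of the intensity oscillation at each pixel over the phase steps.
import numpy as np

def retrieve(sample_stack, reference_stack):
    """stacks: arrays of shape (n_steps, ny, nx) from sample and flat-field scans."""
    fs = np.fft.fft(sample_stack, axis=0)
    fr = np.fft.fft(reference_stack, axis=0)
    transmission = np.abs(fs[0]) / np.abs(fr[0])
    dpc = np.angle(fs[1]) - np.angle(fr[1])            # differential phase
    visibility_s = np.abs(fs[1]) / np.abs(fs[0])
    visibility_r = np.abs(fr[1]) / np.abs(fr[0])
    dark_field = visibility_s / visibility_r
    return transmission, dpc, dark_field
```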

  3. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence keeps growing, the cancer registry is of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the literature and two functional software packages in this field. The evaluation tool was a checklist and, in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for the cancer registry software was presented. The evaluation model of this study contains a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of the cancer registry software along with their sub-criteria. The evaluation method of this study was chosen as a criteria-based evaluation method based on the findings. The model of this study encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and the specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate the cancer registry software.

  4. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  5. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  6. The impact of using computer decision-support software in primary care nurse-led telephone triage: interactional dilemmas and conversational consequences.

    Science.gov (United States)

    Murdoch, Jamie; Barnes, Rebecca; Pooler, Jillian; Lattimer, Valerie; Fletcher, Emily; Campbell, John L

    2015-02-01

    Telephone triage represents one strategy to manage demand for face-to-face GP appointments in primary care. Although computer decision-support software (CDSS) is increasingly used by nurses to triage patients, little is understood about how interaction is organized in this setting. Specifically any interactional dilemmas this computer-mediated setting invokes; and how these may be consequential for communication with patients. Using conversation analytic methods we undertook a multi-modal analysis of 22 audio-recorded telephone triage nurse-caller interactions from one GP practice in England, including 10 video-recordings of nurses' use of CDSS during triage. We draw on Goffman's theoretical notion of participation frameworks to make sense of these interactions, presenting 'telling cases' of interactional dilemmas nurses faced in meeting patient's needs and accurately documenting the patient's condition within the CDSS. Our findings highlight troubles in the 'interactional workability' of telephone triage exposing difficulties faced in aligning the proximal and wider distal context that structures CDSS-mediated interactions. Patients present with diverse symptoms, understanding of triage consultations, and communication skills which nurses need to negotiate turn-by-turn with CDSS requirements. Nurses therefore need to have sophisticated communication, technological and clinical skills to ensure patients' presenting problems are accurately captured within the CDSS to determine safe triage outcomes. Dilemmas around how nurses manage and record information, and the issues of professional accountability that may ensue, raise questions about the impact of CDSS and its use in supporting nurses to deliver safe and effective patient care. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  8. The Blue Dog: evaluation of an interactive software program to teach young children how to interact safely with dogs.

    Science.gov (United States)

    Schwebel, David C; Morrongiello, Barbara A; Davis, Aaron L; Stewart, Julia; Bell, Melissa

    2012-04-01

    Pre-post-randomized design evaluated The Blue Dog, a dog safety software program. 76 children aged 3.5-6 years completed 3 tasks to evaluate dog safety pre- and postintervention: (a) pictures (recognition of safe/risky behavior), (b) dollhouse (recall of safe behavior via simulated dollhouse scenarios), and (c) live dog (actual behavior with unfamiliar live dog). Following preintervention evaluation, children were randomly assigned to dog or fire safety conditions, each involving 3 weeks of home computer software use. Children using Blue Dog had greater change in recognition of risky dog situations than children learning fire safety. No between-group differences emerged in recall (dollhouse) or engagement (live-dog) in risky behavior. Families enjoyed using the software. Blue Dog taught children knowledge about safe engagement with dogs, but did not influence recall or implementation of safe behaviors. Dog bites represent a significant pediatric injury concern and continued development of effective interventions is needed.

  9. Active learning in Operations Management: interactive multimedia software for teaching JIT/Lean Production

    Directory of Open Access Journals (Sweden)

    Carmen Medina-López

    2011-04-01

    Full Text Available Purpose: Information & Communication Technologies (ICT) can be a fundamental aid for the design of new teaching methods that better adapt to the framework of the European Higher Education Area. In this context, this study aims to develop and assess a complex and truly interactive ICT-based teaching tool for instruction in OM. Design/methodology/approach: A multimedia application for Just-in-Time (JIT) / Lean Production has been conceived, designed and assessed. A constructivist focus was followed in its conception and design to encourage active and flexible learning adapted to each individual’s own requirements. Using empirical research the tool has been assessed by students and compared to the traditional teaching methods. Findings: The interactive multimedia application has been clearly valued for the way it conveys information and for its usability, for the way the application is structured and the improvements to students’ understanding of the knowledge. Students are also in favour of ICT being incorporated into teaching over more traditional methods. The assessment took students’ gender and the average overall mark on their academic records as control variables but, broadly-speaking, no significant differences were found. Research limitations/implications: The study was carried out in a controlled environment and not in the normal on-site university teaching process. Conclusions could be extended to OM and other related subjects, especially if they make use of similar tools to the one described in this paper. Practical implications: This study provides a contribution that allows reflections to be made on the design of specific software for OM and students’ perceptions when using it. Originality/value: Through this paper we contribute to an improvement in learning methods in general and to higher education in OM in particular.

  10. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed a software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis core photos and images; waveforms and NMR; and external files documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed a software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes an emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  11. Real-time Control Mediation in Agile Distributed Software Development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Aaen, Ivan; Mathiassen, Lars

    2008-01-01

    Agile distributed environments pose particular challenges related to control of quality and collaboration in software development. Moreover, while face-to-face interaction is fundamental in agile development, distributed environments must rely extensively on mediated interactions. On this backdrop... control was mediated over distance by technology through real-time exchanges. Contrary to previous research, the analysis suggests that both formal and informal elements of real-time mediated control were used; that evolving goals and adjustment of expectations were two of the main issues in real-time mediated control exchanges; and, that the actors, despite distances in space and culture, developed a clan-like pattern mediated by technology to help control quality and collaboration in software development....

  12. Information Engineering in Autonomous Robot Software

    NARCIS (Netherlands)

    Ziafati, P.

    2015-01-01

    In order to engage and help in our daily life, autonomous robots are to operate in dynamic and unstructured environments and interact with people. As the robot's environment and its behaviour are getting more complex, so are the robot's software and the knowledge that the robot needs to carry out

  13. RAGE Reusable Game Software Components and Their Integration into Serious Game Engines

    NARCIS (Netherlands)

    Van der Vegt, Wim; Nyamsuren, Enkhbold; Westera, Wim

    2016-01-01

    This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software

  14. A Review of Predictive Software for the Design of Community Microgrids

    Directory of Open Access Journals (Sweden)

    Mina Rahimian

    2018-01-01

    Full Text Available This paper discusses adding a spatial dimension to the design of community microgrid projects in the interest of expanding the existing discourse related to energy performance optimization measures. A multidimensional vision for designing community microgrids with higher energy performance is considered, leveraging urban form (superstructure) to understand how it impacts the performance of the system’s distributed energy resources and loads (infrastructure). This vision engages the design sector in the technical conversation of developing community microgrids, leading to energy efficient designs of microgrid-connected communities well before their construction. A new generation of computational modeling and simulation tools that address this interaction is required. In order to position the research, this paper presents a survey of existing software packages, belonging to two distinct categories of modeling, simulation, and evaluation of community microgrids: the energy infrastructure modeling and the urban superstructure energy modeling. Results of this software survey identify a lack of software tools and simulation packages that simultaneously address the necessary interaction between the superstructure and infrastructure of community microgrids, given the importance of its study. Conclusions represent how a proposed experimental software prototype may fill an existing gap in current related software packages.

  15. Advanced human machine interaction for an image interpretation workstation

    Science.gov (United States)

    Maier, S.; Martin, M.; van de Camp, F.; Peinsipp-Byma, E.; Beyerer, J.

    2016-05-01

    In recent years, many new interaction technologies have been developed that enhance the usability of computer systems and allow for novel types of interaction. The areas of application for these technologies have mostly been in gaming and entertainment. However, in professional environments, there are especially demanding tasks that would greatly benefit from improved human-machine interfaces as well as an overall improved user experience. We, therefore, envisioned and built an image interpretation workstation of the future, a multi-monitor workplace comprised of four screens. Each screen is dedicated to a complex software product such as a geo-information system to provide geographic context, an image annotation tool, software to generate standardized reports and a tool to aid in the identification of objects. Using self-developed systems for hand tracking, pointing gestures and head pose estimation in addition to touchscreens, face identification, and speech recognition systems, we created a novel approach to this complex task. For example, head pose information is used to save the position of the mouse cursor on the currently focused screen and to restore it as soon as the same screen is focused again, while hand gestures allow for intuitive manipulation of 3D objects in mid-air. While the primary focus is on the task of image interpretation, all of the technologies involved provide generic ways of efficiently interacting with a multi-screen setup and could be utilized in other fields as well. In preliminary experiments, we received promising feedback from users in the military and started to tailor the functionality to their needs.
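
    The per-screen cursor save-and-restore behaviour described above can be sketched as follows; head tracking and the window system are abstracted away as callbacks, and none of the names correspond to the actual workstation software.

```python
# Hedged sketch of the described behaviour: remember the mouse position per
# screen and restore it when head-pose estimation reports that screen is in
# focus again. Head tracking and the window system are abstracted as
# callbacks; none of this is the actual workstation code.
class CursorMemory:
    def __init__(self, get_cursor, set_cursor):
        self.get_cursor = get_cursor    # () -> (x, y) in global coordinates
        self.set_cursor = set_cursor    # (x, y) -> None
        self.saved = {}                 # screen id -> last cursor position
        self.current = None

    def on_focus_change(self, screen_id):
        if screen_id == self.current:
            return
        if self.current is not None:
            self.saved[self.current] = self.get_cursor()   # leaving a screen
        if screen_id in self.saved:
            self.set_cursor(*self.saved[screen_id])         # returning to one
        self.current = screen_id
```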

  16. Uso de software libre en las empresas colombianas

    Directory of Open Access Journals (Sweden)

    Carlos Nelson Henríquez Miranda

    2009-01-01

    ... great potential in their technology department. With free software, products can be acquired at very low cost and, given sufficient knowledge, put to good use without spending a single peso. This article aims to show the importance of adopting free software in small and medium-sized Colombian companies that, out of ignorance or fear, keep working with and making large investments in commercial software. Keywords: free software, GNU, commercial software. Abstract: Currently, companies use technology to support their business processes, but locally we can see that many firms are unaware of the existence of tools (software) that could help them in their business tasks. Some entrepreneurs, through ignorance, acquire proprietary software applications that are very expensive and of which they use only a very small percentage of the true capabilities. They thus make a great investment that cannot be recovered, because they depend heavily on the advice and consultancy of the companies marketing the tool. Also, if companies grow and add new workstations, they need to buy more software, because this type of product is typically sold under proprietary license terms. We therefore find two situations: companies that do not invest in software because it is very expensive, and companies that do invest, sometimes heavily, but do not really capitalize on the investment out of ignorance or because of bad advice. In this situation, new ways have to be found to ensure that small and medium enterprises obtain all the benefits that the use of computers entails (speed, efficiency, organization, etc.). One option is the adoption of programs that are free or relatively inexpensive; these programs are framed within the movement known as “Free Software”. This offers an alternative: a good-quality product at low cost that allows the partial or total systematization of companies, in addition to using office software for

  17. Advanced Visualization Software System for Nuclear Power Plant Inspection

    International Nuclear Information System (INIS)

    Kukic, I.; Jambresic, D.; Reskovic, S.

    2006-01-01

    Visualization techniques have been widely used in industrial environment for enhancing process control. Traditional techniques of visualization are based on control panels with switches and lights, and 2D graphic representations of processes. However, modern visualization systems enable significant new opportunities in creating 3D virtual environments. These opportunities arise from the availability of high end graphics capabilities in low cost personal computers. In this paper we describe implementation of process visualization software, developed by INETEC. This software is used to visualize testing equipment, components being tested and the overall power plant inspection process. It improves security of the process due to its real-time visualization and collision detection capabilities, and therefore greatly enhances the inspection process. (author)

  18. A comprehensive software suite for protein family construction and functional site prediction.

    Directory of Open Access Journals (Sweden)

    David Renfrew Haft

    Full Text Available In functionally diverse protein families, conservation in short signature regions may outperform full-length sequence comparisons for identifying proteins that belong to a subgroup within which one specific aspect of their function is conserved. The SIMBAL workflow (Sites Inferred by Metabolic Background Assertion Labeling) is a data-mining procedure for finding such signature regions. It begins by using clues from genomic context, such as co-occurrence or conserved gene neighborhoods, to build a useful training set from a large number of uncharacterized but mutually homologous proteins. When training set construction is successful, the YES partition is enriched in proteins that share function with the user's query sequence, while the NO partition is depleted. A selected query sequence is then mined for short signature regions whose closest matches overwhelmingly favor proteins from the YES partition. High-scoring signature regions typically contain key residues critical to functional specificity, so proteins with the highest sequence similarity across these regions tend to share the same function. The SIMBAL algorithm was described previously, but significant manual effort, expertise, and a supporting software infrastructure were required to prepare the requisite training sets. Here, we describe a new, distributable software suite that speeds up and simplifies the process for using SIMBAL, most notably by providing tools that automate training set construction. These tools have broad utility for comparative genomics, allowing for flexible collection of proteins or protein domains based on genomic context as well as homology, a capability that can greatly assist in protein family construction. Armed with this new software suite, SIMBAL can serve as a fast and powerful in silico alternative to direct experimentation for characterizing proteins and their functional interactions.
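
    The scoring idea can be sketched as follows: slide a window along the query and score it by how strongly its closest matches favour the YES partition over the NO partition. The similarity measure below is a toy shared-k-mer count standing in for a real sequence search, and the function is a conceptual illustration rather than the SIMBAL implementation.

```python
# Conceptual sketch of SIMBAL-style scoring: slide a window along the query
# sequence and score it by how strongly its closest matches favour the YES
# partition over the NO partition. The similarity measure here is a toy
# (shared k-mer count) standing in for a real sequence search.
from math import log

def kmer_similarity(a, b, k=3):
    return len({a[i:i+k] for i in range(len(a) - k + 1)} &
               {b[i:i+k] for i in range(len(b) - k + 1)})

def window_scores(query, yes_set, no_set, window=20, top_n=10, pseudo=1.0):
    scores = []
    for start in range(len(query) - window + 1):
        w = query[start:start + window]
        ranked = sorted(((kmer_similarity(w, s), label)
                         for label, seqs in (("YES", yes_set), ("NO", no_set))
                         for s in seqs), reverse=True)[:top_n]
        yes = sum(1 for _, label in ranked if label == "YES")
        no = top_n - yes
        scores.append((start, log((yes + pseudo) / (no + pseudo))))
    return scores
```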

  19. Implementation of highly parallel and large scale GW calculations within the OpenAtom software

    Science.gov (United States)

    Ismail-Beigi, Sohrab

    The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana-Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.

  20. Roadmap for Peridynamic Software Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Littlewood, David John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-10-01

    The application of peridynamics for engineering analysis requires an efficient and robust software implementation. Key elements include processing of the discretization, the proximity search for identification of pairwise interactions, evaluation of the constitutive model, application of a bond-damage law, and contact modeling. Additional requirements may arise from the choice of time integration scheme, for example estimation of the maximum stable time step for explicit schemes, and construction of the tangent stiffness matrix for many implicit approaches. This report summarizes progress to date on the software implementation of the peridynamic theory of solid mechanics. Discussion is focused on parallel implementation of the meshfree discretization scheme of Silling and Askari [33] in three dimensions, although much of the discussion applies to computational peridynamics in general.
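
    The proximity-search element, building the list of pairwise bonds between material points that lie within a horizon of one another, can be illustrated serially with a k-d tree; the parallel, production implementation described in the report is of course far more involved.

```python
# A serial illustration of the proximity-search step: build the list of
# pairwise peridynamic bonds between material points that lie within a given
# horizon of one another, using a k-d tree.
import numpy as np
from scipy.spatial import cKDTree

def build_bonds(points, horizon):
    """points: (n, 3) array of material-point coordinates."""
    tree = cKDTree(points)
    pairs = tree.query_pairs(r=horizon)          # set of (i, j) with i < j
    return sorted(pairs)

# Toy discretization: a small cubic lattice with spacing 1.0 and horizon 3.015.
grid = np.array([[x, y, z] for x in range(4) for y in range(4) for z in range(4)],
                dtype=float)
bonds = build_bonds(grid, horizon=3.015)
print(len(bonds), "bonds")
```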

  1. A Component-Oriented Programming for Embedded Mobile Robot Software

    Directory of Open Access Journals (Sweden)

    Safaai Deris

    2008-11-01

    Full Text Available Applying software reuse to many Embedded Real-Time (ERT) systems poses significant challenges to industrial software processes due to the resource-constrained and real-time requirements of the systems. Autonomous Mobile Robot (AMR) system is a class of ERT systems, hence, inherits the challenge of applying software reuse in general ERT systems. Furthermore, software reuse in AMR systems is challenged by the diversities in terms of robot physical size and shape, environmental interaction and implementation platform. Thus, it is envisioned that component-based software engineering will be the suitable way to promote software reuse in AMR systems with consideration to general requirements to be self-contained, platform-independent and real-time predictable. A framework for component-oriented programming for AMR software development using PECOS component model is proposed in this paper. The main features of this framework are: (1) use graphical representation for components definition and composition; (2) target C language for optimal code generation with resource-constrained micro-controller; and (3) minimal requirement for run-time support. Real-time implementation indicates that, the PECOS component model together with the proposed framework is suitable for resource constrained embedded AMR systems software development.
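
    The component-with-ports idea can be sketched as follows, in Python rather than the C the framework targets; the classes and the wiring scheme are invented for illustration and are not the PECOS component model's actual API.

```python
# Rough sketch of the component-with-ports idea: components expose named
# input/output ports, a composition wires ports together, and a simple
# scheduler executes each component in turn. The API here is invented for
# illustration and is not the PECOS component model itself.
class Component:
    def __init__(self, name):
        self.name = name
        self.inputs, self.outputs = {}, {}

    def execute(self):                      # overridden by concrete components
        raise NotImplementedError

class RangeSensor(Component):
    def execute(self):
        self.outputs["distance_cm"] = 42.0  # stand-in for a driver read

class ObstacleAvoider(Component):
    def execute(self):
        d = self.inputs.get("distance_cm", float("inf"))
        self.outputs["speed"] = 0.0 if d < 30.0 else 0.5

def run_cycle(components, wiring):
    for comp in components:
        comp.execute()
        for (src, port), (dst, dport) in wiring.items():
            if src is comp:
                dst.inputs[dport] = comp.outputs[port]

sensor, avoider = RangeSensor("sonar"), ObstacleAvoider("avoid")
run_cycle([sensor, avoider], {(sensor, "distance_cm"): (avoider, "distance_cm")})
print(avoider.outputs["speed"])
```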

  2. The Great Recession was not so Great

    NARCIS (Netherlands)

    van Ours, J.C.

    2015-01-01

    The Great Recession is characterized by a GDP-decline that was unprecedented in the past decades. This paper discusses the implications of the Great Recession analyzing labor market data from 20 OECD countries. Comparing the Great Recession with the 1980s recession it is concluded that there is a

  3. Reconfiguring film studies through software cinema and procedural spectatorship

    Directory of Open Access Journals (Sweden)

    Marina Hassapopoulou

    2014-12-01

    Full Text Available The increasing use of software and database aesthetics in film and video production has created hybrid modes of spectatorship by altering the dynamic between media production and reception. Software-generated narratives (preprogrammed databases that create films through random selection and combination of discrete audio, visual, and/or textual tracks) remove the viewer from the actual algorithmic process, drawing his/her attention instead on interactions between hardware and software. Here, the element of unpredictability that is part of cinematic pleasure lies in the recombination of discrete elements (audio, visuals, subtitles, and so on) and the unexpected ways in which the software stitches those elements together. The subsequent reduction in the degree and compass of authorial control invites us to reconsider existing frameworks of spectatorship and narration within new contexts of mobility, performance, and databases. In this article I consider Soft Cinema films (Lev Manovich, Andreas Kratky, et al., 2003) as prototypical software-driven examples of this shift in viewing conditions and reception contexts. I argue that, despite its emerging and changing techniques and aesthetics, software-generated cinema retains one of the primitive socio-pedagogical functions of the cinema: training audiences to receive and buffer contemporary medial sensations. Just as early cinema prepared audiences and worked as a buffer for shocks of technological and industrial modernity, software cinema trains the viewer in new modes of film spectatorship and new modes of narrative and affective subjectivity that correspond to the hypertextual ways in which we interact with digital technologies. These viewing modes create a new form of procedural spectatorship that has been evident since the first pioneering experiments in generative cinema and a form that is, nonetheless, not entirely detached from existing theoretical paradigms of cinematic spectatorship and the

  4. The Great Recession, unemployment and suicide.

    Science.gov (United States)

    Norström, Thor; Grönqvist, Hans

    2015-02-01

    How have suicide rates responded to the marked increase in unemployment spurred by the Great Recession? Our paper puts this issue into a wider perspective by assessing (1) whether the unemployment-suicide link is modified by the degree of unemployment protection, and (2) whether the effect on suicide of the present crisis differs from the effects of previous economic downturns. We analysed the unemployment-suicide link using time-series data for 30 countries spanning the period 1960-2012. Separate fixed-effects models were estimated for each of five welfare state regimes with different levels of unemployment protection (Eastern, Southern, Anglo-Saxon, Bismarckian and Scandinavian). We included an interaction term to capture the possible excess effect of unemployment during the Great Recession. The largest unemployment increases occurred in the welfare state regimes with the least generous unemployment protection. The unemployment effect on male suicides was statistically significant in all welfare regimes, except the Scandinavian one. The effect on female suicides was significant only in the eastern European country group. There was a significant gradient in the effects, being stronger the less generous the unemployment protection. The interaction term capturing the possible excess effect of unemployment during the financial crisis was not significant. Our findings suggest that the more generous the unemployment protection the weaker the detrimental impact on suicide of the increasing unemployment during the Great Recession.
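
    A schematic form of the kind of fixed-effects specification with a recession interaction term described above might look as follows (illustrative notation only; the published model's exact controls and functional form differ):

        \[
          \ln S_{it} = \alpha_i + \beta_1 U_{it} + \beta_2 \,(U_{it} \times R_t) + \gamma\, t + \varepsilon_{it}
        \]
        % S_it: suicide rate in country i and year t; U_it: unemployment rate;
        % R_t: indicator for the Great Recession years; alpha_i: country fixed effect;
        % beta_2 captures the possible excess effect of unemployment during the crisis.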

  5. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high-quality software than limiting code size. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  6. Infrastructure for genomic interactions: Bioconductor classes for Hi-C, ChIA-PET and related experiments [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Aaron T. L. Lun

    2016-05-01

    Full Text Available The study of genomic interactions has been greatly facilitated by techniques such as chromatin conformation capture with high-throughput sequencing (Hi-C). These genome-wide experiments generate large amounts of data that require careful analysis to obtain useful biological conclusions. However, development of the appropriate software tools is hindered by the lack of basic infrastructure to represent and manipulate genomic interaction data. Here, we present the InteractionSet package that provides classes to represent genomic interactions and store their associated experimental data, along with the methods required for low-level manipulation and processing of those classes. The InteractionSet package exploits existing infrastructure in the open-source Bioconductor project, while in turn being used by Bioconductor packages designed for higher-level analyses. For new packages, use of the functionality in InteractionSet will simplify development, allow access to more features and improve interoperability between packages.
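
    The InteractionSet classes themselves are R/Bioconductor code; the following Python sketch is only a conceptual illustration (with invented data) of the kind of layout such containers use, i.e. storing each interaction as a pair of indices into a shared table of genomic regions rather than duplicating anchor coordinates.

        # Conceptual illustration of an interaction container; invented data,
        # not the Bioconductor API.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Region:
            chrom: str
            start: int
            end: int

        regions = [
            Region("chr1", 100_000, 105_000),
            Region("chr1", 500_000, 505_000),
            Region("chr2", 250_000, 255_000),
        ]

        # Each interaction: (index of anchor 1, index of anchor 2, read count)
        interactions = [(0, 1, 57), (0, 2, 12)]

        def span(i, j):
            """Linear distance between anchors, or None for inter-chromosomal pairs."""
            a, b = regions[i], regions[j]
            return abs(a.start - b.start) if a.chrom == b.chrom else None

        for i, j, count in interactions:
            print(regions[i], regions[j], count, span(i, j))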

  7. Infrastructure for genomic interactions: Bioconductor classes for Hi-C, ChIA-PET and related experiments [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Aaron T. L. Lun

    2016-06-01

    Full Text Available The study of genomic interactions has been greatly facilitated by techniques such as chromatin conformation capture with high-throughput sequencing (Hi-C). These genome-wide experiments generate large amounts of data that require careful analysis to obtain useful biological conclusions. However, development of the appropriate software tools is hindered by the lack of basic infrastructure to represent and manipulate genomic interaction data. Here, we present the InteractionSet package that provides classes to represent genomic interactions and store their associated experimental data, along with the methods required for low-level manipulation and processing of those classes. The InteractionSet package exploits existing infrastructure in the open-source Bioconductor project, while in turn being used by Bioconductor packages designed for higher-level analyses. For new packages, use of the functionality in InteractionSet will simplify development, allow access to more features and improve interoperability between packages.

  8. Monitoring Student Activity in Collaborative Software Development

    DEFF Research Database (Denmark)

    Dietsch, Daniel; Podelski, Andreas; Nam, Jaechang

    2013-01-01

    This paper presents data analysis from a course on Software Engineering in an effort to identify metrics and techniques that would allow instructors to act proactively and identify patterns of low engagement and inefficient peer collaboration. Over the last two terms, 106 students in their second year of studies formed 20 groups and worked collaboratively to develop video games. Throughout the lab, students have to use a variety of tools for managing and developing their projects, such as software version control, static analysis tools, wikis, mailing lists, etc. The students are also supported by weekly meetings with teaching assistants and instructors regarding group progress, code quality, and management issues. Through these meetings and their interactions with the software tools, students leave a detailed trace of data related to their individual engagement and their collaboration behavior...
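
    As a rough illustration of the kind of trace analysis described (hypothetical data and metrics, not the course's actual pipeline), per-student engagement indicators can be derived from version-control history:

        # Hypothetical commit records: (author, date, lines changed), e.g. parsed
        # from a version-control log; the metrics below are illustrative only.
        from collections import Counter, defaultdict
        from datetime import date

        commits = [
            ("alice", date(2013, 3, 4), 120),
            ("alice", date(2013, 3, 11), 45),
            ("bob",   date(2013, 3, 4), 300),
            ("carol", date(2013, 4, 22), 10),
        ]

        commit_counts = Counter(author for author, _, _ in commits)
        active_weeks = defaultdict(set)
        for author, day, _ in commits:
            active_weeks[author].add(day.isocalendar()[1])   # ISO week number

        for author in commit_counts:
            print(author, commit_counts[author], "commits,",
                  len(active_weeks[author]), "active weeks")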

  9. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  10. Interplay between usability evaluation and software development (I-USED 2009)

    DEFF Research Database (Denmark)

    Abrahão, S.; Hornbæk, Kasper Anders Søren; Law, E.

    2009-01-01

    This workshop is aimed at bringing together researchers and practitioners from the Human-Computer Interaction (HCI) and Software Engineering (SE) fields to determine the state of the art in the interplay between usability evaluation and software development and to generate ideas for new and improved relations between these activities. The aim is to base the determination of the current state on empirical studies. Presentations of new ideas on how to improve the interplay between HCI and SE in the design of usable software systems should also be based on empirical studies.

  11. Description of research interests and current work related to automating software design

    Science.gov (United States)

    Kaindl, Hermann

    1992-01-01

    Enclosed is a list of selected and recent publications. Most of these publications concern applied research in the areas of software engineering and human-computer interaction. It is felt that domain-specific knowledge plays a major role in software development. Additionally, it is believed that improvements in the general software development process (e.g., object-oriented approaches) will have to be combined with the use of large domain-specific knowledge bases.

  12. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules more quickly, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
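
    A hedged sketch of the general pattern described above (launch a test procedure, keep only the log lines that matter, and write a short summary) is shown below. The commands, file names and log markers are hypothetical and are not the actual SMAP testbed interfaces.

        import re
        import subprocess

        def run_test(command, log_path):
            # launch one test procedure and capture its output into a log file
            with open(log_path, "w") as log:
                result = subprocess.run(command, stdout=log, stderr=subprocess.STDOUT)
            return result.returncode

        def distill(log_path, patterns=(r"\bERROR\b", r"\bFAIL\b", r"REQ-\d+")):
            # keep only the lines relevant for validation documentation
            keep = re.compile("|".join(patterns))
            with open(log_path) as log:
                return [line.rstrip() for line in log if keep.search(line)]

        if __name__ == "__main__":
            status = run_test(["echo", "REQ-101 verified"], "test.log")  # placeholder command
            with open("summary.txt", "w") as out:
                out.write(f"exit status: {status}\n")
                out.write("\n".join(distill("test.log")) + "\n")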

  13. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality in SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to SMART MMIS software in terms of software testing organization, documentation, procedures, and methods. The software testing methods are classified into source code static analysis and dynamic testing. The software dynamic testing methods are discussed from two aspects: white-box and black-box testing. As the software testing concept introduced in this paper is applied to the SMART MMIS software, high-quality software will be produced. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied
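
    The distinction between the two dynamic-testing styles mentioned above can be illustrated with a toy example (the function under test and its threshold are invented for illustration and are not part of SMART MMIS):

        def trip_signal(pressure, threshold=150.0):
            """Toy function under test: signal a trip when pressure reaches the threshold."""
            if pressure >= threshold:
                return True
            return False

        # Black-box tests: derived from the specification only (inputs vs. expected outputs).
        assert trip_signal(200.0) is True
        assert trip_signal(100.0) is False

        # White-box tests: chosen to exercise each branch and the decision boundary
        # visible in the implementation.
        assert trip_signal(150.0) is True      # boundary of the ">=" branch
        assert trip_signal(149.999) is False   # just below the threshold

        print("all tests passed")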

  14. A Reference Model for Software and System Inspections. White Paper

    Science.gov (United States)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) System and software development processes interact with each other at different phases throughout the development life cycle. (2) Reviews are emphasized in both system and software development; for some reviews (e.g. SRR, PDR, CDR), there are both system versions and software versions. (3) Analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them. (4) Reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  15. Bega - Android-Based Beergame Simulation Software for Interactive Training and Innovation

    Science.gov (United States)

    Lestyánszka Škůrková, Katarína; Szander, Norina

    2013-12-01

    The supply chain management challenges and inventory holding problems can easily be demonstrated by the widely known BeerGame simulation. In the Szabó-Szoba R&D Laboratory, we developed an Android-based software application for tablets and smart phones with the purpose of having an adaptable, entertaining and effective program that can provide the participants with a real-life experience of the nature of the bullwhip effect. Having an appropriate and comprehensive performance measurement system with the critical parameters and KPIs is essential for finding the right solutions; we used the four perspectives of the Balanced Scorecard method. The innovative force of our research is based on the trainings: the discussion of outcomes and the team learning. The purpose of the current development is to build a new feature into the software: an artificial client that can substitute for one or more players in the supply chain and makes decisions by using genetic algorithms.
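
    As a rough illustration of how a genetic algorithm can drive such an artificial player (a simplified sketch with invented parameters, not the Bega implementation), one can evolve a single base-stock ordering level that minimizes holding and backlog costs over a fixed demand series:

        import random

        random.seed(1)
        DEMAND = [4, 4, 4, 8, 8, 8, 8, 8, 8, 8, 8, 8]   # step demand, BeerGame style
        HOLD_COST, BACKLOG_COST = 0.5, 1.0

        def cost(base_stock):
            # simulate one supply-chain role with an order-up-to policy
            inventory, total = 12, 0.0
            for d in DEMAND:
                order = max(0, base_stock - inventory)
                inventory += order - d
                total += HOLD_COST * max(inventory, 0) + BACKLOG_COST * max(-inventory, 0)
            return total

        def evolve(pop_size=20, generations=30, mutation=0.2):
            population = [random.randint(0, 40) for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=cost)                  # lower cost = fitter
                parents = population[: pop_size // 2]
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    child = (a + b) // 2                   # simple arithmetic crossover
                    if random.random() < mutation:
                        child = max(0, child + random.randint(-3, 3))
                    children.append(child)
                population = parents + children
            return min(population, key=cost)

        best = evolve()
        print("best base-stock level:", best, "cost:", cost(best))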

  16. Experience Report: Introducing Kanban Into Automotive Software Project

    Directory of Open Access Journals (Sweden)

    Marek Majchrzak

    2017-03-01

    Full Text Available The boundaries between traditional and agile approaches are disappearing. A significant number of software projects require a continuous implementation of tasks without dividing them into sprints or strict project phases. Customers expect more flexibility and responsiveness from software vendors in response to the ever-changing business environment. To achieve better results in this field, Capgemini has begun using the Lean philosophy and Kanban techniques. The following article illustrates examples of different uses of Kanban and the main stakeholders of the process. The article presents the main advantages of transparency and ways to improve customer co-operation as well as stakeholder relationships. The authors try to visualise all of the elements in the context of the project. There is also a discussion of different approaches in two software projects. The article focuses on the main challenges and the evolutionary approach used. An attempt is made to answer the question of how to convince both the team and the customer, and how to optimise ways to achieve great results.

  17. EQ3/6 software test and verification report 9/94

    International Nuclear Information System (INIS)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ''V and V'' report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three phases depended upon solving an n x n system of equations by the Newton-Raphson method. Thus, a great deal of emphasis in the test and verification of this procedure was placed on the first code in the software package, EQPT.
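
    A small sketch of the multivariate Newton-Raphson iteration that the report identifies as the common computational core is given below; the toy 2x2 system is for illustration only and is not the actual geochemical speciation-solubility equations.

        import numpy as np

        def newton(f, jac, x0, tol=1e-10, max_iter=50):
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                delta = np.linalg.solve(jac(x), -f(x))   # solve J(x) * delta = -f(x)
                x = x + delta
                if np.linalg.norm(delta) < tol:
                    return x
            raise RuntimeError("Newton-Raphson did not converge")

        # Toy system: x^2 + y^2 = 4 and x * y = 1
        f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0])
        jac = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [v[1], v[0]]])

        print(newton(f, jac, x0=[2.0, 0.5]))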

  18. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ``V and V`` report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three phases depended upon solving an n x n system of equations by the Newton-Raphson method. Thus, a great deal of emphasis in the test and verification of this procedure was placed on the first code in the software package, EQPT.

  19. Reproducible Bioconductor workflows using browser-based interactive notebooks and containers.

    Science.gov (United States)

    Almugbel, Reem; Hung, Ling-Hong; Hu, Jiaming; Almutairy, Abeer; Ortogero, Nicole; Tamta, Yashaswi; Yeung, Ka Yee

    2018-01-01

    Bioinformatics publications typically include complex software workflows that are difficult to describe in a manuscript. We describe and demonstrate the use of interactive software notebooks to document and distribute bioinformatics research. We provide a user-friendly tool, BiocImageBuilder, that allows users to easily distribute their bioinformatics protocols through interactive notebooks uploaded to either a GitHub repository or a private server. We present four different interactive Jupyter notebooks using R and Bioconductor workflows to infer differential gene expression, analyze cross-platform datasets, process RNA-seq data and KinomeScan data. These interactive notebooks are available on GitHub. The analytical results can be viewed in a browser. Most importantly, the software contents can be executed and modified. This is accomplished using Binder, which runs the notebook inside software containers, thus avoiding the need to install any software and ensuring reproducibility. All the notebooks were produced using custom files generated by BiocImageBuilder. BiocImageBuilder facilitates the publication of workflows with a point-and-click user interface. We demonstrate that interactive notebooks can be used to disseminate a wide range of bioinformatics analyses. The use of software containers to mirror the original software environment ensures reproducibility of results. Parameters and code can be dynamically modified, allowing for robust verification of published results and encouraging rapid adoption of new methods. Given the increasing complexity of bioinformatics workflows, we anticipate that these interactive software notebooks will become as necessary for documenting software methods as traditional laboratory notebooks have been for documenting bench protocols, and as ubiquitous.

  20. A software system for oilfield facility investment minimization

    International Nuclear Information System (INIS)

    Ding, Z.X.; Startzman, R.A.

    1996-01-01

    Minimizing investment in oilfield development is an important subject that has attracted a considerable amount of industry attention. One method to reduce investment involves the optimal placement and selection of production facilities. Because of the large amount of capital used in this process, saving a small percent of the total investment may represent a large monetary value. The literature reports algorithms using mathematical programming techniques that were designed to solve the proposed problem in a global optimal manner. Owing to the high-computational complexity and the lack of user-friendly interfaces for data entry and results display, mathematical programming techniques have not been given enough attention in practice. This paper describes an interactive, graphical software system that provides a global optimal solution to the problem of placement and selection of production facilities in oil-field development processes. This software system can be used as an investment minimization tool and a scenario-study simulator. The developed software system consists of five basic modules: (1) an interactive data-input unit, (2) a cost function generator, (3) an optimization unit, (4) a graphic-output display, and (5) a sensitivity-analysis unit
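
    For a small instance, the facility placement-and-selection trade-off described above can be illustrated by exhaustively searching which candidate sites to open (invented costs; the paper's own optimization unit is based on mathematical programming rather than this brute-force sketch):

        from itertools import combinations

        sites = {"A": 50.0, "B": 40.0, "C": 65.0}          # candidate site -> fixed cost
        connect = {                                         # well -> {site: connection cost}
            "w1": {"A": 5, "B": 12, "C": 9},
            "w2": {"A": 11, "B": 4, "C": 10},
            "w3": {"A": 7, "B": 8, "C": 3},
        }

        def total_cost(open_sites):
            # fixed cost of opened facilities plus cheapest connection for each well
            fixed = sum(sites[s] for s in open_sites)
            routing = sum(min(connect[w][s] for s in open_sites) for w in connect)
            return fixed + routing

        best = None
        for k in range(1, len(sites) + 1):
            for subset in combinations(sites, k):
                c = total_cost(subset)
                if best is None or c < best[0]:
                    best = (c, subset)

        print("minimum cost:", best[0], "open sites:", best[1])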

  1. Use of Writing with Symbols 2000 Software to Facilitate Emergent Literacy Development

    Science.gov (United States)

    Parette, Howard P.; Boeckmann, Nichole M.; Hourcade, Jack J.

    2008-01-01

    This paper outlines the use of the "Writing with Symbols 2000" software to facilitate emergent literacy development. The program's use of pictures incorporated with text has great potential to help young children with and without disabilities acquire fundamental literacy concepts about print, phonemic awareness, alphabetic principle, vocabulary…

  2. A flexible object-based software framework for modeling complex systems with interacting natural and societal processes.

    Energy Technology Data Exchange (ETDEWEB)

    Christiansen, J. H.

    2000-06-15

    The Dynamic Information Architecture System (DIAS) is a flexible, extensible, object-based framework for developing and maintaining complex multidisciplinary simulations. The DIAS infrastructure makes it feasible to build and manipulate complex simulation scenarios in which many thousands of objects can interact via dozens to hundreds of concurrent dynamic processes. The flexibility and extensibility of the DIAS software infrastructure stem mainly from (1) the abstraction of object behaviors, (2) the encapsulation and formalization of model functionality, and (3) the mutability of domain object contents. DIAS simulation objects are inherently capable of highly flexible and heterogeneous spatial realizations. Geospatial graphical representation of DIAS simulation objects is addressed via the GeoViewer, an object-based GIS toolkit application developed at ANL. DIAS simulation capabilities have been extended by inclusion of societal process models generated by the Framework for Addressing Cooperative Extended Transactions (FACET), another object-based framework developed at Argonne National Laboratory. By using FACET models to implement societal behaviors of individuals and organizations within larger DIAS-based natural systems simulations, it has become possible to conveniently address a broad range of issues involving interaction and feedback among natural and societal processes. Example DIAS application areas discussed in this paper include a dynamic virtual oceanic environment, detailed simulation of clinical, physiological, and logistical aspects of health care delivery, and studies of agricultural sustainability of urban centers under environmental stress in ancient Mesopotamia.

  3. An Interactive and Comprehensive Working Environment for High-Energy Physics Software with Python and Jupyter Notebooks

    Science.gov (United States)

    Braun, N.; Hauth, T.; Pulvermacher, C.; Ritter, M.

    2017-10-01

    Today’s analyses for high-energy physics (HEP) experiments involve processing a large amount of data with highly specialized algorithms. The contemporary workflow from recorded data to final results is based on the execution of small scripts - often written in Python or ROOT macros which call complex compiled algorithms in the background - to perform fitting procedures and generate plots. During recent years interactive programming environments, such as Jupyter, became popular. Jupyter allows developers to write Python-based applications, so-called notebooks, which bundle code, documentation and results, e.g. plots. Advantages over classical script-based approaches include the ability to recompute only parts of the analysis code, which allows for fast and iterative development, and a web-based user frontend, which can be hosted centrally and only requires a browser on the user side. In our novel approach, Python and Jupyter are tightly integrated into the Belle II Analysis Software Framework (basf2), currently being developed for the Belle II experiment in Japan. This makes it possible to develop code in Jupyter notebooks for every aspect of the event simulation, reconstruction and analysis chain. These interactive notebooks can be hosted as a centralized web service via jupyterhub with docker and used by all scientists of the Belle II Collaboration. Because of its generality and encapsulation, the setup can easily be scaled to large installations.

  4. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

    Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  5. Software Prototyping

    Science.gov (United States)

    Del Fiol, Guilherme; Hanseler, Haley; Crouch, Barbara Insley; Cummins, Mollie R.

    2016-01-01

    Background: Health information exchange (HIE) between Poison Control Centers (PCCs) and Emergency Departments (EDs) could improve care of poisoned patients. However, PCC information systems are not designed to facilitate HIE with EDs; therefore, we are developing specialized software to support HIE within the normal workflow of the PCC using user-centered design and rapid prototyping. Objective: To describe the design of an HIE dashboard and the refinement of user requirements through rapid prototyping. Methods: Using previously elicited user requirements, we designed low-fidelity sketches of designs on paper with iterative refinement. Next, we designed an interactive high-fidelity prototype and conducted scenario-based usability tests with end users. Users were asked to think aloud while accomplishing tasks related to a case vignette. After testing, the users provided feedback and evaluated the prototype using the System Usability Scale (SUS). Results: Survey results from three users provided useful feedback that was then incorporated into the design. After achieving a stable design, we used the prototype itself as the specification for development of the actual software. Benefits of prototyping included having (1) subject-matter experts heavily involved with the design; (2) flexibility to make rapid changes; (3) the ability to minimize software development efforts early in the design stage; (4) rapid finalization of requirements; (5) early visualization of designs; and (6) a powerful vehicle for communication of the design to the programmers. Challenges included (1) time and effort to develop the prototypes and case scenarios; (2) no simulation of system performance; (3) not having all proposed functionality available in the final product; and (4) missing needed data elements in the PCC information system. PMID:27081404

  6. A coverage and slicing dependencies analysis for seeking software security defects.

    Science.gov (United States)

    He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin

    2014-01-01

    Software security defects have a serious impact on software quality and reliability. Security flaws in a software system are a major hidden danger for its operation. As the scale of software increases, its vulnerabilities become much more difficult to find, and once these vulnerabilities are exploited they may lead to great losses. In this situation, the concept of Software Assurance has been put forward by some experts, and automated fault localization techniques are part of Software Assurance research. Currently, automated fault localization methods include coverage-based fault localization (CBFL) and program slicing. Both methods have their own localization advantages and shortcomings. In this paper, we put forward a new method, named the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure. On this basis, we finally propose a new automated fault localization method. This method not only remains fully automated but also changes the basic localization unit to a single statement, which makes localization more accurate. Through several experiments, we show that our method is more effective. Furthermore, we analyze the effectiveness of the existing methods on different faults.
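
    Coverage-based fault localization, one of the two methods the model builds on, can be sketched with a Tarantula-style suspiciousness score (the coverage data below are invented for illustration; the paper's Reverse Data Dependence Analysis Model itself is not shown):

        def suspiciousness(failed_cov, passed_cov, total_failed, total_passed):
            # higher score = statement executed relatively more often by failing tests
            if failed_cov == 0:
                return 0.0
            fail_ratio = failed_cov / total_failed
            pass_ratio = passed_cov / total_passed if total_passed else 0.0
            return fail_ratio / (fail_ratio + pass_ratio)

        # statement id -> (# failing tests covering it, # passing tests covering it)
        coverage = {"s1": (3, 5), "s2": (3, 0), "s3": (1, 4), "s4": (0, 5)}
        TOTAL_FAILED, TOTAL_PASSED = 3, 5

        ranking = sorted(coverage,
                         key=lambda s: suspiciousness(*coverage[s], TOTAL_FAILED, TOTAL_PASSED),
                         reverse=True)
        print(ranking)   # statements to inspect first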

  7. Customizing Standard Software as a Business Model in the IT Industry

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Rab, Sameen M.; Sinnet, Michael

    2011-01-01

    This research studies a new business model in the IT industry, the customization of standard software as the sole foundation for a software company’s earnings. Based on a theoretical background which combines the concepts of inter-organizational networks and open innovation, we provide an interpretive case study of a small software company which customizes a standard product. We investigate the company’s interactions with a large global software company which is the producer of the original software product and with other companies which are involved in the software customization process. We ... primarily on complex, formal partnerships, in which opportunistic behavior also occurs and where informal relations are invaluable sources of knowledge. In addition, the original software producer’s view and treatment of these companies has a vital impact on the customizing company’s practice which...

  8. Human Factors in Software Development Processes: Measuring System Quality

    DEFF Research Database (Denmark)

    Abrahão, Silvia; Baldassarre, Maria Teresa; Caivano, Danilo

    2016-01-01

    Software Engineering and Human-Computer Interaction look at the development process from different perspectives. They apparently use very different approaches, are inspired by different principles and address different needs. But they definitively have the same goal: develop high-quality software in the most effective way. The second edition of the workshop puts particular attention on the efforts of the two communities in enhancing system quality. The research question discussed is: who, what, where, when, why, and how should we evaluate?

  9. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  10. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  11. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    Science.gov (United States)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support system operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.
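
    The hybrid discrete/continuous idea can be illustrated with a toy loop in which a tank level evolves continuously between discrete reconfiguration events issued by a simple controller (invented dynamics and thresholds, not the CONFIG tool itself):

        def simulate(t_end=15.0, dt=0.1):
            level, inflow = 50.0, 2.0
            valve_open = False
            events = []
            t = 0.0
            while t < t_end:
                # continuous part: integrate the flow balance over one time step
                outflow = 5.0 if valve_open else 0.0
                level += (inflow - outflow) * dt
                # discrete part: reconfigure the system when thresholds are crossed
                if level > 55.0 and not valve_open:
                    valve_open = True
                    events.append((round(t, 1), "open drain valve"))
                elif level < 45.0 and valve_open:
                    valve_open = False
                    events.append((round(t, 1), "close drain valve"))
                t += dt
            return level, events

        final_level, events = simulate()
        print(final_level, events)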

  12. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  13. Software Engineering and Swarm-Based Systems

    Science.gov (United States)

    Hinchey, Michael G.; Sterritt, Roy; Pena, Joaquin; Rouff, Christopher A.

    2006-01-01

    We discuss two software engineering aspects in the development of complex swarm-based systems. NASA researchers have been investigating various possible concept missions that would greatly advance future space exploration capabilities. The concept mission that we have focused on exploits the principles of autonomic computing as well as being based on the use of intelligent swarms, whereby a (potentially large) number of similar spacecraft collaborate to achieve mission goals. The intent is that such systems not only can be sent to explore remote and harsh environments but also are endowed with greater degrees of protection and longevity to achieve mission goals.

  14. Ready! Set! Go! An Action Research Agenda for Software Architecture

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Schougaard, Kari Rye

    2008-01-01

    Software architecture practice is highly complex. Software architects interact with business as well as technical aspects of systems, often embedded in large and changing organizations. We first argue that an appropriate research agenda for understanding, describing, and changing architectural practice in this context is an action research agenda, in which researchers use ethnographic techniques to understand practice and engage directly with and in practice when proposing and designing new practices. Secondly, we present an overview of an ongoing project which applies action research techniques to understand and potentially change architectural practice in four Danish software companies.

  15. Computer organization and design the hardware/software interface

    CERN Document Server

    Hennessy, John L

    1994-01-01

    Computer Organization and Design: The Hardware/Software Interface presents the interaction between hardware and software at a variety of levels, which offers a framework for understanding the fundamentals of computing. This book focuses on the concepts that are the basis for computers.Organized into nine chapters, this book begins with an overview of the computer revolution. This text then explains the concepts and algorithms used in modern computer arithmetic. Other chapters consider the abstractions and concepts in memory hierarchies by starting with the simplest possible cache. This book di

  16. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  17. Accuracy of estimation of graft size for living-related liver transplantation: first results of a semi-automated interactive software for CT-volumetry.

    Directory of Open Access Journals (Sweden)

    Theresa Mokry

    Full Text Available To evaluate the accuracy of estimated graft size for living-related liver transplantation using a semi-automated interactive software for CT-volumetry. Sixteen donors for living-related liver transplantation (11 male; mean age: 38.2±9.6 years) underwent contrast-enhanced CT prior to graft removal. CT-volumetry was performed using a semi-automated interactive software (P), and compared with a manual commercial software (TR). For P, liver volumes were provided either with or without vessels. For TR, liver volumes were always provided with vessels. Intraoperative weight served as the reference standard. Major study goals included analyses of volumes using absolute numbers, linear regression analyses and inter-observer agreement. Minor study goals included a description of the software workflow: degree of manual correction, speed of completion, and overall intuitiveness, rated on five-point Likert scales: 1--markedly lower/faster/higher for P compared with TR, 2--slightly lower/faster/higher for P compared with TR, 3--identical for P and TR, 4--slightly lower/faster/higher for TR compared with P, and 5--markedly lower/faster/higher for TR compared with P. Liver segments II/III, II-IV and V-VIII served in 6, 3, and 7 donors as transplanted liver segments. Volumes were 642.9±368.8 ml for TR with vessels, 623.8±349.1 ml for P with vessels, and 605.2±345.8 ml for P without vessels (P<0.01). Regression equations between intraoperative weights and volumes were y = 0.94x+30.1 (R2 = 0.92; P<0.001) for TR with vessels, y = 1.00x+12.0 (R2 = 0.92; P<0.001) for P with vessels, and y = 1.01x+28.0 (R2 = 0.92; P<0.001) for P without vessels. Inter-observer agreement showed a bias of 1.8 ml for TR with vessels, 5.4 ml for P with vessels, and 4.6 ml for P without vessels. For the degree of manual correction, speed of completion and overall intuitiveness, scale values were 2.6±0.8, 2.4±0.5 and 2. CT-volumetry performed with P can accurately predict graft

  18. Western aeronautical test range real-time graphics software package MAGIC

    Science.gov (United States)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers and scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  19. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world is globalizing and digitalizing. The special nature of software poses challenges for intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  20. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standards software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  1. Software package as an information center product

    International Nuclear Information System (INIS)

    Butler, M.K.

    1977-01-01

    The Argonne Code Center serves as a software exchange and information center for the U.S. Energy Research and Development Administration and the Nuclear Regulatory Commission. The goal of the Center's program is to provide a means for sharing software among agency offices and contractors, and for transferring computing applications and technology, developed within the agencies, to the information-processing community. A major activity of the Code Center is the acquisition, review, testing, and maintenance of a collection of software--computer systems, applications programs, subroutines, modules, and data compilations--prepared by agency offices and contractors to meet programmatic needs. A brief review of the history of computer program libraries and software sharing is presented to place the Code Center activity in perspective. The state-of-the-art discussion starts off with an appropriate definition of the term software package, together with descriptions of recommended package contents and the Center's package evaluation activity. An effort is made to identify the various users of the product, to enumerate their individual needs, to document the Center's efforts to meet these needs, and to describe the ongoing interaction with the user community. Desirable staff qualifications are considered, and packaging problems reviewed. The paper closes with a brief look at recent developments and a forecast of things to come. 2 tables

  2. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable approach to high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying code against software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also provides a high degree of confidence in the 'correctness' of the software production, and supports a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  3. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  4. Information technologies and software packages for education of specialists in materials science [In Russian

    NARCIS (Netherlands)

    Krzhizhanovskaya, V.; Ryaboshuk, S.

    2009-01-01

    This paper presents methodological materials, interactive text-books and software packages developed and extensively used for education of specialists in materials science. These virtual laboratories for education and research are equipped with tutorials and software environment for modeling complex

  5. SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The early-on estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the asked-for overall effort and duration. The SOFTCOST program operates in an interactive environment prompting the user for all of the required input. The program builds the supporting PERT data base in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a
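
    A representative parametric form of such an effort/schedule model is shown below for orientation; the actual SOFTCOST algorithms combine several published cost models and roughly fifty adjustment factors, so this is only a schematic illustration, not the program's equations.

        \[
          E = a\, S^{\,b} \prod_{i=1}^{n} f_i, \qquad
          T = c\, E^{\,d}, \qquad
          \bar{N} = \frac{E}{T}
        \]
        % E: effort (person-months); S: estimated size; f_i: adjustment factors
        % (task difficulty, environment, inherited baseline, ...); T: duration;
        % N-bar: average staffing level implied by effort and duration.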

  6. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
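
    One common non-homogeneous Poisson process form used in software reliability growth modeling is the Goel-Okumoto model, shown here only to make the setup concrete; the paper's modeling schemes build on this kind of NHPP formulation and add a covariate for the software test cases.

        \[
          m(t) = a\left(1 - e^{-bt}\right), \qquad
          \lambda(t) = \frac{dm(t)}{dt} = a\,b\,e^{-bt}, \qquad
          R(x \mid t) = \exp\!\left[-\{m(t+x) - m(t)\}\right]
        \]
        % m(t): expected cumulative failures by test time t; a: expected total defects;
        % b: defect detection rate; R(x|t): probability of failure-free operation for an
        % additional time x after testing up to time t.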

  7. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  8. MCNP trademark Software Quality Assurance plan

    International Nuclear Information System (INIS)

    Abhold, H.M.; Hendricks, J.S.

    1996-04-01

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 900

  9. THE THIRD GRAVITATIONAL LENSING ACCURACY TESTING (GREAT3) CHALLENGE HANDBOOK

    International Nuclear Information System (INIS)

    Mandelbaum, Rachel; Kannawadi, Arun; Simet, Melanie; Rowe, Barnaby; Kacprzak, Tomasz; Bosch, James; Miyatake, Hironao; Chang, Chihway; Gill, Mandeep; Courbin, Frederic; Jarvis, Mike; Armstrong, Bob; Lackner, Claire; Leauthaud, Alexie; Nakajima, Reiko; Rhodes, Jason; Zuntz, Joe; Bridle, Sarah; Coupon, Jean; Dietrich, Jörg P.

    2014-01-01

    The GRavitational lEnsing Accuracy Testing 3 (GREAT3) challenge is the third in a series of image analysis challenges, with a goal of testing and facilitating the development of methods for analyzing astronomical images that will be used to measure weak gravitational lensing. This measurement requires extremely precise estimation of very small galaxy shape distortions, in the presence of far larger intrinsic galaxy shapes and distortions due to the blurring kernel caused by the atmosphere, telescope optics, and instrumental effects. The GREAT3 challenge is posed to the astronomy, machine learning, and statistics communities, and includes tests of three specific effects that are of immediate relevance to upcoming weak lensing surveys, two of which have never been tested in a community challenge before. These effects include many novel aspects including realistically complex galaxy models based on high-resolution imaging from space; a spatially varying, physically motivated blurring kernel; and a combination of multiple different exposures. To facilitate entry by people new to the field, and for use as a diagnostic tool, the simulation software for the challenge is publicly available, though the exact parameters used for the challenge are blinded. Sample scripts to analyze the challenge data using existing methods will also be provided. See http://great3challenge.info and http://great3.projects.phys.ucl.ac.uk/leaderboard/ for more information

  10. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    Science.gov (United States)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate Journals (e.g. Geoscientific Model Development Journal) to assist users to know which codes to trust. 3) Referencing will be accomplished by linking the Software Framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review and relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing Provenance Workflow engines that will automatically capture information that relate to a particular run of that software, including identification of all input and output artefacts, and all elements and transactions within that workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it and greatly facilitate sharing, reuse and reinstallation of code. Properly designed it could enable an ability to scale out to massively parallel systems and be accessed nationally/ internationally for multiple use cases, including Supercomputer centres, cloud facilities, and local computers.

  11. The Digital Economy: Social Interaction Technologies – an Overview

    Directory of Open Access Journals (Sweden)

    Teófilo Redondo

    2015-03-01

    Full Text Available Social interaction technologies (SIT is a very broad field that encompasses a large list of topics: interactive and networked computing, mobile social services and the Social Web, social software and social media, marketing and advertising, various aspects and uses of blogs and podcasting, corporate value and web-based collaboration, e-government and online democracy, virtual volunteering, different aspects and uses of folksonomies, tagging and the social semantic cloud of tags, blog-based knowledge management systems, systems of online learning, with their ePortfolios, blogs and wikis in education and journalism, legal issues and social interaction technology, dataveillance and online fraud, neogeography, social software usability, social software in libraries and nonprofit organizations, and broadband visual communication technology for enhancing social interaction. The fact is that the daily activities of many businesses are being socialized, as is the case with Yammer (https://www.yammer.com/, the social enterprise social network. The leitmotivs of social software are: create, connect, contribute, and collaborate.

  12. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. VO architecture greatly depends on Grid and Web services; consequently, the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and discuss the possible solutions. We introduce two efforts in the field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert a Grid service into a VO service.

  13. A Development Framework for Software Security in Nuclear Safety Systems: Integrating Secure Development and System Security Activities

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaekwan; Suh, Yongsuk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-02-15

    The protection of nuclear safety software is essential in that a failure can result in significant economic loss and physical damage to the public. However, software security has often been ignored in nuclear safety software development. To enforce security considerations, the nuclear regulatory commission recently issued and revised security regulations for nuclear computer-based systems. It is a great challenge for nuclear developers to comply with the security requirements. However, there is still no clear software development process regarding security activities. This paper proposes an integrated development process suitable for the secure development requirements and system security requirements described by various regulatory bodies. It provides a three-stage framework with eight security activities as the software development process. The detailed descriptions are useful for software developers and licensees to understand the regulatory requirements and to establish a detailed activity plan for software design and engineering.

  14. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  15. A cultura de colaboração e inovação dos desenvolvedores de software livre | The collaborative and innovative culture of free software developers

    Directory of Open Access Journals (Sweden)

    Clóvis Ricardo Montenegro de Lima

    2010-03-01

    Abstract This article aims to investigate the culture of the developers of free software, especially as regards collaboration and productive organizational and social innovation. It discusses the relationship between immaterial labor and collaborative free software development. It starts with the main feature of software development with open source code: a productive partnership. This implies the use of living knowledge and the formation of a shared intersubjectivity. Communication mediated by language is the basis of this production. The article also discusses the relationship between collaborative production and innovation. Collaboration extends the communicative interaction between producers, opening up more possibilities for innovation. The shared intersubjectivity condenses into a specific cultural background. It is concluded that the culture of free software developers, generated from productive collaboration, serves as a context for collaboration and innovation. Keywords: culture; free software; innovation; collaboration; cyberculture.

  16. Radiobiology software for educational purpose

    International Nuclear Information System (INIS)

    Pandey, A.K.; Sharma, S.K.; Kumar, R.; Bal, C.S.; Nair, O.; Haresh, K.P.; Julka, P.K.

    2014-01-01

    To understand radionuclide therapy and the basis of radiation protection, it is essential to understand radiobiology. With limited time for classroom teaching and limited time and resources for radiobiology experiments, students do not acquire a firm grasp of the theoretical mathematical models or experimental knowledge of target theory and the linear-quadratic model that explain the nature of cell survival curves. We believe that this issue might be addressed with numerical simulation of cell survival curves using mathematical models. Existing classroom teaching can be reoriented to understand the subject using the concepts of modeling, simulation and virtual experiments. After completion of the lecture, students can practice with the simulation tool at their convenience. In this study we have developed software that can help students acquire a firm grasp of theoretical and experimental radiobiology. The software was developed using FreeMat ver 4.0, an open source package. Target theory, the linear-quadratic model, and cell killing based on the Poisson model have been included. The program structure displays a menu from which the user makes a choice, and the program then flows depending on that choice. The program is executed by typing 'Radiobiology' on the command line interface. Students can interactively investigate the effect of radiation dose on cells. They can practice drawing the cell survival curve based on the input and output data, and they can also compare their hand-drawn graphs with the graphs generated automatically by the program. This software is in the early stage of development and will evolve based on user feedback. We feel this simulation software will be quite useful for students entering the nuclear medicine, radiology and radiotherapy disciplines. (author)
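    The FreeMat code itself is not reproduced in the record; the following hypothetical sketch (in Python rather than FreeMat) shows the kind of linear-quadratic survival-curve simulation such a teaching tool exposes, using the standard relation S(D) = exp(-(alpha*D + beta*D^2)).

```python
# Hypothetical teaching sketch (Python, not the actual FreeMat code): cell
# survival curves from the linear-quadratic model on the usual semi-log axes.
import numpy as np
import matplotlib.pyplot as plt

dose = np.linspace(0.0, 10.0, 101)   # absorbed dose in Gy
alpha, beta = 0.3, 0.03              # illustrative tissue parameters (1/Gy, 1/Gy^2)

survival = np.exp(-(alpha * dose + beta * dose ** 2))   # S(D) = exp(-(aD + bD^2))

plt.semilogy(dose, survival, label=f"LQ model, alpha/beta = {alpha / beta:.0f} Gy")
plt.xlabel("Dose (Gy)")
plt.ylabel("Surviving fraction")
plt.legend()
plt.show()
```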

  17. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix-transposing application in CUDA. One particular interest was to research how well the optimization techniques, applied to a software application written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature in this type of optimization analysis, but none of the works so far (to our best knowledge) has tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance improving techniques.
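    The authors' CUDA kernels are not shown in the abstract; as a hedged illustration of the shared-memory tiling idea behind an optimized transpose kernel, the sketch below uses Numba's CUDA target from Python (tile size, names and the use of Numba are assumptions, not the paper's code).

```python
# Hypothetical sketch of a shared-memory tiled transpose (the optimisation the
# paper benchmarks), written with Numba's CUDA target; tile size and all names
# are assumptions, not the authors' code.
import numpy as np
from numba import cuda, float32

TILE = 32  # tile edge, matching a 32x32 thread block

@cuda.jit
def transpose_tiled(src, dst):
    tile = cuda.shared.array(shape=(TILE, TILE), dtype=float32)
    x = cuda.blockIdx.x * TILE + cuda.threadIdx.x
    y = cuda.blockIdx.y * TILE + cuda.threadIdx.y
    if x < src.shape[1] and y < src.shape[0]:
        tile[cuda.threadIdx.y, cuda.threadIdx.x] = src[y, x]   # coalesced read
    cuda.syncthreads()
    # swap block coordinates so the write to dst is also coalesced
    x = cuda.blockIdx.y * TILE + cuda.threadIdx.x
    y = cuda.blockIdx.x * TILE + cuda.threadIdx.y
    if x < dst.shape[1] and y < dst.shape[0]:
        dst[y, x] = tile[cuda.threadIdx.x, cuda.threadIdx.y]

a = np.random.rand(1024, 1024).astype(np.float32)
out = np.zeros((a.shape[1], a.shape[0]), dtype=np.float32)
blocks = ((a.shape[1] + TILE - 1) // TILE, (a.shape[0] + TILE - 1) // TILE)
transpose_tiled[blocks, (TILE, TILE)](a, out)   # Numba copies the arrays to and from the GPU
assert np.allclose(out, a.T)
```

    Production kernels usually also pad the shared tile by one column to avoid bank conflicts; that refinement is omitted here for brevity.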

  18. Design and implementation of embedded Bluetooth software system

    Science.gov (United States)

    Zhou, Zhijian; Zhou, Shujie; Xu, Huimin

    2001-10-01

    This thesis introduces the background and characteristics of Bluetooth technology. It then summarizes the architecture and working principle of Bluetooth software. After carefully studying the characteristics of embedded operating systems and Bluetooth software, the thesis describes two sets of Bluetooth software modules. Corresponding to the characteristics of these modules, it introduces the design and implementation of LAN Access and a Bluetooth headset. The headset part introduces a development method suited to the particularity of Bluetooth control software. Although these control software components are application entities, the control signaling exchanged between them follows rules laid down in earlier definitions, and they function through the interaction of data and control information. These data and control information constitute the protocol data unit (PDU), and the earlier definitions can in fact be seen as a protocol. Using the advanced development flow for communication protocol development as a reference, this thesis applies a formal method - SDL (Specification and Description Language) - for describing, validating and manually coding to C. This method not only preserves the efficiency of manually written code, but also ensures the quality of the code. The introduction also involves finite state machine theory, while presenting a practical development method for protocol development with the aid of SDL.

  19. Deployment of the CMS software on the WLCG Grid

    International Nuclear Information System (INIS)

    Behrenhoff, W; Wissing, C; Kim, B; Blyweert, S; D'Hondt, J; Maes, J; Maes, M; Mulders, P Van; Villella, I; Vanelderen, L

    2011-01-01

    The CMS Experiment is taking high energy collision data at CERN. The computing infrastructure used to analyse the data is distributed round the world in a tiered structure. In order to use the 7 Tier-1 sites, the 50 Tier-2 sites and a still growing number of about 30 Tier-3 sites, the CMS software has to be available at those sites. Except for a very few sites the deployment and the removal of CMS software is managed centrally. Since the deployment team has no local accounts at the remote sites all installation jobs have to be sent via Grid jobs. Via a VOMS role the job has a high priority in the batch system and gains write privileges to the software area. Due to the lack of interactive access the installation jobs must be very robust against possible failures, in order not to leave a broken software installation. The CMS software is packaged in RPMs that are installed in the software area independent of the host OS. The apt-get tool is used to resolve package dependencies. This paper reports about the recent deployment experiences and the achieved performance.

  20. Integrated software system for low level waste management

    International Nuclear Information System (INIS)

    Worku, G.

    1995-01-01

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications

  1. ALICES: an advanced object-oriented software workshop for simulators

    International Nuclear Information System (INIS)

    Sayet, R.L.; Rouault, G.; Pieroux, D.; Houte, U. Van

    1999-01-01

    Reducing simulator development costs while improving model quality, user-friendliness and teaching capabilities has been a major target for many years in the simulation industry. It has led to the development of specific software tools which have been improved progressively following the new features and capabilities offered by the software industry. Unlike most of these software tools, ALICES (a French acronym for 'Interactive Software Workshop for the Design of Simulators') is not an upgrade of a previous generation of tools, like putting a graphical front-end on a classical code generator, but a really new development. Its design specification is based on previous experience with different tools as well as on new capabilities of software technology, mainly in Object Oriented Design. This allowed us to make a real technological 'jump' in the simulation industry, beyond the constraints of some traditional approaches. The main objectives behind the development of ALICES were the following: (1) Minimizing the simulator development time and costs: a simulator development consists mainly in developing software. One way to reduce costs is to facilitate reuse of existing software by developing standard components, and by defining interface standards. (2) Ensuring that the produced simulator can be maintained and updated at a minimal cost: a simulator must evolve along with the simulated process, and it is then necessary to update the simulator periodically. The cost of adequate maintenance is highly dependent on the quality of the software workshop. (3) Covering the whole simulator development process: from the data package to the acceptance tests and for maintenance and upgrade activities; with the whole development team, even if it is dispatched at different working sites; respecting the Quality Assurance rules and procedures (CORYS T.E.S.S. and TRACTEBEL are ISO-9001 certified). The development of ALICES was also done to comply with the following two main

  2. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  3. Availability of software services for a hospital information system.

    Science.gov (United States)

    Sakamoto, N

    1998-03-01

    information system is designed and developed with these points in mind, its availability of software services should increase greatly.

  4. Antipatterns - Implementing Productive Solutions to Avoid Developing Problems for Web and Software Applications

    Directory of Open Access Journals (Sweden)

    Valentin Corneliu PAU

    2010-01-01

    Full Text Available Patterns are a transfer mechanism in many domains, including software engineering, the web development process, business process management, and more recently the field of interaction design. In software engineering a concerted effort is also being made to identify and document anti-patterns for capturing expert knowledge and transferring it to novices. This paper presents a review of reported studies of the use of patterns and anti-patterns in teaching software engineering and human-computer interaction, and also reports on three studies examining the impact of using guidelines, patterns and anti-patterns in teaching interaction design principles.

  5. Assessing ergonomic risks of software: Development of the SEAT.

    Science.gov (United States)

    Peres, S Camille; Mehta, Ranjana K; Ritchey, Paul

    2017-03-01

    Software utilizing interaction designs that require extensive dragging or clicking of icons may increase users' risks for upper extremity cumulative trauma disorders. The purpose of this research is to develop a Self-report Ergonomic Assessment Tool (SEAT) for assessing the risks of software interaction designs and facilitate mitigation of those risks. A 28-item self-report measure was developed by combining and modifying items from existing industrial ergonomic tools. Data were collected from 166 participants after they completed four different tasks that varied by method of input (touch or keyboard and mouse) and type of task (selecting or typing). Principal component analysis found distinct factors associated with stress (i.e., demands) and strain (i.e., response). Repeated measures analyses of variance showed that participants could discriminate the different strain induced by the input methods and tasks. However, participants' ability to discriminate between the stressors associated with that strain was mixed. Further validation of the SEAT is necessary but these results indicate that the SEAT may be a viable method of assessing ergonomics risks presented by software design. Copyright © 2016 Elsevier Ltd. All rights reserved.
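    As a hypothetical illustration of the analysis style described (not the study's data or scripts), the sketch below runs a principal component analysis over simulated 28-item SEAT responses to look for separate stress and strain factors.

```python
# Hypothetical sketch (simulated responses, not the study's data): PCA over
# 28 SEAT items from 166 participants to look for stress vs. strain factors.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(166, 28)).astype(float)  # 7-point Likert items

pca = PCA(n_components=2)          # PCA centres the data internally
scores = pca.fit_transform(responses)

print("explained variance ratio:", pca.explained_variance_ratio_)
loadings = pca.components_.T       # item loadings; groups of items loading on the
print(loadings[:5])                # same component suggest a shared factor
```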

  6. Development and evaluation of new semi-automatic TLD reader software

    International Nuclear Information System (INIS)

    Pathan, M.S.; Pradhan, S.M.; Palani Selvam, T.; Datta, D.

    2018-01-01

    Nowadays, all technology advancement is primarily focused on creating a user-friendly environment while operating any machine, and on minimizing human errors through the automation of procedures. In the present study the development and evaluation of new software for the semi-automatic TLD badge reader (TLDBR-7B) is presented. The software provides an interactive interface and is compatible with the latest Windows OS as well as the USB mode of data communication. Important new features of the software are automatic glow curve analysis for identifying any abnormality, an event log register, user-defined limits on TL count and on the time of temperature stabilization for readout interruption, and automatic reading resumption options

  7. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  8. Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bargalló, Enric, E-mail: enric.bargallo-font@upc.edu [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Sureda, Pere Joan [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Arroyo, Jose Manuel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain); Abal, Javier; De Blas, Alfredo; Dies, Javier; Tapia, Carlos [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Mollá, Joaquín; Ibarra, Ángel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain)

    2014-10-15

    Highlights: • The reason why the IFMIF RAMI analyses need a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 for IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability Availability Maintainability Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model properly the complexity of the accelerator facility. AvailSim, the availability simulation software used for the International Linear Collider (ILC), became an excellent option to fulfill the RAMI analysis needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a way that is useful for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool to simulate the peculiarities of the IFMIF accelerator facility, allowing a realistic availability simulation to be obtained. Degraded operation simulation and maintenance strategies are the main relevant features. In this paper, the necessity of this software, the main modifications to improve it and its adaptation to the IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results are shown.

  9. Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses

    International Nuclear Information System (INIS)

    Bargalló, Enric; Sureda, Pere Joan; Arroyo, Jose Manuel; Abal, Javier; De Blas, Alfredo; Dies, Javier; Tapia, Carlos; Mollá, Joaquín; Ibarra, Ángel

    2014-01-01

    Highlights: • The reason why the IFMIF RAMI analyses need a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 for IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability Availability Maintainability Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model properly the complexity of the accelerator facility. AvailSim, the availability simulation software used for the International Linear Collider (ILC), became an excellent option to fulfill the RAMI analysis needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a way that is useful for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool to simulate the peculiarities of the IFMIF accelerator facility, allowing a realistic availability simulation to be obtained. Degraded operation simulation and maintenance strategies are the main relevant features. In this paper, the necessity of this software, the main modifications to improve it and its adaptation to the IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results are shown

  10. International Collaboration in the Development of NPP Software

    International Nuclear Information System (INIS)

    Jiang, S.; Liu, L.; Yu, H.

    2015-01-01

    In this paper, we first review the progress and current status of international collaboration and technical exchange in the development of nuclear power plant (NPP) software by the State Nuclear Power Software Development Center (SNPSDC) in China. Then we discuss the importance of international collaboration and exchange in the trend of globalisation of NPP technology. We also identify the role and contribution of professional women in this process. SNPSDC, the first professional software development centre for NPP in China, has been developing COSINE, a self-reliant NPP design and analysis software product under a Chinese brand, since 2010. Through participating in OECD/NEA joint projects, such as the ROSA-2 Project, PKL-3 Project, HYMERES Project and ATLAS Project, SNPSDC shared data with the other countries involved in particular areas, such as high-quality reactor thermal hydraulics test data. SNPSDC's engineers have also been actively participating in international technical and research exchange, presenting their innovative work to the community while learning from peers. Our record shows that over 30 papers have been presented at international conferences on nuclear reactor thermal hydraulics, safety analysis, reactor physics and software engineering within the past 4 years. The above international collaboration and technical exchange helped SNPSDC's engineers to keep up with the state-of-the-art technology in this field. The large amount of valuable experimental data transferred to SNPSDC ensured the functionality, usability and reliability of the software while greatly reducing the cost and shortening the development cycle. Female engineers and other employees of SNPSDC either drove or were actively involved in many aspects of the above collaboration and exchange, such as technical communication, business negotiation and overseas affairs management. These professional women played an irreplaceable role in this project by

  11. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    Full Text Available Studies show that the use of games and interactive materials at schools is a good educational strategy, motivating students to create mental outlines, developing reasoning and facilitating learning. In this context, the Scientific Dissemination Coordination of the Center for Structural Molecular Biotechnology (CBME) developed a series of educational materials aimed at elementary and high schools, universities and the general public. Among these, we highlight the Virtual Cells software, which was developed with the aim of helping in the understanding of the basic concepts of cell types, their structures, organelles and specific functions. Characterized by its interactive interface, this software shows images of eukaryotic and prokaryotic cells, where organelles are shown as dynamic structures. In addition, it presents exercises in another step that reinforce the comprehension of Cytology. A speaker narrates the resources offered by the program and the necessary steps for its use. During the development stage of the software, students and teachers of public and private high schools from Sao Carlos city, Sao Paulo State, were invited to register their opinions regarding the language and content of the software in order to help us improve it. After this stage, the Scientific Dissemination Coordination of CBME organized a series of workshops, in which 120 individuals evaluated the software (students and teachers of high school and other undergraduate students). For this evaluation, a questionnaire was elaborated based on the current international literature in the area of science teaching and it was applied after the interactive session with the software. The analysis of the results demonstrated that most of the individuals considered the software of easy

  12. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...

  13. Software trigger for the TOPAZ detector at TRISTAN

    International Nuclear Information System (INIS)

    Tsukamoto, T.; Yamauchi, M.; Enomoto, R.

    1990-01-01

    A new software trigger system was developed and installed in the trigger system of the TOPAZ detector at the TRISTAN e + e - collider in order to take data efficiently in the scheduled high-luminosity experiment. This software trigger requires two or more charged tracks originating at the interaction point, by examining the timing of signals from the time projection chamber. To execute the vertex finding very quickly, four microprocessors are used in parallel. With this new trigger the rate of the track trigger was reduced to 30-40% with very small inefficiency. The additional dead time due to this trigger is negligible. (orig.)

  14. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  15. Software de balanced scorecard: proposta de um roteiro de implantação Balanced scorecard software: proposal of a guideline implementation

    Directory of Open Access Journals (Sweden)

    Oswaldo Keiji Hikage

    2006-04-01

    Full Text Available In 1992, the concept of the Balanced Scorecard (BSC) was presented by Robert Kaplan and David Norton and, as it spread, many companies that adopted it and went through merger and acquisition processes saw the information held in their databases grow considerably. This volume of information, together with the need to manage strategic indicators efficiently, to make management reports quickly available, and to analyze and simulate scenarios, led companies to look for an automated system. Some concerns then became important: how should BSC software be selected? How should BSC software be implemented? Beyond the difficulties inherent in acquiring any software, the situation specifically addressed in this article has some peculiarities, given the intense interaction between the BSC methodology and the software packages that support it. Through a case study carried out in a company in the telecommunications sector, this article focuses on the implementation of BSC software, aiming at the development of a guideline that can systematize the process. Since 1992, when Robert Kaplan and David Norton developed the concepts of the Balanced Scorecard (BSC), many companies that adopted these concepts and survived in a highly competitive environment characterized by a high number of mergers, purchases and shutdowns increased the volume of information in their databases. As a consequence of the adoption of the BSC, there is a need to implement a computerized control system, due to the work with the BSC and the management of a great volume of information. In this context, when companies decide to implement BSC software, they face two problems: how to choose the BSC software? How to implement the selected software? This study proposes to develop a guideline for the BSC software introduction based on a case study in a telecommunication company and the experience from the practice on

  16. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South-Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI miniarray operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  17. Design of Mariner 9 Science Sequences using Interactive Graphics Software

    Science.gov (United States)

    Freeman, J. E.; Sturms, F. M, Jr.; Webb, W. A.

    1973-01-01

    This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.

  18. Integrating HCI Specialists into Open Source Software Development Projects

    Science.gov (United States)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on the HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  19. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  20. The Milky Way and the Local Group: playing with great circles.

    Science.gov (United States)

    Fusi Pecci, F.; Bellazzini, M.; Ferraro, F. R.

    The small group of recently discovered galactic globular clusters (Pal 12, Ter 7, Rup 106, Arp 2) significantly younger than the average cluster population of the Galaxy are shown to lie near great circles passing in the proximity of most satellite galaxies of the Milky Way. Assuming that these great circles are in some way preferential planes of interaction between the Galaxy and its companions, the authors identified along one of them another candidate "young" globular cluster, IC 4499. Within this observational framework, the possibility that the sample of young globulars found in the halo of the Galaxy could have been captured from a satellite galaxy or formed during a close interaction between the Milky Way and one of its companions is briefly discussed.

  1. A Study of Performance and Effort Expectancy Factors among Generational and Gender Groups to Predict Enterprise Social Software Technology Adoption

    Science.gov (United States)

    Patel, Sunil S.

    2013-01-01

    Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…

  2. Knowledge-based software design for Defense-in-Depth risk monitor system and application for AP1000

    International Nuclear Information System (INIS)

    Ma Zhanguo; Yoshikawa, Hidekazu; Yang Ming; Nakagawa, Takashi

    2017-01-01

    As part of the new risk monitor system, the software for the plant Defense-in-Depth (DiD) risk monitor system was designed based on state transitions and finite-state machines, and the knowledge-based software was then developed by an object-oriented method utilizing the Unified Modeling Language (UML). Currently, the developed plant DiD risk monitor software provides two main functions: a knowledge-base editor, which is used to model the system in a hierarchical manner, and an interaction simulator, which simulates the interactions between the different actors in the model. In this paper, a model that plays out its behavior is called an Actor and is modeled at the top level. The passive-safety AP1000 power plant was studied, and the small-break loss-of-coolant accident (SBLOCA) design basis accident transient is modeled using the plant DiD risk monitor software. Furthermore, the simulation result is shown for the interactions between the actors, which are defined in the plant DiD risk monitor system as the PLANT actor, OPERATOR actor, and SUPERVISOR actor. This paper shows that it is feasible to model the nuclear power plant knowledge base using this software modeling technique. The software makes it possible to build a large knowledge base for the nuclear power plant with little effort. (author)
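    The knowledge-base editor and interaction simulator are only described at a high level; the toy sketch below (with entirely hypothetical states and events) illustrates how actors such as PLANT, OPERATOR and SUPERVISOR can be represented as interacting finite-state machines for an SBLOCA-like scenario.

```python
# Hypothetical illustration (not the DiD risk monitor code): actors as simple
# finite-state machines exchanging events during an SBLOCA-like scenario.
class Actor:
    def __init__(self, name, initial, transitions):
        self.name = name
        self.state = initial
        self.transitions = transitions          # {(state, event): next_state}

    def handle(self, event):
        nxt = self.transitions.get((self.state, event))
        if nxt is not None:
            print(f"{self.name}: {self.state} --{event}--> {nxt}")
            self.state = nxt

plant = Actor("PLANT", "normal",
              {("normal", "small_break"): "loca",
               ("loca", "safety_injection"): "stabilizing"})
operator = Actor("OPERATOR", "monitoring",
                 {("monitoring", "loca_alarm"): "responding"})
supervisor = Actor("SUPERVISOR", "idle",
                   {("idle", "loca_alarm"): "assessing"})

# a minimal event-driven interaction between the three actors
plant.handle("small_break")
operator.handle("loca_alarm")
supervisor.handle("loca_alarm")
plant.handle("safety_injection")
```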

  3. A Method to Select Software Test Cases in Consideration of Past Input Sequence

    International Nuclear Information System (INIS)

    Kim, Hee Eun; Kim, Bo Gyung; Kang, Hyun Gook

    2015-01-01

    In the Korea Nuclear I and C Systems (KNICS) project, the software for the fully-digitalized reactor protection system (RPS) was developed under a strict procedure. Even though the behavior of the software is deterministic, the randomness of the input sequence produces probabilistic behavior of the software. A software failure occurs when some inputs to the software occur and interact with the internal state of the digital system to trigger a fault that was introduced into the software during the software lifecycle. In this paper, a method to select the test set for software failure probability estimation is suggested. This test set reflects the past input sequences of the software and covers all possible cases. In this study, the method to select test cases for software failure probability quantification was suggested. To obtain the profile of paired state variables, the relationships between the variables need to be considered. The effect of input from the human operator also has to be considered. As an example, the test set of the PZR-PR-Lo-Trip logic was examined. This method provides a framework for selecting test cases of safety-critical software
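    The paper's selection procedure is only summarized above; the following hypothetical sketch illustrates the general idea of drawing test input sequences from an operational profile in which the next input depends on the previous one (all states and probabilities are invented for illustration).

```python
# Hypothetical sketch: sample test input sequences from a first-order (Markov)
# operational profile so that past inputs influence the next input; states and
# probabilities below are invented for illustration.
import random

profile = {
    "start":           [("pressure_low", 0.5), ("pressure_normal", 0.5)],
    "pressure_low":    [("operator_reset", 0.3), ("pressure_low", 0.6), ("pressure_normal", 0.1)],
    "pressure_normal": [("pressure_low", 0.2), ("pressure_normal", 0.8)],
    "operator_reset":  [("pressure_normal", 1.0)],
}

def sample_sequence(length, seed=None):
    rng = random.Random(seed)
    state, sequence = "start", []
    for _ in range(length):
        nxt, weights = zip(*profile[state])
        state = rng.choices(nxt, weights=weights)[0]
        sequence.append(state)
    return sequence

test_cases = [sample_sequence(10, seed=i) for i in range(5)]
for case in test_cases:
    print(case)
```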

  4. Impact of social capital on psychological distress and interaction with house destruction and displacement after the Great East Japan Earthquake of 2011.

    Science.gov (United States)

    Tsuchiya, Naho; Nakaya, Naoki; Nakamura, Tomohiro; Narita, Akira; Kogure, Mana; Aida, Jun; Tsuji, Ichiro; Hozawa, Atsushi; Tomita, Hiroaki

    2017-01-01

    Social capital has been considered an important factor affecting mental-health outcomes, such as psychological distress in post-disaster settings. Although disaster-related house condition and displacement could affect both social capital and psychological distress, limited studies have investigated interactions. This study aimed to examine the association between social capital and psychological distress, taking into consideration the interaction of disaster-related house condition after the Great East Japan Earthquake of 2011. Using data from 3793 adults living in Shichigahama, Miyagi Prefecture, Japan, we examined the association between social capital measured by generalized trust and psychological distress measured by the Kessler 6 scale. We conducted stratified analysis to investigate an interaction of house destruction and displacement. Multivariate analyses taking into consideration the interaction were performed. In the crude analysis, low social capital (odds ratio [OR] 4.46; 95% confidence interval [CI], 3.27-6.07) and large-scale house destruction (OR 1.96; 95%CI, 1.47-2.62) were significantly associated with psychological distress. Stratified analyses detected an interaction with house destruction and displacement (P for interaction = 0.04). Multivariate analysis with interaction term revealed that individuals with low social capital, large-scale house damage, and displacement were at greater risk of psychological distress, corresponding to adjusted OR of 5.78 (95%CI, 3.48-9.60). In the post-disaster setting, low social capital increased the risk of psychological distress, especially among individuals who had large-scale house destruction. Among the participants with severe disaster damage, high social capital would play an important role in protecting mental health. © 2016 The Authors. Psychiatry and Clinical Neurosciences published by John Wiley & Sons Australia, Ltd on behalf of Japanese Society of Psychiatry and Neurology.
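    The study's exact models are not given in the record; as a hedged illustration of an analysis of this shape, the sketch below fits a logistic regression with an interaction term between low social capital and large-scale house destruction on simulated data and reports odds ratios (variable names, coefficients and data are invented).

```python
# Hypothetical sketch (invented data and coefficients, not the study's):
# logistic regression of psychological distress on low social capital, large
# house destruction, and their interaction, reported as odds ratios.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3793
df = pd.DataFrame({
    "low_social_capital": rng.integers(0, 2, n),
    "large_destruction":  rng.integers(0, 2, n),
})
logit_p = (-2.0 + 1.2 * df.low_social_capital + 0.5 * df.large_destruction
           + 0.6 * df.low_social_capital * df.large_destruction)
df["distress"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))).astype(int)

model = smf.logit("distress ~ low_social_capital * large_destruction", data=df).fit()
print(np.exp(model.params))   # the '*' expands to both main effects plus the interaction term
```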

  5. Prototyping machine vision software on the World Wide Web

    Science.gov (United States)

    Karantalis, George; Batchelor, Bruce G.

    1998-10-01

    Interactive image processing is a proven technique for analyzing industrial vision applications and building prototype systems. Several of the previous implementations have used dedicated hardware to perform the image processing, with a top layer of software providing a convenient user interface. More recently, self-contained software packages have been devised and these run on a standard computer. The advent of the Java programming language has made it possible to write platform-independent software, operating over the Internet, or a company-wide Intranet. Thus, there arises the possibility of designing at least some shop-floor inspection/control systems without the vision engineer ever entering the factories where they will be used. If successful, this project will have a major impact on the productivity of vision systems designers.

  6. Instrument hardware and software upgrades at IPNS

    International Nuclear Information System (INIS)

    Worlton, Thomas; Hammonds, John; Mikkelson, D.; Mikkelson, Ruth; Porter, Rodney; Tao, Julian; Chatterjee, Alok

    2006-01-01

    IPNS is in the process of upgrading their time-of-flight neutron scattering instruments with improved hardware and software. The hardware upgrades include replacing old VAX Qbus and Multibus-based data acquisition systems with new systems based on VXI and VME. Hardware upgrades also include expanded detector banks and new detector electronics. Old VAX Fortran-based data acquisition and analysis software is being replaced with new software as part of the ISAW project. ISAW is written in Java for ease of development and portability, and is now used routinely for data visualization, reduction, and analysis on all upgraded instruments. ISAW provides the ability to process and visualize the data from thousands of detector pixels, each having thousands of time channels. These operations can be done interactively through a familiar graphical user interface or automatically through simple scripts. Scripts and operators provided by end users are automatically included in the ISAW menu structure, along with those distributed with ISAW, when the application is started

  7. PuMA: the Porous Microstructure Analysis software

    Science.gov (United States)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed in order to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as a NASA software under a US & Foreign release.
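    PuMA's own API is not shown in the abstract; the snippet below is a generic, hypothetical illustration of the simplest of the listed computations, porosity and phase volume fractions on a segmented 3D image.

```python
# Hypothetical illustration (not PuMA's API): porosity and phase volume
# fractions from a segmented 3D microstructure stored as an integer array,
# where label 0 marks void voxels and positive labels mark solid phases.
import numpy as np

rng = np.random.default_rng(42)
image = rng.integers(0, 3, size=(200, 200, 200))   # stand-in for a microtomography volume

porosity = float(np.mean(image == 0))              # fraction of void voxels
labels, counts = np.unique(image, return_counts=True)
volume_fractions = {int(l): c / image.size for l, c in zip(labels, counts)}

print(f"porosity = {porosity:.3f}")
print("volume fraction per phase label:", volume_fractions)
```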

  8. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    Keywords: software license, software usage, ELA, Software as a Service (SaaS), Software Asset... Acronyms: PaaS, Platform as a Service; SaaS, Software as a Service; SAM, Software Asset Management; SMS, System Management Server; SEWP, Solutions for Enterprise Wide... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service. Software...

  9. Remote Software Application and Display Development

    Science.gov (United States)

    Sanders, Brandon T.

    2014-01-01

    The era of the shuttle program has come to an end, but only to give rise to newer and more exciting projects. Now is the time of the Orion spacecraft, a work of art designed to exceed all previous endeavors of man. NASA is exiting the time of exploration and is entering a new period, a period of pioneering. With this new mission, many of NASA's organizations must undergo a great deal of change and development to support the Orion missions. The Spaceport Command and Control System (SCCS) is the new system that will provide NASA the ability to launch rockets into orbit and thus control Orion and other spacecraft as the goal of populating Mars becomes ever more tangible. Since the previous control system, the Launch Processing System (LPS), was primarily designed to launch the shuttles, SCCS was needed as Kennedy Space Center (KSC) reorganized into a multi-user spaceport for commercial flights, providing more versatile control over rockets. Within SCCS is the Launch Control System (LCS), which is the remote software behind the command and monitoring of flight and ground system hardware. This internship at KSC has involved two main components in LCS: Remote Software Application and Display development. The display environment provides a graphical user interface for an operator to view and see if any cautions are raised, while the remote applications are the backbone that communicates with hardware and then relays the data back to the displays. These elements go hand in hand as they provide monitoring and control over hardware and software alike from the safety of the Launch Control Center. The remote software applications are written in Application Control Language (ACL), which must undergo unit testing to ensure data integrity. This paper describes both the implementation and writing of unit tests in ACL code for remote software applications, as well as the building of remote displays to be used in the Launch Control Center (LCC).

  10. TRITON: graphic software for rational engineering of enzymes.

    Science.gov (United States)

    Damborský, J; Prokop, M; Koca, J

    2001-01-01

    Engineering of the catalytic properties of enzymes requires knowledge about the amino acid residues interacting with the transition state of the substrate. TRITON is a graphic software package for modelling enzymatic reactions, for analysis of the essential interactions between the enzyme and its substrate, and for in silico construction of protein mutants. The reactions are modelled using semi-empirical quantum-mechanical methods and the protein mutants are constructed by homology modelling. The users are guided through the calculation and data analysis by wizards.

  11. Software tools for the particle accelerator designs

    International Nuclear Information System (INIS)

    Sugimoto, Masayoshi

    1988-01-01

    The software tools used for the design of particle accelerators are being implemented on small computer systems, such as personal computers or workstations. They are invoked from an interactive environment, like a window-based application program. The environment contains a small expert system that makes it easy to select the design parameters. (author)

  12. Computing Aspects of Interactive Video.

    Science.gov (United States)

    Butcher, P. G.

    1986-01-01

    Describes design and production of the award-winning software used to control Great Britain's Open University Materials Science videodisc, the Teddy Bear Disc, which is used to teach undergraduate students about materials engineering. The disc is designed for use in one-week sessions, which students attend in July or August. (MBR)

  13. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  14. A Study about the 3S-based Great Ruins Monitoring and Early-warning System

    Science.gov (United States)

    Xuefeng, W.; Zhongyuan, H.; Gongli, L.; Li, Z.

    2015-08-01

    Large-scale urbanization and new countryside construction, frequent natural disasters, and natural corrosion pose a severe threat to the great ruins. It is not uncommon for cultural relics to be damaged and for great ruins to be occupied. Ruins monitoring currently relies mainly on general-purpose monitoring data processing systems, which cannot effectively support the management, display, mining analysis and sharing of relics monitoring data. Meanwhile, those general software systems require the deployment of a large number of devices or apparatuses, yet they are suited to small-scope relics monitoring only. Therefore, this paper proposes a method that uses stereoscopic cartographic satellite technology to improve and supplement the great ruins monitoring index system, and that combines GIS and GPS to establish a highly automatic, real-time and intelligent great ruins monitoring and early-warning system, in order to realize collection, processing, updating, spatial visualization, analysis, distribution and sharing of the monitoring data, and to provide scientific and effective data for relics protection, scientific planning, reasonable development and sustainable utilization.

  15. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz, Teresa

    2014-01-01

    4 pages. Spanish version: "Software libre, software de código abierto, licencias. Donde se propone un procedimiento de distribución de software y datos de investigación" (Free software, open source software, licenses; a procedure for the distribution of research software and data is proposed). The main goal of this document is to help the research community to understand the basic concepts of software distribution: Free software, Open source software, licenses. This document also includes a procedure for research software and data dissemination.

  16. User interface issues in radiotherapy CAD software. 116

    International Nuclear Information System (INIS)

    Sherouse, G.W.; Mosher, C.E. Jr.

    1987-01-01

    A major growth area in computerized planning of radiotherapy over the last five years has been the development of programs for treatment design based on the interactive graphic display of three-dimensional patient models. This work has been recognized as a close relative of Computer-Aided Design (CAD) software, which is used in industry for a broad spectrum of engineering and design applications. A CAD system for radiotherapy has been constructed which combines some of the more attractive aspects of other institutions' treatment design systems with CAD technology and a user interface designed for use by clinicians. The importance of an appropriate user interface cannot be overemphasized and is often sadly neglected in radiotherapy software. In order to realize the potential of this new treatment design technology, it is essential that clinicians, particularly physicians, perceive RT CAD systems as tools rather than as obstacles. It is felt that the best way to achieve that perception is for the software to implement a superset of the functions used in conventional practice while retaining the ambiance of the traditional methods. The aim is to construct a system which an experienced physician or dosimetrist can use with essentially no retraining. The model for this CAD tool is that of a virtual simulator. It is meant to faithfully reproduce both the function and the feel of a physical simulator. A number of techniques have been identified for realizing this goal. These include simple intuitive input devices, natural coordinate systems, graphic interaction, good interactive responsiveness, and high-quality 3D display modalities. Here a description of these techniques and some details of their implementation are presented. 10 refs.; 2 figs

  17. Generic MIDI devices as new input for specialized software

    Directory of Open Access Journals (Sweden)

    Marcin Badurowicz

    2016-12-01

    There are plenty of sophisticated devices, including input devices, available on the market for all kinds of specialists working with computers. Furthermore, the more specific the solution needed, the more expensive and complicated it is. At a time when many people prefer to try as many things as possible before selecting a specific learning path, both the high price and the high entry threshold can act as blockers. In this paper, selected hardware and software solutions are presented that facilitate the work of professionals who expect more analog-like interfaces and more natural ways to control computers. Additionally, the authors describe an original software and hardware solution that allows a wide range of MIDI devices to be used as custom input devices. The concept of the software is presented, as well as some results of initial interaction between different kinds of professionals and the proposed software and hardware solution.
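
    As a concrete illustration of treating a generic MIDI controller as a custom input device, the Python sketch below maps incoming control-change and note messages to application actions using the mido library. The port selection, controller numbers, and action names are hypothetical; the original solution described in the paper is not reproduced here.

    import mido

    # Hypothetical mapping of MIDI control-change numbers to application parameters.
    CC_ACTIONS = {1: "brush_size", 7: "opacity"}

    def handle(msg: mido.Message) -> None:
        if msg.type == "control_change" and msg.control in CC_ACTIONS:
            value = msg.value / 127.0            # scale the 0-127 MIDI range to 0.0-1.0
            print(f"set {CC_ACTIONS[msg.control]} to {value:.2f}")
        elif msg.type == "note_on":
            print(f"trigger action bound to note {msg.note}")

    with mido.open_input() as port:              # opens the default MIDI input port
        for message in port:
            handle(message)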

  18. Integration testing through reusing representative unit test cases for high-confidence medical software.

    Science.gov (United States)

    Shin, Youngsul; Choi, Yunja; Lee, Woo Jin

    2013-06-01

    As medical software is getting larger, more complex, and connected with other devices, finding faults in integrated software modules gets more difficult and time consuming. Existing integration testing typically takes a black-box approach, which treats the target software as a black box and selects test cases without considering the internal behavior of each software module. Though it can be cost-effective, this black-box approach cannot thoroughly test interaction behavior among integrated modules and might leave critical faults undetected, which should not happen in safety-critical systems such as medical software. This work anticipates that information on internal behavior is necessary even for integration testing to define thorough test cases for critical software, and proposes a new integration testing method that reuses test cases from unit testing. The goal is to provide a cost-effective method to detect subtle interaction faults at the integration testing phase by reusing the knowledge obtained from the unit testing phase. The suggested approach notes that the unit test cases include knowledge of the internal behavior of each unit, and extracts test cases for integration testing from the unit test cases for given test criteria. The extracted representative test cases are connected with the functions under test using the state domain, and a single test sequence to cover the test cases is produced. By means of reusing unit test cases, the tester has effective test cases to examine diverse execution paths and find interaction faults without analyzing complex modules. The produced test sequence can have test coverage as high as the unit testing coverage and its length is close to the length of optimal test sequences. Copyright © 2013 Elsevier Ltd. All rights reserved.
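
    The chaining idea can be pictured with a small sketch: representative unit test cases, recorded as (input state, expected output state) pairs per module, are strung into one integration sequence by following the module call order. This is only a conceptual illustration in Python with invented module names and states, not the method's actual tooling.

    unit_tests = {
        "sensor_driver": [({"raw": 512}, {"reading": 2.5})],
        "alarm_logic":   [({"reading": 2.5}, {"alarm": False}),
                          ({"reading": 9.9}, {"alarm": True})],
    }

    def build_integration_sequence(call_order):
        """Chain one representative unit test case per module along the call order."""
        sequence = []
        for module in call_order:
            state_in, state_out = unit_tests[module][0]   # first recorded case as representative
            sequence.append((module, state_in, state_out))
        return sequence

    for step in build_integration_sequence(["sensor_driver", "alarm_logic"]):
        print(step)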

  19. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    Science.gov (United States)

    Wallace, Robert

    1986-01-01

    A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.

  20. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  1. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    ...Consolidate, collect and, if needed, develop common processes, principles and other assets across the Agency in order to provide more consistency in software development and acquisition practices and to reduce the overall cost of maintaining or increasing current NASA CMMI maturity levels. 6. Provide additional support for small projects that includes: (a) guidance for appropriate tailoring of requirements for small projects, (b) availability of suitable tools, including support tool set-up and training, and (c) training for small project personnel, assurance personnel and technical authorities on the acceptable options for tailoring requirements and performing assurance on small projects. 7. Develop software training classes for the more experienced software engineers using on-line training, videos, or small separate modules of training that can be accommodated as needed throughout a project. 8. Create guidelines to structure non-classroom training opportunities such as mentoring, peer reviews, lessons learned sessions, and on-the-job training. 9. Develop a set of predictive software defect data and a process for assessing software testing metric data against it. 10. Assess Agency-wide licenses for commonly used software tools. 11. Fill the knowledge gap in common software engineering practices for new hires and co-ops. 12. Work through the Science, Technology, Engineering and Mathematics (STEM) program with universities in strengthening education in the use of common software engineering practices and standards. 13. Follow up this benchmark study with a deeper look into what both internal and external organizations perceive as the scope of software assurance, the value they expect to obtain from it, and the shortcomings they experience in the current practice. 14. Continue interactions with the external software engineering environment through collaborations, knowledge sharing, and benchmarking.

  2. Software Should be Written by Writers.

    Science.gov (United States)

    Sheridan, James

    1983-01-01

    Considering the computer as a collaborator rather than a machine, those in the humanities and the arts are encouraged to take advantage of the great potential that artificial intelligence can offer. Stresses that unless deliberately restricted, the computer is an inherently interdisciplinary medium, and capable of interacting with any…

  3. Jagiellonian University Development of the LHCb VELO monitoring software platform

    CERN Document Server

    Majewski, Maciej

    2017-01-01

    One of the most important parts of the LHCb spectrometer is the VErtex LOcator (VELO), dedicated to the precise tracking close to the proton–proton interaction point. The quality of data produced by the VELO depends on the calibration process, which must be monitored to ensure its correctness. This work presents details on how the calibration monitoring is conducted and how it could be improved. It also includes information on monitoring software and data flow in the LHCb software framework.

  4. Automation Hardware & Software for the STELLA Robotic Telescope

    Science.gov (United States)

    Weber, M.; Granzer, Th.; Strassmeier, K. G.

    The STELLA telescope (a joint project of the AIP, Hamburger Sternwarte and the IAC) is to operate in fully robotic mode, with no human interaction necessary for regular operation. Thus, the hardware must be kept as simple as possible to avoid unnecessary failures, and the environmental conditions must be monitored accurately to protect the telescope in case of bad weather. All computers are standard PCs running Linux, and communication with specialized hardware is done via an RS232/RS485 bus system. The high-level (Java-based) control software consists of independent modules to ease bug-tracking and to allow the system to be extended without changing existing modules. Any command cycle consists of three messages: the actual command sent from the central node to the operating device, an immediate acknowledge, and a final done message, both sent back from the receiving device to the central node. This reply-splitting allows a direct distinction between communication problems (no acknowledge message) and hardware problems (no or a delayed done message). To avoid bug-prone packing of all the sensor-analyzing software into a single package, each sensor reading and interaction with other sensors is done within a self-contained thread. Weather decision-making is therefore totally decoupled from the core control software to avoid deadlocks in the core module.
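
    The reply-splitting can be sketched as follows: a command is only considered healthy if an acknowledge arrives quickly and a done message arrives within the expected hardware time. The Python toy below only illustrates that distinction; the class names, timeouts, and simulated device are invented, and the real STELLA control software is not reproduced here.

    import queue, threading, time

    class SimulatedDevice:
        def __init__(self):
            self.replies = queue.Queue()
        def send(self, command):
            self.replies.put(("ack", command))               # immediate acknowledge
            def work():
                time.sleep(0.2)                              # simulated hardware action
                self.replies.put(("done", command))          # final done message
            threading.Thread(target=work, daemon=True).start()

    def execute(device, command, ack_timeout=1.0, done_timeout=5.0):
        device.send(command)
        try:
            device.replies.get(timeout=ack_timeout)          # no ack -> communication problem
        except queue.Empty:
            raise RuntimeError("no acknowledge: communication problem")
        try:
            device.replies.get(timeout=done_timeout)         # no done -> hardware problem
        except queue.Empty:
            raise RuntimeError("no done message: hardware problem")
        return f"{command}: completed"

    print(execute(SimulatedDevice(), "open dome"))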

  5. Software V and V of PPS for Shin-Hanul Nuclear Power Plant Units 1 and 2

    International Nuclear Information System (INIS)

    Park, Cheollak; Kang, Dongpa; Choe, Changhui; Sohn, Sedo; Beak, Seungmin

    2013-01-01

    Software V and V processes determine whether the development products of a given activity conform to the requirements of that activity and whether the software satisfies its intended use and user needs. This paper introduces the software V and V activities and tasks performed during the software development life cycle of the Plant Protection System (PPS) for Shin-Hanul Nuclear Power Plant Units 1 and 2 (SHN 1 and 2). The PPS generates signals to actuate the Reactor Trip (RT) and Engineered Safety Features (ESF) whenever monitored processes exceed predetermined limits. The PPS software is classified as safety critical, and an independent V and V is thus required according to regulations, codes and standards. The software V and V efforts, sufficiently disciplined and rigorous, are essential to demonstrate that the software development process is of high quality. The software V and V of the PPS for SHN 1 and 2 has been accomplished successfully with systematic V and V procedures and methods established up to the test phase, in compliance with the related codes and standards. In particular, the use of automated tools such as LDRA and DOORS has greatly contributed to an improvement in software quality and a reduction in verification time and human errors

  6. Client Mobile Software Design Principles for Mobile Learning Systems

    Directory of Open Access Journals (Sweden)

    Qing Tan

    2009-01-01

    In a client-server mobile learning system, client mobile software must run on the mobile phone to acquire, package, and send the student's interaction data via the mobile communications network to the connected mobile application server. The server receives and processes the client data in order to offer appropriate content and learning activities. To develop mobile learning systems, a number of very important issues must be addressed. Mobile phones have scarce computing resources; they are heterogeneous devices running various mobile operating systems; they have limited user/device interaction capabilities and high data communications costs; and they must provide for device mobility and portability. In this paper we propose five principles for designing client mobile learning software. A location-based adaptive mobile learning system is presented as a proof of concept to demonstrate the applicability of these design principles.

  7. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by the adaptation of standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  8. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  9. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  10. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment...LTS Label Transition System; MUSE Mining and Understanding Software Enclaves; RTEMS Real-Time Executive for Multi-processor Systems; SaaS Software as a Service; SSA Static Single Assignment; SWE Software Epistemology; UD/DU Def-Use/Use-Def Chains (Dataflow Graph)

  11. A REVIEW ON SOFTWARE PRONE DETECTION AND ITS PREVENTION TECHNIQUES

    OpenAIRE

    Laxmi Dewangan & Anish Lazrus

    2018-01-01

    The demand for distributed and complex business applications in large enterprises requires error-free, high-quality application frameworks. This makes it critical in software development to build quality, fault-free software. It is likewise important to design software that is dependable and easy to maintain, since maintenance involves a great deal of human effort, cost and time during the software life cycle. A software development process performs various activities to limit faults, for example fault prediction, ...

  12. Need for multiple approaches in collaborative software development

    International Nuclear Information System (INIS)

    LePoire, D. J.

    2002-01-01

    The need to share software and reintegrate it into new applications presents a difficult but important challenge. Component-based development as an approach to this problem is receiving much attention in professional journals and academic curricula. However, there are many other approaches to collaborative software development that might be more appropriate. This paper reviews a few of these approaches and discusses criteria for the conditions and contexts in which these alternative approaches might be more appropriate. This paper complements the discussion of context-based development team organizations and processes. Examples from a small development team that interacts with a larger professional community are analyzed

  13. Comparison of Software Technologies for Vectorization and Parallelization

    CERN Document Server

    Lazzaro, Alfio; Nowak, Andrzej; Valsan, Liviu

    2012-01-01

    This paper demonstrates how modern software development methodologies can be used to give an existing sequential application a considerable performance speed-up on modern x86 server systems. Whereas, in the past, speed-up was directly linked to the increase in clock frequency when moving to a more modern system, current x86 servers present a plethora of “performance dimensions” that need to be harnessed with great care. The application we used is a real-life data analysis example in C++ analyzing High Energy Physics data. The key software methods used are OpenMP, Intel Threading Building Blocks (TBB), Intel Cilk Plus, and the auto-vectorization capability of the Intel compiler (Composer XE). Somewhat surprisingly, the Message Passing Interface (MPI) is successfully added, although our focus is on single-node rather than multi-node performance optimization. The paper underlines the importance of algorithmic redesign in order to optimize each performance dimension and links this to close control of the memo...

  14. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  15. Modeling of ultrafast THz interactions in molecular crystals

    DEFF Research Database (Denmark)

    Pedersen, Pernille Klarskov; Clark, Stewart J.; Jepsen, Peter Uhd

    2014-01-01

    In this paper we present a numerical study of terahertz pulses interacting with crystals of cesium iodide. We model the molecular dynamics of the cesium iodide crystals with the Density Functional Theory software CASTEP, where ultrafast terahertz pulses are implemented in the CASTEP software to interact with molecular crystals. We investigate the molecular dynamics of cesium iodide crystals when interacting with realistic terahertz pulses of field strengths from 0 to 50 MV/cm. We find nonlinearities in the response of the CsI crystals at field strengths higher than 10 MV/cm.

  16. What's New in Software? Hot New Tool: The Hypertext.

    Science.gov (United States)

    Hedley, Carolyn N.

    1989-01-01

    This article surveys recent developments in hypertext software, a highly interactive nonsequential reading/writing/database approach to research and teaching that allows paths to be created through related materials including text, graphics, video, and animation sources. Described are uses, advantages, and problems of hypertext. (PB)

  17. Interactive geometry inside MathDox

    NARCIS (Netherlands)

    Cuypers, H.; Hendriks, M.; Knopper, J.W.

    2010-01-01

    In this paper we describe how we envision using interactive geometry inside MathDox pages. In particular, by some examples we discuss how users and mathematical services (offered by various mathematical software packages) can interact with the geometric objects available. This not only includes

  18. Sea lamprey (Petromyzon marinus) parasite-host interactions in the Great Lakes

    Science.gov (United States)

    Bence, James R.; Bergstedt, Roger A.; Christie, Gavin C.; Cochran, Phillip A.; Ebener, Mark P.; Koonce, Joseph F.; Rutter, Michael A.; Swink, William D.

    2003-01-01

    Prediction of how host mortality responds to efforts to control sea lampreys (Petromyzon marinus) is central to the integrated management strategy for sea lamprey (IMSL) in the Great Lakes. A parasite-host submodel is used as part of this strategy, and this includes a type-2 multi-species functional response, a developmental response, but no numerical response. General patterns of host species and size selection are consistent with the model assumptions, but some observations appear to diverge. For example, some patterns in sea lamprey marking on hosts suggest increases in selectivity for less preferred hosts and lower host survival when preferred hosts are scarce. Nevertheless, many of the IMSL assumptions may be adequate under conditions targeted by fish community objectives. Of great concern is the possibility that the survival of young parasites (parasitic-phase sea lampreys) varies substantially among lakes or over time. Joint analysis of abundance estimates for parasites being produced in streams and returning spawners could address this. Data on sea lamprey marks is a critical source of information on sea lamprey activity and potential effects. Theory connecting observed marks to sea lamprey feeding activity and host mortality is reviewed. Uncertainties regarding healing and attachment times, the probability of hosts surviving attacks, and problems in consistent classification of marks have led to widely divergent estimates of damages caused by sea lamprey. Laboratory and field studies are recommended to provide a firmer linkage between host blood loss, host mortality, and observed marks on surviving hosts, so as to improve estimates of damage.
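
    For readers unfamiliar with the functional response mentioned above, a Holling type-2 multi-species form is commonly written as attacks on host type i per parasite = a_i * N_i / (1 + sum_j a_j * h_j * N_j). The short Python sketch below simply evaluates that expression; all parameter values are arbitrary illustrations and are not taken from the IMSL submodel.

    def type2_attacks(densities, attack_rates, handling_times):
        """Attacks per parasite per unit time on each host type (type-2 response)."""
        saturation = 1.0 + sum(a * h * n for a, h, n in
                               zip(attack_rates, handling_times, densities))
        return [a * n / saturation for a, n in zip(attack_rates, densities)]

    # Two host types: preferred (higher attack rate) and less preferred; values are hypothetical.
    print(type2_attacks(densities=[40.0, 100.0],
                        attack_rates=[0.05, 0.01],
                        handling_times=[0.2, 0.2]))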

  19. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development
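
    The data-driven configuration idea can be illustrated with a toy example: the automated sequence and its transition conditions live in a data file that can be edited without recompiling the code that interprets it. The format, field names, and activities below are invented for illustration and are not the Orion database tool's schema.

    import json

    SEQUENCE = json.loads("""
    [
      {"activity": "coast", "next": "burn",  "when": {"altitude_km": {"lt": 200}}},
      {"activity": "burn",  "next": "coast", "when": {"delta_v_mps": {"gt": 150}}}
    ]
    """)

    def condition_met(condition, state):
        ((key, rule),) = condition.items()
        ((op, threshold),) = rule.items()
        return state[key] < threshold if op == "lt" else state[key] > threshold

    def next_activity(current, state, sequence=SEQUENCE):
        for step in sequence:
            if step["activity"] == current and condition_met(step["when"], state):
                return step["next"]
        return current

    # Editing the JSON (data) changes the automated transition without touching this code.
    print(next_activity("coast", {"altitude_km": 180, "delta_v_mps": 0}))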

  20. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    Science.gov (United States)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  1. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  2. Family-Based Benchmarking of Copy Number Variation Detection Software.

    Science.gov (United States)

    Nutsua, Marcel Elie; Fischer, Annegret; Nebel, Almut; Hofmann, Sylvia; Schreiber, Stefan; Krawczak, Michael; Nothnagel, Michael

    2015-01-01

    The analysis of structural variants, in particular of copy-number variations (CNVs), has proven valuable in unraveling the genetic basis of human diseases. Hence, a large number of algorithms have been developed for the detection of CNVs in SNP array signal intensity data. Using the European and African HapMap trio data, we undertook a comparative evaluation of six commonly used CNV detection software tools, namely Affymetrix Power Tools (APT), QuantiSNP, PennCNV, GLAD, R-gada and VEGA, and assessed their level of pair-wise prediction concordance. The tool-specific CNV prediction accuracy was assessed in silico by way of intra-familial validation. Software tools differed greatly in terms of the number and length of the CNVs predicted as well as the number of markers included in a CNV. All software tools predicted substantially more deletions than duplications. Intra-familial validation revealed consistently low levels of prediction accuracy as measured by the proportion of validated CNVs (34-60%). Moreover, up to 20% of apparent family-based validations were found to be due to chance alone. Software using Hidden Markov models (HMM) showed a trend to predict fewer CNVs than segmentation-based algorithms albeit with greater validity. PennCNV yielded the highest prediction accuracy (60.9%). Finally, the pairwise concordance of CNV prediction was found to vary widely with the software tools involved. We recommend HMM-based software, in particular PennCNV, rather than segmentation-based algorithms when validity is the primary concern of CNV detection. QuantiSNP may be used as an additional tool to detect sets of CNVs not detectable by the other tools. Our study also reemphasizes the need for laboratory-based validation, such as qPCR, of CNVs predicted in silico.
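
    The intra-familial validation idea can be summarized in a few lines: a CNV called in an offspring counts as validated if a CNV of the same type overlaps it in at least one parent. The sketch below uses a simplified overlap rule and made-up coordinates; it is not the study's exact protocol.

    def overlaps(a, b):
        """CNVs as (chrom, start, end, type): same chromosome and type, overlapping intervals."""
        return a[0] == b[0] and a[3] == b[3] and a[1] < b[2] and b[1] < a[2]

    def validation_rate(child_cnvs, parent_cnvs):
        validated = sum(any(overlaps(c, p) for p in parent_cnvs) for c in child_cnvs)
        return validated / len(child_cnvs) if child_cnvs else 0.0

    child   = [("chr1", 100_000, 150_000, "del"), ("chr2", 5_000, 9_000, "dup")]
    parents = [("chr1",  90_000, 140_000, "del")]
    print(f"validated fraction: {validation_rate(child, parents):.2f}")   # 0.50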

  3. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  4. Current and future trends in marine image annotation software

    Science.gov (United States)

    Gomes-Pereira, Jose Nuno; Auger, Vincent; Beisiegel, Kolja; Benjamin, Robert; Bergmann, Melanie; Bowden, David; Buhl-Mortensen, Pal; De Leo, Fabio C.; Dionísio, Gisela; Durden, Jennifer M.; Edwards, Luke; Friedman, Ariell; Greinert, Jens; Jacobsen-Stout, Nancy; Lerner, Steve; Leslie, Murray; Nattkemper, Tim W.; Sameoto, Jessica A.; Schoening, Timm; Schouten, Ronald; Seager, James; Singh, Hanumant; Soubigou, Olivier; Tojeira, Inês; van den Beld, Inge; Dias, Frederico; Tempera, Fernando; Santos, Ricardo S.

    2016-12-01

    Given the need to describe, analyze and index large quantities of marine imagery data for exploration and monitoring activities, a range of specialized image annotation tools have been developed worldwide. Image annotation - the process of transposing objects or events represented in a video or still image to the semantic level - may involve human interactions and computer-assisted solutions. Marine image annotation software (MIAS) have enabled over 500 publications to date. We review the functioning, application trends and developments by comparing general and advanced features of 23 different tools utilized in underwater image analysis. MIAS requiring human input are basically a graphical user interface, with a video player or image browser that recognizes a specific time code or image code, allowing events to be logged in a time-stamped (and/or geo-referenced) manner. MIAS differ from similar software by the capability of integrating data associated with video collection, the most simple being the position coordinates of the video recording platform. MIAS have three main modes of operation: annotating events in real time, annotating after acquisition, and interacting with a database. These range from simple annotation interfaces to full onboard data management systems with a variety of toolboxes. Advanced packages allow input and display of data from multiple sensors or multiple annotators via intranet or internet. Posterior human-mediated annotation often includes tools for data display and image analysis, e.g. length, area, image segmentation, point count, and in a few cases the possibility of browsing and editing previous dive logs or analyzing the annotations. The interaction with a database allows the automatic integration of annotations from different surveys, repeated annotation and collaborative annotation of shared datasets, and browsing and querying of data. Progress in the field of automated annotation is mostly in post-processing, for stable platforms or still images

  5. A Software Reuse Approach and Its Effect On Software Quality, An Empirical Study for The Software Industry

    OpenAIRE

    Mateen, Ahmed; Kausar, Samina; Sattar, Ahsan Raza

    2017-01-01

    Software reusability has become of great interest because of increased quality and reduced cost. A good process of software reuse leads to enhanced reliability, productivity and quality, and to a reduction of time and cost. Current reuse techniques focus on the reuse of software artifacts grounded on anticipated functionality, whereas the non-functional (quality) aspects are also important. So, software reusability is used here to expand the quality and productivity of software. It improves overal...

  6. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  7. TrajAnalytics: A Web-Based Visual Analytics Software of Urban Trajectory

    OpenAIRE

    Zhao, Ye; AL-Dohuki, Shamal; Eynon, Thomas; Kamw, Farah; Sheets, David; Ma, Chao; Ye, Xinyue; Hu, Yueqi; Feng, Tinghao; Yang, Jing

    2017-01-01

    Advanced technologies in sensing and computing have created urban trajectory datasets of humans and vehicles travelling over urban road networks. Understanding and analyzing the large-scale, complex data reflecting city dynamics is of great importance to enhance both human lives and urban environments. Domain practitioners, researchers, and decision-makers need to store, manage, query and visualize such big datasets. We develop a software system named TrajAnalytics, which explicitly supports ...

  8. Surgical model-view-controller simulation software framework for local and collaborative applications.

    Science.gov (United States)

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
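
    The decoupled-simulation pattern, stripped to its essence, runs the haptic update loop in its own thread at a much higher rate than the rendering loop while both share the model state. The Python sketch below only illustrates that structure; the rates, names, and trivial "model" are invented, and the original framework itself is not reproduced here.

    import threading, time

    class Model:
        def __init__(self):
            self.position = 0.0
            self.lock = threading.Lock()

    def haptic_loop(model, rate_hz=1000, duration_s=0.05):
        for _ in range(int(duration_s * rate_hz)):
            with model.lock:
                model.position += 0.001          # stand-in for a force/position update
            time.sleep(1.0 / rate_hz)

    def render_loop(model, rate_hz=30, duration_s=0.05):
        for _ in range(int(duration_s * rate_hz)):
            with model.lock:
                snapshot = model.position        # viewer reads a consistent snapshot
            print(f"render frame at position {snapshot:.3f}")
            time.sleep(1.0 / rate_hz)

    model = Model()
    haptics = threading.Thread(target=haptic_loop, args=(model,))
    haptics.start()
    render_loop(model)
    haptics.join()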

  9. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  10. Visualization of protein interaction networks: problems and solutions

    Directory of Open Access Journals (Sweden)

    Agapito Giuseppe

    2013-01-01

    Background: Visualization concerns the representation of data visually and is an important task in scientific research. Protein-protein interactions (PPI) are discovered using either wet lab techniques, such as mass spectrometry, or in silico prediction tools, resulting in large collections of interactions stored in specialized databases. The set of all interactions of an organism forms a protein-protein interaction network (PIN) and is an important tool for studying the behaviour of the cell machinery. Since graphic representation of PINs may highlight important substructures, e.g. protein complexes, visualization is more and more used to study the underlying graph structure of PINs. Although graphs are well known data structures, there are different open problems regarding PIN visualization: the high number of nodes and connections, the heterogeneity of nodes (proteins) and edges (interactions), and the possibility to annotate proteins and interactions with biological information extracted from ontologies (e.g. Gene Ontology), which enriches the PINs with semantic information but complicates their visualization. Methods: In recent years many software tools for the visualization of PINs have been developed. Initially intended for visualization only, some of them have since been enriched with new functions for PPI data management and PIN analysis. The paper analyzes the main software tools for PIN visualization considering four main criteria: (i) technology, i.e. availability/license of the software and supported OS (Operating System) platforms; (ii) interoperability, i.e. ability to import/export networks in various formats, ability to export data in a graphic format, extensibility of the system, e.g. through plug-ins; (iii) visualization, i.e. supported layout and rendering algorithms and availability of parallel implementation; (iv) analysis, i.e. availability of network analysis functions, such as clustering or mining of the graph, and the
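
    As a minimal taste of the graph operations most of the reviewed tools expose, the Python sketch below loads a tiny made-up PPI edge list into networkx, computes a force-directed layout, and lists connected components (a crude proxy for modules). It is illustrative only and unrelated to any specific tool discussed in the paper.

    import networkx as nx

    edges = [("TP53", "MDM2"), ("TP53", "BRCA1"), ("BRCA1", "BARD1"), ("AKT1", "MTOR")]

    pin = nx.Graph()
    pin.add_edges_from(edges)

    positions = nx.spring_layout(pin, seed=42)        # force-directed node layout
    components = list(nx.connected_components(pin))   # candidate modules/complexes

    print(f"{pin.number_of_nodes()} proteins, {pin.number_of_edges()} interactions")
    print(f"{len(components)} connected components: {components}")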

  11. Design and development of virtual TXP control system software

    International Nuclear Information System (INIS)

    Wang Yunwei; Leng Shan; Liu Zhisheng; Wang Qiang; Shang Yanxia

    2008-01-01

    Taking the distributed control system (DCS) of Siemens TELEPERM-XP (TXP) as the simulation object, a Virtual TXP (VTXP) control system based on a Virtual DCS with high fidelity and reliability was designed and developed on the Windows platform. In the process of development, object-oriented modeling and modular program design were adopted, and C++ together with technologies such as multithreading, ActiveX controls and Socket network communication was used to realize wide-range dynamic simulation and recreate the functions of the hardware and software of the real TXP. This paper puts emphasis on the design and realization of the Control server and the Communication server. The development of the Virtual TXP control system software has great value for the construction of simulation systems and for the design, commissioning, verification and maintenance of control systems in large-scale power plants, nuclear power plants and combined cycle power plants. (authors)

  12. [The Development and Application of the Orthopaedics Implants Failure Database Software Based on WEB].

    Science.gov (United States)

    Huang, Jiahua; Zhou, Hai; Zhang, Binbin; Ding, Biao

    2015-09-01

    This article describes the development of a new web-based failure database software for orthopaedics implants. The software is based on the B/S mode; ASP dynamic web technology is used as its main development language to achieve data interactivity, and Microsoft Access is used to create the database; these mature technologies make the software easy to extend or upgrade. In this article, the design and development idea of the software, the software working process and functions, as well as relevant technical features are presented. With this software, many different types of fault events of orthopaedics implants can be stored, the failure data can be statistically analyzed, and, at the macroscopic level, it can be used to evaluate the reliability of orthopaedics implants and operations and ultimately guide doctors to improve the level of clinical treatment.

  13. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  14. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  15. Creating the next generation control system software

    International Nuclear Information System (INIS)

    Schultz, D.E.

    1989-01-01

    A new 1980's style support package for future accelerator control systems is proposed. It provides a way to create accelerator applications software without traditional programming. Visual Interactive Applications (VIA) is designed to meet the needs of expanded accelerator complexes in a more cost effective way than past experience with procedural languages by using technology from the personal computer and artificial intelligence communities. 4 refs

  16. Ground control station software design for micro aerial vehicles

    Science.gov (United States)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the equipment part and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All the work was conducted on a quadrocopter model, a commonly accessible commercial construction. This article contains a characterization of the research object and the basics of operating micro aerial vehicles (MAV), and presents the components of the ground control station model. It also describes the communication standards used for building a model of the station. A further part of the work concerns the software of the product - the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions and the communication and control processes of the UAV. The process of creating the software and the field tests of a station model are also presented in the article.

  17. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  18. A software for parameter estimation in dynamic models

    Directory of Open Access Journals (Sweden)

    M. Yuceer

    2008-12-01

    A common problem in dynamic systems is to determine parameters in an equation used to represent experimental data. The goal is to determine the values of model parameters that provide the best fit to measured data, generally based on some type of least squares or maximum likelihood criterion. In the most general case, this requires the solution of a nonlinear and frequently non-convex optimization problem. Some of the available software lacks generality, while other packages do not provide ease of use. A user-interactive parameter estimation software was needed for identifying kinetic parameters. In this work we developed an integration-based optimization approach to provide a solution to such problems. For easy implementation of the technique, a parameter estimation software (PARES) has been developed in the MATLAB environment. When tested with extensive example problems from the literature, the suggested approach is proven to provide good agreement between predicted and observed data within relatively less computing time and fewer iterations.
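
    The integration-based estimation loop described above can be sketched in a few lines: for each trial parameter vector the model ODE is integrated and the residuals against the measurements drive a least-squares update. The Python/SciPy example below is only an analogous illustration with a synthetic first-order decay model; PARES itself is a MATLAB tool and its internals are not reproduced here.

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    t_data = np.linspace(0.0, 5.0, 20)
    k_true = 0.8
    y_data = 2.0 * np.exp(-k_true * t_data)                  # synthetic "measurements"

    def residuals(params):
        k = params[0]
        sol = solve_ivp(lambda t, y: -k * y, (0.0, 5.0), [2.0], t_eval=t_data)
        return sol.y[0] - y_data                              # model minus data

    fit = least_squares(residuals, x0=[0.1])
    print(f"estimated k = {fit.x[0]:.3f} (true value {k_true})")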

  19. Use of Student Experiments for Teaching Embedded Software Development Including HW/SW Co-Design

    Science.gov (United States)

    Mitsui, H.; Kambe, H.; Koizumi, H.

    2009-01-01

    Embedded systems have been applied widely, not only to consumer products and industrial machines, but also to new applications such as ubiquitous or sensor networking. The increasing role of software (SW) in embedded system development has caused a great demand for embedded SW engineers, and university education for embedded SW engineering has…

  20. Developing Teaching Material Software Assisted for Numerical Methods

    Science.gov (United States)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision shows the importance of two things in school mathematics: knowing the mathematics of the 21st century and the need to continue to improve mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing the existence of various tools for mathematical activity. One of the significant challenges in mathematics learning is how to teach students abstract concepts. In this case, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity easier for students to grasp. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without spending time calculating complex computing problems manually. The purpose of this research is to design and develop software-assisted teaching material for numerical methods. The process of developing the teaching material starts with the defining step; the learning material is then designed based on information obtained from the early analysis of learners, materials, and supporting tasks; the last step is the development step. The software-assisted teaching material for numerical methods is valid in content, the validator assessment is good, and the material can be used with little revision.

  1. Spacecraft Interactions Modeling and Post-Mission Data Analysis

    National Research Council Canada - National Science Library

    Bonito, N

    1996-01-01

    Software systems were designed and developed for data management, data acquisition, interactive visualization and analysis of solar arrays, tethered objects, and large object space plasma interactions...

  2. The TSO Logic and G2 Software Product

    Science.gov (United States)

    Davis, Derrick D.

    2014-01-01

    This internship assignment for spring 2014 was at John F. Kennedy Space Center (KSC), in NASA's Engineering and Technology (NE) group in support of the Control and Data Systems Division (NE-C) within the Systems Hardware Engineering Branch (NE-C4). The primary focus was system integration and benchmarking utilizing two separate computer software products. The first half of this 2014 internship was spent assisting NE-C4's Electronics and Embedded Systems Engineer, Kelvin Ruiz, and fellow intern Scott Ditto with the evaluation of a new piece of software, called G2. It is developed by the Gensym Corporation and was introduced to the group as a tool used in monitoring launch environments. All fellow interns and employees of the G2 group have been working together in order to better understand the significance of the G2 application and how KSC can benefit from its capabilities. The second stage of this spring project was to assist with the ongoing integration of a benchmarking tool developed by a group of engineers from a Canadian-based organization known as TSO Logic. Guided by NE-C4's Computer Engineer, Allen Villorin, NASA 2014 interns put forth great effort in helping to integrate TSO's software into the Spaceport Processing Systems Development Laboratory (SPSDL) for further testing and evaluation. The TSO Logic group claims that their software is designed for monitoring and reducing energy consumption at in-house server farms and large data centers, and that it allows data centers to control the power state of servers without impacting availability or performance and without changes to infrastructure; the focus of the assignment is to test this theory. TSO's founder and CEO Aaron Rallo and CTO Chris Tivel both came to KSC to assist with the installation of their software in the SPSDL laboratory. TSO's software was installed onto 24 individual workstations running three different operating systems. The workstations were divided into three groups of 8, with each group having its

  3. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  4. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    Science.gov (United States)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of the expected cost of software maintenance, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for the complexity measurement of software, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
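
    The following sketch is only an illustration of the ranking idea described above; the metric names, weights, and module data are hypothetical and are not the parsimonious models or relative complexity metric derived in the article.

# Illustrative sketch only: a composite "relative complexity" score built from
# simple per-module metrics and used to rank modules. Metric names and weights
# are hypothetical.
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    loc: int            # lines of code
    cyclomatic: int     # cyclomatic complexity
    fan_out: int        # number of modules called

def relative_complexity(modules, weights=(0.4, 0.4, 0.2)):
    """Normalize each metric to [0, 1] across modules and combine linearly."""
    metrics = [(m.loc, m.cyclomatic, m.fan_out) for m in modules]
    maxima = [max(col) or 1 for col in zip(*metrics)]
    scores = {}
    for m, row in zip(modules, metrics):
        scores[m.name] = sum(w * v / mx for w, v, mx in zip(weights, row, maxima))
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

mods = [Module("parser", 1200, 35, 12), Module("io", 300, 8, 4), Module("ui", 800, 20, 15)]
for name, score in relative_complexity(mods):
    print(f"{name}: {score:.2f}")  # maintenance-prone modules rank first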

  5. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    Science.gov (United States)

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as Free Open Source Software because they are free and the source code is freely available, and therefore it can be easily obtained even on personal computers. Two examples of free open source software are Osirix Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost which always depends on the software provider and is always subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We will compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  6. A Scalable Cyberinfrastructure for Interactive Visualization of Terascale Microscopy Data.

    Science.gov (United States)

    Venkat, A; Christensen, C; Gyulassy, A; Summa, B; Federer, F; Angelucci, A; Pascucci, V

    2016-08-01

    The goal of the recently emerged field of connectomics is to generate a wiring diagram of the brain at different scales. To identify brain circuitry, neuroscientists use specialized microscopes to perform multichannel imaging of labeled neurons at a very high resolution. CLARITY tissue clearing allows imaging labeled circuits through entire tissue blocks, without the need for tissue sectioning and section-to-section alignment. Imaging the large and complex non-human primate brain with sufficient resolution to identify and disambiguate between axons, in particular, produces massive data, creating great computational challenges to the study of neural circuits. Researchers require novel software capabilities for compiling, stitching, and visualizing large imagery. In this work, we detail the image acquisition process and a hierarchical streaming platform, ViSUS, that enables interactive visualization of these massive multi-volume datasets using a standard desktop computer. The ViSUS visualization framework has previously been shown to be suitable for 3D combustion simulation, climate simulation and visualization of large scale panoramic images. The platform is organized around a hierarchical cache oblivious data layout, called the IDX file format, which enables interactive visualization and exploration in ViSUS, scaling to the largest 3D images. In this paper we showcase the ViSUS framework used in an interactive setting with the microscopy data.

  7. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.
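
    The sketch below illustrates the general idea behind sensitivity-analysis-based testability (a fault must be executed, infect the state, and propagate to output); the per-statement probabilities are invented for the example and would in practice come from instrumented experiments, not from this code.

# Hedged sketch: sensitivity analysis estimates, per location, the probability
# that a fault there is executed, infects the state, and propagates to output.
# The multiplication below only illustrates the idea; the probabilities would
# come from instrumentation/mutation experiments.
def testability(execution_p, infection_p, propagation_p):
    """A location's testability: chance a fault there causes a visible failure."""
    return execution_p * infection_p * propagation_p

statements = {
    "s1": (0.90, 0.50, 0.40),   # hypothetical estimated probabilities
    "s2": (0.05, 0.60, 0.10),
    "s3": (0.70, 0.02, 0.90),
}
ranked = sorted(statements, key=lambda s: testability(*statements[s]))
print("most likely to hide faults:", ranked[0])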

  8. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book is comprised of 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  9. Development of a Software Based Firewall System for Computer Network Traffic Control

    Directory of Open Access Journals (Sweden)

    Ikhajamgbe OYAKHILOME

    2009-12-01

    Full Text Available The connection of an internal network to an external network such as the Internet has made it vulnerable to attacks. One class of network attack is unauthorized penetration into the network due to the openness of networks. It is possible for hackers to gain access to an internal network, and this poses great danger to the network and network resources. Our objective and major concern in the network design was to build a secured network, based on a software firewall, that ensured the integrity and confidentiality of information on the network. We studied several mechanisms to achieve this; one such mechanism is the implementation of a firewall system as a network defence. Our developed firewall has the ability to determine which network traffic should be allowed in or out of the network. Part of our work was also channelled towards a comprehensive study of hardware firewall security systems with the aim of developing this software-based firewall system. Our software firewall goes a long way in protecting an internal network from external unauthorized traffic penetration. We also included antivirus software, which is lacking in most firewalls.
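
    As a rough illustration of the allow/deny decision such a software firewall makes for each packet (not the authors' implementation), the sketch below applies an ordered, hypothetical rule list and falls back to a default deny.

# Minimal sketch of a per-packet allow/deny decision in a software firewall.
# The rule set and packet fields are hypothetical; a real firewall also tracks
# connection state and may inspect payloads.
from ipaddress import ip_address, ip_network

RULES = [  # (action, direction, network, port)
    ("allow", "in",  ip_network("10.0.0.0/8"), 443),
    ("allow", "out", ip_network("0.0.0.0/0"), None),   # any outbound
    ("deny",  "in",  ip_network("0.0.0.0/0"), None),   # default deny inbound
]

def decide(direction, src_ip, dst_port):
    for action, rule_dir, net, port in RULES:
        if rule_dir == direction and ip_address(src_ip) in net \
                and (port is None or port == dst_port):
            return action
    return "deny"  # fail closed

print(decide("in", "10.1.2.3", 443))    # allow
print(decide("in", "203.0.113.9", 22))  # deny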

  10. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  11. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  12. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  13. Calidad del software: camino hacia una verdadera industria del software

    Directory of Open Access Journals (Sweden)

    Saulo Ernesto Rojas Salamanca

    1999-07-01

    Full Text Available Software is perhaps one of the engineering products that has evolved the most in a very short time, moving from empirical or artisanal software to software developed under the principles and tools of software engineering. Throughout these changes, however, the people in charge of building software have faced very common problems: some due to the ever-greater demands on what software must deliver, given the permanent change of conditions, which increases its complexity and obsolescence; and others due to the lack of adequate tools and organizational standards aimed at improving software development processes. This article is oriented toward the search for mechanisms to solve the latter set of problems...

  14. Design of Multithreaded Software The Entity-Life Modeling Approach

    CERN Document Server

    Sandén, Bo I

    2011-01-01

    This book assumes familiarity with threads (in a language such as Ada, C#, or Java) and introduces the entity-life modeling (ELM) design approach for certain kinds of multithreaded software. ELM focuses on "reactive systems," which continuously interact with the problem environment. These "reactive systems" include embedded systems, as well as such interactive systems as cruise controllers and automated teller machines.Part I covers two fundamentals: program-language thread support and state diagramming. These are necessary for understanding ELM and are provided primarily for reference. P

  15. ITOS to EDGE "Bridge" Software for Morpheus Lunar/Martian Vehicle

    Science.gov (United States)

    Hirsh, Robert; Fuchs, Jordan

    2012-01-01

    My project involved improving upon existing software and writing new software for the Project Morpheus team. Specifically, I created and updated Integrated Test and Operations Systems (ITOS) user interfaces for on-board interaction with the vehicle during archive playback as well as live streaming data. These interfaces are an integral part of the testing and operations for the Morpheus vehicle, providing any and all information from the vehicle to evaluate instruments and ensure coherence and control of the vehicle during Morpheus missions. I also created a "bridge" program for interfacing "live" telemetry data with the Engineering DOUG Graphics Engine (EDGE) software for a graphical (standalone or VR dome) view of live Morpheus flights or archive replays, providing a graphical representation of vehicle flight and movement during subsequent tests and in real missions.

  16. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Science.gov (United States)

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interface between software components are particularly important due to the large number of complex interactions, which requires the traditional modeling languages to overcome the existing shortcomings in the aspects of temporal information description and software testing input controlling. This paper presents the real-time extended interface automata (RTEIA) which adds clearer and more detailed temporal information description by the application of time words. We also establish the input interface automaton for every input in order to solve the problems of input controlling and interface covering nimbly when applied in the software testing field. Detailed definitions of the RTEIA and the testing cases generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of one real aircraft braking system. PMID:24892080

  17. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, cell phones, and even more critical activities like aeronautics and health sciences. In this context software failures may produce enormous losses, either economical or, in the worst case, in human lives. Software analysis is an area in software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed prioritising the possibility of adding new specification languages and analysis tools and enabling a synergic relation of the techniques under a graphical interface satisfying several well-known usability enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  18. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.

  19. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, with the second stage just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that decouples application software from low-level hardware makes it easy to adopt a "user requirements-oriented" development methodology instead of the traditional "specific function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.

  20. Applying neural networks as software sensors for enzyme engineering.

    Science.gov (United States)

    Linko, S; Zhu, Y H; Linko, P

    1999-04-01

    The on-line control of enzyme-production processes is difficult, owing to the uncertainties typical of biological systems and to the lack of suitable on-line sensors for key process variables. For example, intelligent methods to predict the end point of fermentation could be of great economic value. Computer-assisted control based on artificial-neural-network models offers a novel solution in such situations. Well-trained feedforward-backpropagation neural networks can be used as software sensors in enzyme-process control; their performance can be affected by a number of factors.
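
    A minimal sketch of the soft-sensor idea, under the assumption of a single hidden-layer feedforward network trained by backpropagation on synthetic process data; the variables, network size, and learning rate are illustrative, not those of the cited enzyme-process application.

# Illustrative soft-sensor sketch: a tiny feedforward network trained with
# backpropagation to predict a hard-to-measure variable (e.g., enzyme activity)
# from on-line process variables. Data, sizes, and rates are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((200, 3))                    # e.g., pH, temperature, feed rate
y = (0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.2 * X[:, 2]).reshape(-1, 1)

W1, b1 = rng.normal(0, 0.5, (3, 8)), np.zeros((1, 8))
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros((1, 1))
lr = 0.1

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                # hidden layer
    pred = h @ W2 + b2                      # linear output
    err = pred - y
    # backpropagation of the squared-error gradient
    dW2 = h.T @ err / len(X)
    db2 = err.mean(axis=0, keepdims=True)
    dh = (err @ W2.T) * (1 - h ** 2)
    dW1 = X.T @ dh / len(X)
    db1 = dh.mean(axis=0, keepdims=True)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

print("training MSE:", float((err ** 2).mean()))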

  1. Implementation and application of the gaussian decomposition software for NaI gamma-ray spectrometry data

    International Nuclear Information System (INIS)

    Zeng Lihui; Wang Nanping; Tian Gui

    2012-01-01

    In order to extract peak information at different energies from overlapping peaks in environmental gamma-ray spectrometry data, a Gaussian decomposition software package was designed based on the least-squares Gaussian fitting method. The interface of this software is friendly, and it can quickly decompose overlapping peaks in a gamma-ray spectrum through interactive man-machine operation. The results of decomposing field-measured data with this software indicate that it can efficiently extract the 137Cs spectrum from overlapping peaks, which is significant for estimating man-made radionuclide contamination in the environment. (authors)
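
    The least-squares Gaussian fitting idea can be illustrated with a small sketch such as the one below, which decomposes two synthetic overlapping peaks; the spectrum, peak positions, and initial guesses are assumptions, not measured gamma-ray data or the authors' implementation.

# A minimal least-squares decomposition of two overlapping Gaussian peaks.
# The synthetic spectrum and starting guesses are assumptions.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    g = lambda a, mu, s: a * np.exp(-0.5 * ((x - mu) / s) ** 2)
    return g(a1, mu1, s1) + g(a2, mu2, s2)

x = np.linspace(600, 720, 400)                      # energy axis (keV)
truth = (100, 662, 8, 40, 680, 10)                  # e.g., a 137Cs peak plus a neighbour
y = two_gaussians(x, *truth) + np.random.default_rng(1).normal(0, 2, x.size)

p0 = (80, 660, 10, 30, 685, 10)                     # rough initial guesses
popt, _ = curve_fit(two_gaussians, x, y, p0=p0)
print("fitted peak centres:", popt[1], popt[4])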

  2. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed using two metric-based software reliability analysis methods: a state-transition-diagram-based method and a test-coverage-based method. The procedures for the software reliability analysis using the two methods and the analysis results are provided in this report. It is found that the two methods complement each other, and therefore further research on combining the two methods, so that the benefit of this complementary effect is reflected in the software reliability analysis, is recommended.

  3. Automated Security Testing of Web Widget Interactions

    NARCIS (Netherlands)

    Bezemer, C.P.; Mesbah, A.; Van Deursen, A.

    2009-01-01

    This paper is a pre-print of: Cor-Paul Bezemer, Ali Mesbah, and Arie van Deursen. Automated Security Testing of Web Widget Interactions. In Proceedings of the 7th joint meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering

  4. [Development and practice evaluation of blood acid-base imbalance analysis software].

    Science.gov (United States)

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, a computer-aided diagnostic software package for judging acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. The arterial blood gas values [pH, HCO₃⁻, arterial partial pressure of carbon dioxide (PaCO₂)] and electrolytes (Na⁺ and Cl⁻) were collected. The data were entered into the software for judgment of acid-base imbalance. At the same time, the type of acid-base imbalance was determined by manual calculation with the H-H (Henderson-Hasselbalch) compensation formula. The consistency of the judgments from the software and from manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of simple type, 41 of mixed type, and 24 of triple type. Compared with manual calculation, the accuracy of the software's judgment was 100% for the normal and triple types, 98.9% for the simple type and 78.0% for the mixed type, for a total accuracy of 95.5%. The Kappa value between the software and manual judgments was 0.935, P=0.000, demonstrating very good consistency. The time for the software to determine acid-base imbalance was significantly shorter than for manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment; it is rapid, accurate and convenient, can improve the work efficiency and quality of clinical doctors and has great
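
    For illustration only (this is neither the authors' validated software nor suitable for clinical use), the sketch below shows the kind of rule-based judgment of a primary acid-base disturbance that such a tool automates, using commonly cited normal ranges for pH, PaCO₂ and HCO₃⁻.

# Illustration only (not the authors' validated software, not for clinical use):
# a rule-based judgment of the primary acid-base disturbance from pH,
# PaCO2 (mmHg) and HCO3- (mmol/L).
def primary_disorder(ph, paco2, hco3):
    if 7.35 <= ph <= 7.45:
        return "pH in normal range (mixed disorders not excluded)"
    if ph < 7.35:  # acidemia
        if hco3 < 22:
            return "metabolic acidosis"
        if paco2 > 45:
            return "respiratory acidosis"
    else:          # alkalemia
        if hco3 > 26:
            return "metabolic alkalosis"
        if paco2 < 35:
            return "respiratory alkalosis"
    return "indeterminate"

print(primary_disorder(ph=7.25, paco2=30, hco3=13))  # metabolic acidosis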

  5. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve these goals, the safety critical software should be verified and tested according to the related codes and standards through verification and validation (V and V) activities. Safety critical software testing is performed at various stages during the development of the software, and is generally classified into three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module-level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance costs, it is important that software testing is carried out at module level. Module testing for nuclear safety critical software has rarely been performed with formal and proven testing tools because of its various constraints. The LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency

  6. Quad channel software defined receiver for passive radar application

    Directory of Open Access Journals (Sweden)

    Pető Tamás

    2017-03-01

    Full Text Available In recent times, the growing utilization of the electromagnetic environment has brought passive radar research more and more to the fore. To exploit the wide range of illuminators of opportunity, the application of wideband radio receivers is required. At the same time, the multichannel receiver structure is of critical importance for target direction finding and interference suppression. This paper presents the development of a multichannel software-defined receiver specifically for passive radar applications. One of the relevant features of the developed receiver platform is its up-to-date SoC (System on Chip) based structure, which greatly enhances the integration and signal-processing capacity of the system, all while keeping costs low. The software-defined operation of the discussed receiver system is demonstrated using a DVB-T (Digital Video Broadcast – Terrestrial) signal as the illuminator of opportunity. During this demonstration, the multichannel capabilities of the realized system are also tested with real data using direction finding and beamforming algorithms.
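
    A hedged sketch of one multichannel processing step such a receiver enables, narrowband delay-and-sum beamforming across four channels to estimate a direction of arrival; the array geometry, carrier frequency, and signals are synthetic assumptions, not the processing chain of the described hardware.

# Narrowband delay-and-sum beamforming on four receiver channels.
# Geometry, carrier, and signals are synthetic.
import numpy as np

c, f0 = 3e8, 600e6                      # propagation speed, DVB-T-like carrier (Hz)
lam = c / f0
d = lam / 2                             # half-wavelength element spacing
angles = np.deg2rad(np.arange(-90, 91))
n_elem = 4

def steering(theta):
    return np.exp(-2j * np.pi * d / lam * np.arange(n_elem) * np.sin(theta))

# Synthetic snapshot: one source at +20 degrees plus noise
rng = np.random.default_rng(2)
s = rng.normal(size=1000) + 1j * rng.normal(size=1000)
X = np.outer(steering(np.deg2rad(20)), s) + 0.1 * (rng.normal(size=(n_elem, 1000))
                                                   + 1j * rng.normal(size=(n_elem, 1000)))

# Scan the beam and pick the direction with maximum output power
power = [np.mean(np.abs(steering(th).conj() @ X) ** 2) for th in angles]
print("estimated direction of arrival (deg):", np.rad2deg(angles[int(np.argmax(power))]))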

  7. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto's Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using the software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, is also identified as a possible limitation when applying the software reliability growth models to safety-critical software
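
    As a small, self-contained illustration of one of the models mentioned above, the sketch below fits the Goel-Okumoto NHPP mean-value function m(t) = a(1 - e^(-bt)) to invented cumulative failure counts by least squares; it is not an endorsement of applying such fits to safety-critical software.

# Fitting the Goel-Okumoto NHPP mean-value function to cumulative failure
# counts by least squares (a simplification; maximum likelihood is also common).
# The failure data are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

t = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)      # test weeks
cum_failures = np.array([5, 9, 13, 15, 17, 19, 20, 21, 21, 22], dtype=float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, t, cum_failures, p0=(25.0, 0.3))
print(f"expected total faults a = {a_hat:.1f}, detection rate b = {b_hat:.2f}")
print("expected residual faults:", a_hat - cum_failures[-1])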

  8. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability

  9. Stories Matter: Conceptual Challenges in the Development of Oral History Database Building Software

    Directory of Open Access Journals (Sweden)

    Erin Jessee

    2010-11-01

    Full Text Available Stories Matter is new oral history database building software designed by an interdisciplinary team of oral historians and a software engineer affiliated with the Centre for Oral History and Digital Storytelling at Concordia University in Montreal, Quebec, Canada. It encourages a shift away from transcription, enabling oral historians to continue to interact with their interviews in an efficient manner without compromising the greater life history context of their interviewees. This article addresses some of the conceptual challenges that arose when developing this software. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs110119

  10. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development but it is poorly supported by common development environments which focus mainly on low level programming tasks. We posit the need for agile software assessment which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  11. Wingridder - an interactive grid generator for TOUGH2

    International Nuclear Information System (INIS)

    Pan, Lehua

    2003-01-01

    The TOUGH (Transport Of Unsaturated Groundwater and Heat) family of codes has great flexibility in handling the variety of grid information required to describe a complex subsurface system. However, designing and generating such a grid can be a tedious and error-prone process, especially when the number of cells and connections is very large. WinGridder has been developed as user-friendly, efficient, and effective grid-generation software for designing, generating, and visualizing (at various spatial scales) numerical grids used in reservoir simulations and groundwater modeling studies. It can save mesh files for the TOUGH family of codes and can also output additional grid information for various purposes in either graphic or plain-text format. It has a user-friendly graphical user interface, along with easy-to-use interactive design and plotting tools. Many important features, such as inclined faults and offsets, layering structure, local refinements, and embedded engineering structures, can be represented in the grid.

  12. Great Apes

    Science.gov (United States)

    Sleeman, Jonathan M.; Cerveny, Shannon

    2014-01-01

    Anesthesia of great apes is often necessary to conduct diagnostic analysis, provide therapeutics, facilitate surgical procedures, and enable transport and translocation for conservation purposes. Due to the stress of remote delivery injection of anesthetic agents, recent studies have focused on oral delivery and/or transmucosal absorption of preanesthetic and anesthetic agents. Maintenance of the airway and provision of oxygen is an important aspect of anesthesia in great ape species. The provision of analgesia is an important aspect of the anesthesia protocol for any procedure involving painful stimuli. Opioids and nonsteroidal anti-inflammatory drugs (NSAIDs) are often administered alone, or in combination to provide multi-modal analgesia. There is increasing conservation management of in situ great ape populations, which has resulted in the development of field anesthesia techniques for free-living great apes for the purposes of translocation, reintroduction into the wild, and clinical interventions.

  13. Overview of the software for the Telemation/Sandia unattended video surveillance system

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1979-10-01

    A microprocessor has been used to provide the major control functions in the Telemation/Sandia unattended video surveillance system. The software in the microprocessor provides control of the various hardware components and provides the capability of interactive communications with the operator. This document, in conjunction with the commented source listing, defines the philosophy and function of the software. It is assumed that the reader is familiar with the RCA 1802 COSMAC microprocessor and has a reasonable computer science background

  14. The manual of a computer software 'FBR Plant Planning Design Prototype System'

    International Nuclear Information System (INIS)

    2003-10-01

    This is the manual of the computer software 'FBR Plant Planning Design Prototype System', which enables users to conduct case studies of FBR design concepts deviating from 'MONJU'. The calculations proceed simply as the user clicks the displayed buttons, so a step-by-step explanation should not be necessary. The following pages introduce only the particular features of this software, i.e., the interactive screens, the functions of the buttons and the consequences of clicking them, and the quitting procedure. (author)

  15. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  16. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13


  17. The threshold contrast thickness evaluated with different CDMAM phantoms and software

    Directory of Open Access Journals (Sweden)

    Fabiszewska Ewa

    2016-03-01

    Full Text Available The image quality in digital mammography is described by specifying the thickness and diameter of disks with threshold visibility. The European Commission recommends the CDMAM phantom as a tool to evaluate threshold contrast visibility in digital mammography [1, 2]. Inaccuracy of the manufacturing process of CDMAM 3.4 phantoms (Artinis Medical System BV), as well as differences between the software used to analyze the images, may lead to discrepancies in the evaluation of threshold contrast visibility. The authors of this work used three CDMAM 3.4 phantoms with serial numbers 1669, 1840, and 1841 and two mammography systems of the same manufacturer with identical types of detectors. The images were analyzed with EUREF software (version 1.5.5 with the CDCOM 1.6.exe file) and Artinis software (version 1.2 with the CDCOM 1.6.exe file). The differences between the observed thicknesses of the threshold contrast structures, which were caused by differences between the CDMAM 3.4 phantoms, were not reproduced in the same way on two mammography units of the same type. The thickness reported by the Artinis software (version 1.2 with the CDCOM 1.6.exe file) was generally greater than the one determined by the EUREF software (version 1.5.5 with the CDCOM 1.6.exe file), but the ratio of the results depended on the phantom and the diameter of the structure. It was not possible to establish correction factors which would allow correction of the differences between the results obtained for different CDMAM 3.4 phantoms, or to correct the differences between software. Great care must be taken when results of tests performed with different CDMAM 3.4 phantoms and with different software applications are interpreted.

  18. Towards a mature measurement environment: Creating a software engineering research environment

    Science.gov (United States)

    Basili, Victor R.

    1990-01-01

    Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up and done in isolation, so the pieces cannot be easily logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation, or refined and tailored to the application environment. Thus, they cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole or the relationships between various models of the process and product. What is needed is a top-down, experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.

  19. Addressing Software Engineering Issues in Real-Time Software ...

    African Journals Online (AJOL)

    Addressing Software Engineering Issues in Real-Time Software ... systems, manufacturing process, process control, military, space exploration, and ... but also physical properties such as timeliness, Quality of Service and reliability.

  20. Software Replica of Minimal Living Processes

    Science.gov (United States)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often limited in reproducing and understanding the self-organization phenomena resulting from the non-linear and spatially grounded interactions of a huge number of diverse biological objects. Since John Von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only software development and execution make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  1. Composing simulations using persistent software components

    Energy Technology Data Exchange (ETDEWEB)

    Holland, J.V.; Michelsen, R.E.; Powell, D.R.; Upton, S.C.; Thompson, D.R.

    1999-03-01

    The traditional process for developing large-scale simulations is cumbersome, time consuming, costly, and in some cases, inadequate. The topics of software components and component-based software engineering are being explored by software professionals in academic and industrial settings. A component is a well-delineated, relatively independent, and replaceable part of a software system that performs a specific function. Many researchers have addressed the potential to derive a component-based approach to simulations in general, and a few have focused on military simulations in particular. In a component-based approach, functional or logical blocks of the simulation entities are represented as coherent collections of components satisfying explicitly defined interface requirements. A simulation is a top-level aggregate comprised of a collection of components that interact with each other in the context of a simulated environment. A component may represent a simulation artifact, an agent, or any entity that can generate events affecting itself, other simulated entities, or the state of the system. The component-based approach promotes code reuse, contributes to reducing time spent validating or verifying models, and promises to reduce the cost of development while still delivering tailored simulations specific to analysis questions. The Integrated Virtual Environment for Simulation (IVES) is a composition-centered framework to achieve this potential. IVES is a Java implementation of simulation composition concepts developed at Los Alamos National Laboratory for use in several application domains. In this paper, its use in the military domain is demonstrated via the simulation of dismounted infantry in an urban environment.

  2. Investigation of interfacing NPP simulators with PSCAD software

    International Nuclear Information System (INIS)

    Zahedi, P.

    2006-01-01

    This paper demonstrates a new method of interfacing an NPP simulator with the PSCAD software so that the interaction between the plant dynamics and the grid dynamics can be studied. Through this method, data extracted from the NPP simulator, as an object, are communicated to a server running the PSCAD software. This paper illustrates the use of Remote Procedure Calls (RPC) as a distributed, client-server based application used to establish a connection between the NPP simulator running on a UNIX server and PSCAD running on a PC. Using this method, the called procedure need not exist in the same address space as the calling procedure. Communication protocols available in the NPP simulator are used to export data from the simulator. (author)

  3. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  4. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of the defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see whether any metrics can be implemented in their software assurance life cycle process.

  5. A foundation for translating user interaction designs into OSF/Motif-based software

    OpenAIRE

    Hinson, Kenneth Paul

    1994-01-01

    The user interface development process occurs in a behavioral domain and in a constructional domain. The development process in the behavioral domain focuses on the "look and feel" of the user interface and its behavior in response to user actions. The development process in the constructional domain focuses on developing software to implement the user interface. Although one may attempt to design a user interface from a constructional view, it is important to concentrate desig...

  6. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  7. Application programming in C# environment with recorded user software interactions and its application in autopilot of VMAT/IMRT treatment planning.

    Science.gov (United States)

    Wang, Henry; Xing, Lei

    2016-11-08

    An autopilot scheme for volumetric-modulated arc therapy (VMAT)/intensity-modulated radiation therapy (IMRT) planning guided by prior knowledge is established with recorded interactions between a planner and a commercial treatment planning system (TPS). Microsoft (MS) Visual Studio Coded UI is applied to record some common planner-TPS interactions as subroutines. The TPS used in this study is a Windows-based Eclipse system. The interactions of our application program with the Eclipse TPS are realized through a series of subroutines obtained by prerecording the mouse clicks or keyboard strokes of a planner operating the TPS. A strategy to autopilot the Eclipse VMAT/IMRT plan selection process is developed as a specific example of the proposed "scripting" method. The autopiloted planning is navigated by a decision function constructed with a reference plan that has the same prescription and similar anatomy as the case at hand. The calculation proceeds by alternating between the Eclipse optimization and an outer-loop optimization independent of Eclipse. In the C# program, the dosimetric characteristics of a reference treatment plan are used to assess and modify the Eclipse planning parameters and to guide the search for a clinically sensible treatment plan. The approach is applied to plan a head and neck (HN) VMAT case and a prostate IMRT case. Our study demonstrated the feasibility of application programming in the C# environment with recorded planner-TPS interactions. The process mimics a planner's planning process and automatically provides clinically sensible treatment plans that would otherwise require a large amount of manual trial and error by a planner. The proposed technique enables us to harness a commercial TPS by application programming via recorded human-computer interactions and provides an effective tool to greatly facilitate the treatment planning process. © 2016 The Authors.

  8. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yi, E-mail: y.wang@fkf.mpg.de; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y.; Aken, Peter A. van

    2016-09-15

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO₆ octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. - Highlights: • We report a software tool for mapping atomic positions from HAADF and ABF images. • It enables quantification of both crystal lattice and oxygen octahedral distortions. • We test the measurement accuracy and precision on simulated and experimental images. • It works well for different orientations of perovskite structures and interfaces.
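
    The two column-locating approaches mentioned in the abstract, center-of-mass and 2D Gaussian fitting, can be sketched as below on a synthetic image patch; the patch and parameters are assumptions standing in for a real STEM image, and this is not the published tool.

# Locating one atomic column in a small synthetic patch by centre of mass
# (fast, intensity-weighted) and by a 2D Gaussian fit (sub-pixel refinement).
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sx, sy, offset):
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                           + (y - y0) ** 2 / (2 * sy ** 2))) + offset).ravel()

yy, xx = np.mgrid[0:15, 0:15]
patch = gauss2d((xx, yy), 100, 7.3, 6.8, 2.0, 2.0, 5).reshape(15, 15)
patch += np.random.default_rng(3).normal(0, 1, patch.shape)

# Centre of mass
total = patch.sum()
com_x, com_y = (patch * xx).sum() / total, (patch * yy).sum() / total

# 2D Gaussian fit, seeded with the centre-of-mass estimate
p0 = (patch.max(), com_x, com_y, 2.0, 2.0, patch.min())
popt, _ = curve_fit(gauss2d, (xx, yy), patch.ravel(), p0=p0)
print(f"centre of mass: ({com_x:.2f}, {com_y:.2f}); Gaussian fit: ({popt[1]:.2f}, {popt[2]:.2f})")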

  9. Modelling the critical success factors of agile software development projects in South Africa

    Directory of Open Access Journals (Sweden)

    Tawanda B. Chiyangwa

    2017-10-01

    Full Text Available Background: The continued failure of both agile and traditional software development projects has led to increased consideration of, attention to and dispute over critical success factors, that is, the aspects that are most vital to making a software engineering methodology fruitful. Although there is an increasing variety of critical success factors and methodologies, conceptual frameworks that capture their causal relationships are limited. Objective: The objective of this study was to identify and provide insights into the critical success factors that influence the success of software development projects using agile methodologies in South Africa. Method: A quantitative data collection method was used. Data were collected in South Africa through a Web-based survey using structured questionnaires. Results: The results show that organisational factors have a great influence on performance expectancy characteristics. Conclusion: This study yielded a comprehensive model that could provide guidelines to the agile community and to agile professionals.

  10. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  11. Scientific Visualization of Radio Astronomy Data using Gesture Interaction

    Science.gov (United States)

    Mulumba, P.; Gain, J.; Marais, P.; Woudt, P.

    2015-09-01

    MeerKAT in South Africa (Meer = More Karoo Array Telescope) will require software to help visualize, interpret and interact with multidimensional data. While visualization of multi-dimensional data is a well explored topic, little work has been published on the design of intuitive interfaces to such systems. More specifically, the use of non-traditional interfaces (such as motion tracking and multi-touch) has not been widely investigated within the context of visualizing astronomy data. We hypothesize that a natural user interface would allow for easier data exploration which would in turn lead to certain kinds of visualizations (volumetric, multidimensional). To this end, we have developed a multi-platform scientific visualization system for FITS spectral data cubes using VTK (Visualization Toolkit) and a natural user interface to explore the interaction between a gesture input device and multidimensional data space. Our system supports visual transformations (translation, rotation and scaling) as well as sub-volume extraction and arbitrary slicing of 3D volumetric data. These tasks were implemented across three prototypes aimed at exploring different interaction strategies: standard (mouse/keyboard) interaction, volumetric gesture tracking (Leap Motion controller) and multi-touch interaction (multi-touch monitor). A Heuristic Evaluation revealed that the volumetric gesture tracking prototype shows great promise for interfacing with the depth component (z-axis) of 3D volumetric space across multiple transformations. However, this is limited by users needing to remember the required gestures. In comparison, the touch-based gesture navigation is typically more familiar to users as these gestures were engineered from standard multi-touch actions. Future work will address a complete usability test to evaluate and compare the different interaction modalities against the different visualization tasks.

  12. Configurable software for satellite graphics

    Energy Technology Data Exchange (ETDEWEB)

    Hartzman, P D

    1977-12-01

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  13. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  14. Study of fault diagnosis software design for complex system based on fault tree

    International Nuclear Information System (INIS)

    Yuan Run; Li Yazhou; Wang Jianye; Hu Liqin; Wang Jiaqun; Wu Yican

    2012-01-01

    Complex systems always have demanding reliability and safety requirements, and so does their diagnosis. Since a great number of fault tree models are acquired during the design and operation phases, a fault diagnosis method that combines fault tree analysis with knowledge-based technology has been proposed. A prototype of the fault diagnosis software has been implemented and applied to a mobile LIDAR system. (authors)
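
    The record gives no details of the authors' models; the toy sketch below (all gate and event names invented) only illustrates the fault-tree side of such a scheme: expanding an AND/OR tree into its minimal cut sets, which are the kind of fault hypotheses a knowledge-based diagnoser can reason over.

        # Toy fault tree: OR gates add alternatives, AND gates combine them.
        tree = {
            "TOP":       ("OR",  ["PUMP_FAIL", "CTRL_FAIL"]),
            "PUMP_FAIL": ("AND", ["PUMP_A", "PUMP_B"]),
            "CTRL_FAIL": ("OR",  ["CPU", "SENSOR"]),
        }

        def cut_sets(event):
            """Top-down expansion of an event into cut sets (MOCUS-like)."""
            if event not in tree:                       # basic event
                return [frozenset([event])]
            gate, children = tree[event]
            child_sets = [cut_sets(c) for c in children]
            if gate == "OR":
                return [s for sets in child_sets for s in sets]
            result = [frozenset()]                      # AND: cartesian combination
            for sets in child_sets:
                result = [a | b for a in result for b in sets]
            return result

        def minimal(sets):
            """Keep only cut sets that contain no smaller cut set."""
            return [s for s in sets if not any(t < s for t in sets)]

        # Expected: {PUMP_A, PUMP_B}, {CPU}, {SENSOR}
        print(minimal(cut_sets("TOP")))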

  15. Real-Time Extended Interface Automata for Software Testing Cases Generation

    Directory of Open Access Journals (Sweden)

    Shunkun Yang

    2014-01-01

    Full Text Available Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and in controlling test inputs. This paper presents real-time extended interface automata (RTEIA), which add a clearer and more detailed description of temporal information through the use of time words. We also establish an input interface automaton for every input, in order to handle input control and interface coverage flexibly in software testing. Detailed definitions of the RTEIA and of the test case generation algorithm are provided in this paper. The feasibility and efficiency of the method have been verified by testing a real aircraft braking system.
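
    The formal definitions of RTEIA are not reproduced in this record; the sketch below is only a minimal illustration of the general idea, with invented names and a toy braking-system model: each transition carries an input action and a time window, and test cases are enumerated as timed input sequences.

        from dataclasses import dataclass
        from typing import List

        @dataclass(frozen=True)
        class Transition:
            src: str
            action: str       # input action fed to the component under test
            window: tuple     # (earliest, latest) admissible firing time in seconds
            dst: str

        def generate_tests(transitions: List[Transition], start: str, depth: int):
            """Enumerate timed input sequences up to a given length (DFS)."""
            def walk(state, prefix):
                if len(prefix) == depth:
                    yield prefix
                    return
                extended = False
                for t in transitions:
                    if t.src == state:
                        extended = True
                        # use the earliest admissible firing time as the stimulus
                        yield from walk(t.dst, prefix + [(t.action, t.window[0])])
                if not extended and prefix:
                    yield prefix
            yield from walk(start, [])

        if __name__ == "__main__":
            brake = [
                Transition("idle", "pedal_pressed", (0.0, 0.1), "braking"),
                Transition("braking", "wheel_lock", (0.0, 0.05), "abs_active"),
                Transition("abs_active", "pedal_released", (0.0, 2.0), "idle"),
            ]
            for case in generate_tests(brake, "idle", 3):
                print(case)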

  16. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  17. Engagement and Learning through Social Software in Finance: A Retrospective on the "Trading Room" Experience

    Science.gov (United States)

    Jain, Ameeta; Thomson, Dianne; Farley, Alan; Mulready, Pamela

    2012-01-01

    The introduction of a social software blog space called the Trading Room in an undergraduate finance unit generated a great deal of activity to support student learning. A subsequent evaluation of this innovation, viewed through the lens of Activity Theory, demonstrated that students perceived high value in the opportunity it provided for them to…

  18. Software Tools for In-Situ Documentation of Built Heritage

    Science.gov (United States)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to move a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).
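
    As a purely illustrative example of the kind of on-site shape analysis mentioned above (synthetic survey points, not the author's tools), an algebraic least-squares circle fit gives the centre, radius and out-of-roundness of a column cross-section directly in the field:

        import numpy as np

        # Synthetic surveyed points on a column cross-section (metres).
        rng = np.random.default_rng(1)
        theta = rng.uniform(0, 2 * np.pi, 40)
        x = 2.10 + 0.45 * np.cos(theta) + rng.normal(0, 0.003, 40)
        y = 5.30 + 0.45 * np.sin(theta) + rng.normal(0, 0.003, 40)

        # Kasa fit: solve  x^2 + y^2 = 2a*x + 2b*y + c  in the least-squares sense.
        A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
        a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
        radius = np.sqrt(c + a**2 + b**2)
        residuals = np.hypot(x - a, y - b) - radius        # out-of-roundness
        print(f"centre=({a:.3f}, {b:.3f}) m, radius={radius:.3f} m, "
              f"max deviation={np.abs(residuals).max() * 1000:.1f} mm")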

  19. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  20. Software maintenance and evolution and automated software engineering

    NARCIS (Netherlands)

    Carver, Jeffrey C.; Serebrenik, Alexander

    2018-01-01

    This issue's column reports on the 33rd International Conference on Software Maintenance and Evolution and 32nd International Conference on Automated Software Engineering. Topics include flaky tests, technical debt, QA bots, and regular expressions.

  1. Can easy-to-use software deliver effective e-learning in dental education? A randomised controlled study.

    Science.gov (United States)

    Woelber, J P; Hilbert, T S; Ratka-Krüger, P

    2012-08-01

    For the production of computer-based learning environments, a wide range of software solutions can be used, which differ not only in their functionality but also in cost and ease of programming. The aim of our study was to evaluate the overall efficiency and students' perception of two case-based e-learning programs that were produced with either easy-to-use or complex software. Eighty-five dental students were randomly assigned to one of two experimental groups. One group studied with a laborious, highly interactive e-learning program (complex-software group). The second group studied within a low-interactive learning environment (easy-software group) that was easy to program. Both programs identically referred to a case report on localised aggressive periodontitis. Learning outcome was tested by a pre- and post-test. Furthermore, questionnaires on workload, motivation, perceived usefulness and perceived learning outcome were used. Learners in the easy-software group showed better results in the post-test F(1, 82) = 4.173, P software tools have the potential to be beneficial in dental education. Students showed high acceptance of, and ability in using, both e-learning environments. We conclude that e-learning programs for case-based learning do not have to be overly laborious to program to be useful. Based on our results, we want to encourage instructors to produce case-based e-learning tools with easy-to-use software. © 2012 John Wiley & Sons A/S.

  2. Recent Survey and Application of the simSUNDT Software

    Science.gov (United States)

    Persson, G.; Wirdelius, H.

    2010-02-01

    The simSUNDT software is based on a previously developed program (SUNDT). The latest version has been customized to generate realistic synthetic data (including a grain noise model), compatible with a number of off-line analysis software packages. The software consists of a Windows®-based preprocessor and postprocessor together with a mathematical kernel (UTDefect), dealing with the actual mathematical modeling. The model employs various integral transforms and integral equations and enables simulation of the entire ultrasonic testing situation. The model is completely three-dimensional, though the simulated component is two-dimensional, bounded by the scanning surface and, optionally, a planar back surface. It is of great importance that the inspection methods applied are properly validated and that their capability to detect cracks and defects is quantified. To achieve this, statistical methods such as probability of detection (POD) are often applied, with the ambition of estimating detectability as a function of defect size. Besides being very expensive, the established procedure based on physical test pieces also tends to introduce a number of possible misalignments between the actual NDT situation to be performed and the experimental simulation. The presentation describes the developed model, which enables simulation of a phased array NDT inspection, and the ambition to use this simulation software to generate POD information. The paper also includes the most recent developments of the model, including some initial experimental validation of the phased array probe model.
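
    As a generic illustration of how POD information is commonly summarised (simulated hit/miss data, not simSUNDT output), a logistic model of detection probability versus defect size can be fitted and the size detected with 90% probability read off:

        import numpy as np

        rng = np.random.default_rng(0)
        size = rng.uniform(0.5, 5.0, 200)                    # defect size in mm
        true_pod = 1.0 / (1.0 + np.exp(-2.5 * (size - 2.0)))
        hit = rng.random(200) < true_pod                     # True = detected

        # Plain gradient ascent on the Bernoulli log-likelihood of a logistic model.
        a, b = 0.0, 0.0
        for _ in range(20000):
            p = 1.0 / (1.0 + np.exp(-(a + b * size)))
            a += 0.05 * np.mean(hit - p)
            b += 0.05 * np.mean((hit - p) * size)

        a90 = (np.log(0.9 / 0.1) - a) / b                    # size detected with 90% probability
        print(f"POD(s) = 1/(1+exp(-({a:.2f} + {b:.2f} s)));  a90 ≈ {a90:.2f} mm")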

  3. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; increasing reliability by means of software redundancy methods. Maintenance of software for long-term operating behavior. (HP) [de

  4. An Application-oriented Open Software Platform for Multi-purpose Field Robotics

    DEFF Research Database (Denmark)

    Jensen, Kjeld

    that solve simple tasks such as mowing, and automatic tractor steering that navigates through a planned route under the supervision of an operator. The outdoor environment in which the robot operates is often very complex. This places great demands on the robot's ability to perceive the environment and, based on this, behave in a way that is appropriate and productive with respect to the given task while being safe for nearby people, animals and objects. Researchers are challenged by the considerable resources required to develop robot software capable of supporting experiments in such complex perception and behavior. The lack of collaboration between research groups contributes to the problem: scientific publications describe methods and results from the work, but little software for field robots is released and documented for use by others. The hypothesis of this work is that an application oriented

  5. Work–family balance of Indian women software professionals: A qualitative study

    Directory of Open Access Journals (Sweden)

    Reimara Valk

    2011-03-01

    Full Text Available One of the significant changes witnessed in the labour markets in India has been the entry of women IT professionals in the rapidly growing software services sector. As the women take on the role of working professional in addition to their traditional role of the homemaker, they are under great pressure to balance their work and personal lives. This study attempts to understand how work and family related factors influence the work–family balance of Indian women IT professionals. The study is based on an exploratory qualitative study of 13 women IT professionals in the software sector in Bangalore, India. The narratives reveal six major themes: familial influences on life choices; multi-role responsibilities and attempts to negotiate them; self and professional identity; work–life challenges and coping strategies; organisational policies and practices; and social support.

  6. Intrafirm knowledge transfer of agile software practices: barriers and their relations

    DEFF Research Database (Denmark)

    Tordrup Heeager, Lise; Nielsen, Peter Axel

    2018-01-01

    Agile software practices are widely used in a great variety of organizations, and the shift from traditional plan-driven approaches entails a redefinition of processes in these organizations. Intrafirm knowledge transfer of agile software practices between projects is a key concern in this redefinition. While knowledge transfer is essential for an organization to develop or keep its competitive advantage, it is also both difficult and time consuming, due to a wide range of barriers. Transferring knowledge on agile practices is even more complex due to there being a high degree of tacit knowledge ... Drawing on a framework of barriers to knowledge transfer, we modify and extend the framework to transferring knowledge of agile practices. This framework is subsequently applied for interpreting and analyzing the case study data. The analysis shows how these barriers (e.g., the organizational culture, time and resources, knowledge strategy ...

  7. Impact of Internet of Things on Software Business Model and Software Industry

    OpenAIRE

    Murari, Bhanu Teja

    2016-01-01

    Context: Internet of Things (IoT) technology is spreading rapidly and is changing the business environment for software organizations. There is a need to understand which business model factors a software company should focus on to obtain benefits from the potential that IoT offers. This thesis also focuses on the impact of IoT on the software business model and the software industry, especially on software development. Objectives: In this thesis, we do research on IoT software b...

  8. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    Science.gov (United States)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  9. The Software Invention Cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice and this raises the question, what constitutes an invention in the realm of software. The authors developed the Software Invention Cube

  10. PySE: Software for extracting sources from radio images

    Science.gov (United States)

    Carbone, D.; Garsden, H.; Spreeuw, H.; Swinbank, J. D.; van der Horst, A. J.; Rowlinson, A.; Broderick, J. W.; Rol, E.; Law, C.; Molenaar, G.; Wijers, R. A. M. J.

    2018-04-01

    PySE is a Python software package for finding and measuring sources in radio telescope images. The software was designed to detect sources in the LOFAR telescope images, but can be used with images from other radio telescopes as well. We introduce the LOFAR Telescope, the context within which PySE was developed, the design of PySE, and describe how it is used. Detailed experiments on the validation and testing of PySE are then presented, along with results of performance testing. We discuss some of the current issues with the algorithms implemented in PySE and their interaction with LOFAR images, concluding with the current status of PySE and its future development.

  11. Great Lakes Restoration Initiative Great Lakes Mussel Watch(2009-2014)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Following the inception of the Great Lakes Restoration Initiative (GLRI) to address the significant environmental issues plaguing the Great Lakes region, the...

  12. Network software of FD-NET local network for the RT-11 operational system

    International Nuclear Information System (INIS)

    Bobyshev, A.N.; Kutsenko, V.A.; Kravtsov, A.I.; Korzhavin, A.I.; Rozhkov, A.B.; Semenov, Yu.A.; Fedotov, O.P.

    1987-01-01

    The software of the FD-Net ring local network, based on the ''Elektronika-60'' and ''MERA-60'' microcomputers as well as the SM-3, SM-4 and ''MERA-125'' minicomputers, is described. The FD-Net local network is aimed at the automation of complex and labour-intensive physical experiments carried out at the THEP. It permits shared use of external devices, files and programs, as well as data exchange between tasks running on different computers. The architecture of the FD-Net network hardware is considered, as well as the general structure of the software. Individual network software modules and their interactions with each other are described.

  13. GRAPES: a software for parallel searching on biological graphs targeting multi-core architectures.

    Directory of Open Access Journals (Sweden)

    Rosalba Giugno

    Full Text Available Biological applications, from genomics to ecology, deal with graphs that represent the structure of interactions. Analyzing such data requires searching for subgraphs in collections of graphs. This task is computationally expensive. Even though multicore architectures, from commodity computers to more advanced symmetric multiprocessing (SMP), offer scalable computing power, currently published software implementations for indexing and graph matching are fundamentally sequential. As a consequence, such software implementations (i) do not fully exploit available parallel computing power and (ii) do not scale with respect to the size of graphs in the database. We present GRAPES, software for parallel searching on databases of large biological graphs. GRAPES implements a parallel version of well-established graph searching algorithms, and introduces new strategies which naturally lead to a faster parallel searching system, especially for large graphs. GRAPES decomposes graphs into subcomponents that can be efficiently searched in parallel. We show the performance of GRAPES on representative biological datasets containing antiviral chemical compounds, DNA, RNA, proteins, protein contact maps and protein interaction networks.
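
    GRAPES itself is not shown in this record; the toy sketch below (invented data, networkx instead of the GRAPES index) only illustrates the basic idea of searching a graph database in parallel, one worker task per graph:

        import multiprocessing as mp
        import networkx as nx
        from networkx.algorithms import isomorphism

        def contains_query(args):
            """Return whether one database graph contains the query subgraph."""
            name, graph, query = args
            gm = isomorphism.GraphMatcher(graph, query)
            return name, gm.subgraph_is_isomorphic()

        if __name__ == "__main__":
            query = nx.path_graph(3)                        # small query pattern
            database = {f"g{i}": nx.gnp_random_graph(50, 0.05, seed=i) for i in range(8)}
            jobs = [(name, g, query) for name, g in database.items()]
            with mp.Pool() as pool:                         # one task per graph, searched in parallel
                for name, found in pool.map(contains_query, jobs):
                    print(name, "matches" if found else "no match")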

  14. A customizable software for fast reduction and analysis of large X-ray scattering data sets: applications of the new DPDAK package to small-angle X-ray scattering and grazing-incidence small-angle X-ray scattering.

    Science.gov (United States)

    Benecke, Gunthard; Wagermaier, Wolfgang; Li, Chenghao; Schwartzkopf, Matthias; Flucke, Gero; Hoerth, Rebecca; Zizak, Ivo; Burghammer, Manfred; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Trebbin, Martin; Förster, Stephan; Paris, Oskar; Roth, Stephan V; Fratzl, Peter

    2014-10-01

    X-ray scattering experiments at synchrotron sources are characterized by large and constantly increasing amounts of data. The great number of files generated during a synchrotron experiment is often a limiting factor in the analysis of the data, since appropriate software is rarely available to perform fast and tailored data processing. Furthermore, it is often necessary to perform online data reduction and analysis during the experiment in order to interactively optimize experimental design. This article presents an open-source software package developed to process large amounts of data from synchrotron scattering experiments. These data reduction processes involve calibration and correction of raw data, one- or two-dimensional integration, as well as fitting and further analysis of the data, including the extraction of certain parameters. The software, DPDAK (directly programmable data analysis kit), is based on a plug-in structure and allows individual extension in accordance with the requirements of the user. The article demonstrates the use of DPDAK for on- and offline analysis of scanning small-angle X-ray scattering (SAXS) data on biological samples and microfluidic systems, as well as for a comprehensive analysis of grazing-incidence SAXS data. In addition to a comparison with existing software packages, the structure of DPDAK and the possibilities and limitations are discussed.
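
    The DPDAK API is not reproduced here; the sketch below (invented class names and synthetic frames) only illustrates the plug-in idea described above: each plug-in transforms a data dictionary, and a chain of plug-ins is applied frame by frame to reduce raw images to one-dimensional profiles.

        import numpy as np

        class Plugin:
            def process(self, data: dict) -> dict:
                raise NotImplementedError

        class DarkCorrection(Plugin):
            """Subtract a dark-current image from every frame."""
            def __init__(self, dark):
                self.dark = dark
            def process(self, data):
                data["image"] = data["image"] - self.dark
                return data

        class RadialIntegration(Plugin):
            """Azimuthally average the frame around the image centre."""
            def process(self, data):
                img = data["image"]
                y, x = np.indices(img.shape)
                r = np.hypot(x - img.shape[1] / 2, y - img.shape[0] / 2).astype(int)
                counts = np.maximum(np.bincount(r.ravel()), 1)
                data["profile"] = np.bincount(r.ravel(), img.ravel()) / counts
                return data

        def run_chain(frames, plugins):
            for frame in frames:
                data = {"image": frame}
                for plugin in plugins:
                    data = plugin.process(data)
                yield data["profile"]

        frames = [np.random.poisson(5, (128, 128)).astype(float) for _ in range(3)]
        chain = [DarkCorrection(np.full((128, 128), 1.0)), RadialIntegration()]
        for profile in run_chain(frames, chain):
            print(profile[:5])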

  15. Incluso: Social software for the social inclusion of marginalized youth

    Directory of Open Access Journals (Sweden)

    Wouter Van den Bosch

    2010-12-01

    Can ICT, and more specifically social software, support welfare organizations in their work with marginalized young people? This was the main research question addressed in INCLUSO, a research project funded by the European Commission’s 7th Framework Programme. In this paper, the authors start by introducing the concepts of social exclusion, e-inclusion and the digital divide. They discuss the concept of social software, its use by youngsters and the potential of social software to contribute to social inclusion. The authors then report on the organizational challenges met as they guided four social welfare organizations from Austria, Belgium, Poland and the UK in their implementation of social software tools to support their interaction with marginalized young people. They identify these challenges and present tools to assist social work organizations in defining successful strategies for adopting ICT and social software within their organizations. INCLUSO: Social software for the social inclusion of marginalized youth. To what extent can ICT, and in particular the use of social software, contribute to the social inclusion of disadvantaged young people? What is the role of welfare organizations in this process, and what are the main barriers to the use of social software as a means of fostering social inclusion? These questions were central to the INCLUSO project, a research project funded by the European Commission’s 7th Framework Programme. This article starts with an explanation of concepts such as social exclusion, digital inclusion and the digital divide. It also discusses the use of social software by young people and its potential for social inclusion. The authors then report on the organizational challenges that arose while guiding four welfare organizations in their implementation of social software for social inclusion. They identify these

  16. Implementing Kanban for agile process management within the ALMA Software Operations Group

    Science.gov (United States)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity of importances, lifespans and complexities. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology in our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.

  17. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
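
    Nothing below is Orion flight software; it is only a generic, hypothetical illustration of the table-driven pattern described above, in which the mission event table is data and a single sequencer walks whatever table has been loaded:

        # Invented mission event table: (segment, trigger condition, next GN&C mode).
        MISSION_TABLE = [
            ("ascent", "fairing_jettison",   "closed_loop_guidance"),
            ("orbit",  "insertion_complete", "attitude_hold"),
            ("entry",  "ei_interface",       "bank_angle_guidance"),
        ]

        def sequencer(events):
            """Walk the loaded table and switch modes as triggers arrive."""
            mode = "prelaunch"
            for event in events:
                for segment, trigger, next_mode in MISSION_TABLE:
                    if event == trigger:
                        print(f"[{segment}] {event}: {mode} -> {next_mode}")
                        mode = next_mode
            return mode

        sequencer(["fairing_jettison", "insertion_complete", "ei_interface"])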

  18. The Model of Formation of Professional Competence of Future Software Engineers

    Directory of Open Access Journals (Sweden)

    Viktor Sedov

    2016-05-01

    Full Text Available The rapid technological development of modern society fundamentally changes processes of production, communication and services. There is a great demand for specialists who are competent in recently emerged industries. Moreover, the gap between scientific invention and its wide distribution and consumption has significantly reduced. Therefore, we face an urgent need for preparation of specialists in higher education that meet the requirements of modern society and labour market. Particularly relevant is the issue of training of future software engineers in the system of master’s degree, which is the level of education that trains not only professionals, but also scientists and university teachers. The article presents a developed model of formation of professional competence of future software engineers in the system of master’s degree. The model comprises units of training of future software engineers, identifies methodological approaches, a number of general didactic and methodological principles that underpin learning processes in higher education. It describes methods, forms of organization and means that are used in the system of master’s degree, and also provides pedagogical conditions of effective implementation of the model. The developed model addresses the issue of individualization, intensification and optimization of studying. While developing the model, special attention was paid to updating the content of education and searching for new organizational forms of training of future software engineers.

  19. Application of discrete function and software control flow to dependability assessment of embedded digital system

    International Nuclear Information System (INIS)

    Choi, Jong Gyun; Seong, Poong Hyun

    2001-01-01

    This article describes a combinatorial model for estimating the reliability of an embedded digital system by means of discrete function theory and software control flow. The model includes a coverage model for the fault processing mechanisms implemented in the digital system. Furthermore, the model considers the interaction between hardware and software. The fault processing mechanisms make it difficult for many types of components in a digital system to be treated as binary state, good or bad. Discrete function theory provides a complete analysis of multi-state systems, as which the digital system can be regarded. By adapting software control flow to discrete function theory, the HW/SW interaction is taken into account when estimating the reliability of the digital system. Using this model, we predict the reliability of one board controller in a digital system, the Interposing Logic System (ILS), which is installed in YGN nuclear power units 3 and 4. Since the proposed model is a general combinatorial model, its simplification becomes a conservative model that treats the system as binary state. Moreover, if information on the coverage factors of the fault tolerance mechanisms implemented in the system is obtained through fault injection experiments, the model can consider the detailed interaction of system components.
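
    The paper's ILS model is not given in this record; as a generic textbook-style illustration of why the coverage factor matters, a duplex unit with module reliability R and coverage c survives when both modules work or when one module fails and the fault is covered, i.e. R_system = R^2 + 2cR(1 - R):

        # Generic duplex example (not the paper's ILS model): with fault coverage c,
        # the system survives if both modules work, or one fails and the fault is
        # detected and handled (covered).
        def duplex_reliability(r_module: float, coverage: float) -> float:
            return r_module**2 + 2 * coverage * r_module * (1 - r_module)

        for c in (1.0, 0.95, 0.0):
            print(f"coverage={c:.2f}  R_system={duplex_reliability(0.99, c):.6f}")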

  20. Model extension and improvement for simulator-based software safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H.-W. [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China) and Institute of Nuclear Energy Research (INER), No. 1000 Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)]. E-mail: hwhwang@iner.gov.tw; Shih Chunkuan [Department of Engineering and System Science, National Tsing Hua University (NTHU), 101 Section 2 Kuang Fu Road, Hsinchu, Taiwan (China); Yih Swu [Department of Computer Science and Information Engineering, Ching Yun University, 229 Chien-Hsin Road, Jung-Li, Taoyuan County 320, Taiwan (China); Chen, M.-H. [Institute of Nuclear Energy Research (INER), No. 1000Wenhua Road, Chiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Lin, J.-M. [Taiwan Power Company (TPC), 242 Roosevelt Road, Section 3, Taipei 100, Taiwan (China)

    2007-05-15

    One of the major concerns when employing digital I and C systems in nuclear power plants is that a digital system may introduce new failure modes, which differ from those of the previous analog I and C systems. Various techniques are under development to analyze the hazards originating from software faults in digital systems. Preliminary hazard analysis, failure modes and effects analysis, and fault tree analysis are the most extensively used techniques. However, these are static analysis methods; they cannot capture dynamic behavior or the interactions among systems. This research utilizes the 'simulator/plant model testing' technique classified in IEEE Std 7-4.3.2-2003 (IEEE Standard for Digital Computers in Safety Systems of Nuclear Power Generating Stations) to identify hazards which might be induced by nuclear I and C software defects. The recirculation flow system, control rod system, feedwater system, steam line model, dynamic power-core flow map, and related control systems of the PCTran-ABWR model were successfully extended and improved. The benchmark against the ABWR SAR proves that this modified model is capable of accomplishing dynamic, system-level software safety analysis and is better than the static methods. This improved plant simulation can then be further applied to hazard analysis for the study of operator/digital I and C interface interaction failures, and to hardware-in-the-loop fault injection studies.