WorldWideScience

Sample records for conventional software project

  1. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.
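
    The three-factor selection scheme described above can be pictured as a small decision table. The sketch below is a hypothetical illustration only: the class labels, rating thresholds, and technique names are invented, while the actual ratings and the 153 techniques are defined in the guideline volumes themselves.

        # Hedged sketch of the three-factor V&V selection idea described above.
        # Labels, thresholds, and technique names are illustrative, not the report's.

        def stringency_class(complexity: int, required_integrity: int) -> str:
            """Map 1-5 ratings of complexity and required integrity to a V&V class."""
            score = complexity + required_integrity
            if score >= 8:
                return "high"
            if score >= 5:
                return "medium"
            return "low"

        # Illustrative technique catalogue keyed by (class, life-cycle phase, component).
        TECHNIQUES = {
            ("high", "design", "knowledge base"): ["semantic checking", "validation scenarios"],
            ("high", "implementation", "inference engine"): ["static analysis", "branch testing"],
            ("low", "implementation", "user interface"): ["code review"],
        }

        def recommend(complexity, integrity, phase, component):
            cls = stringency_class(complexity, integrity)
            return TECHNIQUES.get((cls, phase, component), ["basic review"])

        print(recommend(5, 4, "design", "knowledge base"))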

  2. Guidelines for the verification and validation of expert system software and conventional software. Volume 1: Project summary. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally.

  3. Guidelines for the verification and validation of expert system software and conventional software: Project summary. Volume 1

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This eight-volume report presents guidelines for performing verification and validation (V&V) on Artificial Intelligence (AI) systems with nuclear applications. The guidelines have much broader application than just expert systems; they are also applicable to object-oriented programming systems, rule-based systems, frame-based systems, model-based systems, neural nets, genetic algorithms, and conventional software systems. This is because many of the components of AI systems are implemented in conventional procedural programming languages, so there is no real distinction. The report examines the state of the art in verifying and validating expert systems. V&V methods traditionally applied to conventional software systems are evaluated for their applicability to expert systems. One hundred fifty-three conventional techniques are identified and evaluated. These methods are found to be useful for at least some of the components of expert systems, frame-based systems, and object-oriented systems. A taxonomy of 52 defect types and their detectability by the 153 methods is presented. With specific regard to expert systems, conventional V&V methods were found to apply well to all the components of the expert system with the exception of the knowledge base. The knowledge base requires extension of the existing methods. Several innovative static verification and validation methods for expert systems have been identified and are described here, including a method for checking the knowledge base "semantics" and a method for generating validation scenarios. Evaluation of some of these methods was performed both analytically and experimentally. A V&V methodology for expert systems is presented based on three factors: (1) a system's judged need for V&V (based in turn on its complexity and degree of required integrity); (2) the life-cycle phase; and (3) the system component being tested.

  4. Avionics and Software Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of the AES Avionics and Software (A&S) project is to develop a reference avionics and software architecture that is based on standards and that can be...

  5. Managing Distributed Software Projects

    DEFF Research Database (Denmark)

    Persson, John Stouby

    Increasingly, software projects are becoming geographically distributed, with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. This PhD study reports on how we can understand and support the management...... of distributed software projects, based on a literature study and a case study. The main emphasis of the literature study was on how to support the management of distributed software projects, but also contributed to an understanding of these projects. The main emphasis of the case study was on how to understand...... the management of distributed software projects, but also contributed to supporting the management of these projects. The literature study integrates what we know about risks and risk-resolution techniques, into a framework for managing risks in distributed contexts. This framework was developed iteratively...

  6. Characteristics for Software Optimization Projects

    Directory of Open Access Journals (Sweden)

    Iulian NITESCU

    2008-01-01

    Full Text Available The increasing complexity of software systems imposes the identification and implementation of methods and techniques to manage it. The software optimization project is one way in which software complexity is controlled. The software optimization project must also face the organization's need to earn a profit. The software optimization project is an integrated part of the application life cycle because it shares the same resources, depends on other stages, and influences subsequent phases. The optimization project has some particularities because it works on a finished product, around its quality. The process is quality and performance oriented, and it assumes that the product life cycle is almost finished.

  7. Software engineering beyond the project

    DEFF Research Database (Denmark)

    Dittrich, Yvonne

    2014-01-01

    Context The main part of software engineering methods, tools and technologies has developed around projects as the central organisational form of software development. A project organisation depends on clear bounds regarding scope, participants, development effort and lead-time. What happens when...... of traditional software engineering, but makes perfect sense, considering that the frame of reference for product development is not a project but continuous innovation across the respective ecosystem. The article provides a number of concrete points for further research....

  8. Software Project Management

    Science.gov (United States)

    1989-07-01

    ... incorporated into the system. Several interesting concepts are presented, but the bulk of the ... In some environments, a software product is developed on speculation that there is a market for it in the development organization. ... development houses find it necessary to know what the potential market for the product ... b. Types of plans: There are a number of plans developed in ... (interleaved reference: Kotler, P. Marketing Planning: Analysis, Planning, ...)

  9. Gamification in Software Development Projects

    Directory of Open Access Journals (Sweden)

    Platonova Valērija

    2017-12-01

    Full Text Available Gamification is one of many ways to motivate employees and introduce more fun into daily activities. The aim of the paper is to analyse the impact of the gamification method on software development projects. The paper contains results of a literature review covering the application areas, methods, and positive and negative effects of gamification on projects. The paper also presents an overview of the gamification tools used in software development projects and attempts to answer the question of whether gamification in a project leads to the desired results and increases employee productivity and motivation.

  10. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property that tells what kind of standards software should meet. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  11. Management of Software Development Projects

    Directory of Open Access Journals (Sweden)

    Felician ALECU

    2011-04-01

    Full Text Available Any major software development starts with the Initiating process group. Once the charter document is approved, the Planning and then the Executing stages follow. Monitoring and Controlling measures the potential performance deviation of the project in terms of schedule and costs and performs the related Integrated Change Control activities. At the end, during Closing, the program/project manager will check that the entire work is completed and the objectives are met.

  12. Managing MDO Software Development Projects

    Science.gov (United States)

    Townsend, J. C.; Salas, A. O.

    2002-01-01

    Over the past decade, the NASA Langley Research Center developed a series of 'grand challenge' applications demonstrating the use of parallel and distributed computation and multidisciplinary design optimization. All but the last of these applications were focused on the high-speed civil transport vehicle; the final application focused on reusable launch vehicles. Teams of discipline experts developed these multidisciplinary applications by integrating legacy engineering analysis codes. As teams became larger and the application development became more complex with increasing levels of fidelity and numbers of disciplines, the need for applying software engineering practices became evident. This paper briefly introduces the application projects and then describes the approaches taken in project management and software engineering for each project; lessons learned are highlighted.

  13. Survey and assessment of conventional software verification and validation techniques

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-02-01

    Reliable software is required for nuclear power plant applications. Verification and validation (V&V) techniques may be applied during software development to help eliminate errors that can inhibit the proper operation of digital systems and that may cause safety problems. EPRI and the NRC are cosponsoring this investigation to determine the best strategies for V&V of expert system software. The strategy used for a particular system will depend on the complexity of the software and the level of integrity required. This report covers the first task in the investigation of reviewing methods for V&V of conventional software systems and evaluating them for use with expert systems.

  14. Comparison of community managed projects and conventional ...

    African Journals Online (AJOL)

    Comparison of community managed projects and conventional approaches in rural water supply of Ethiopia. ... African Journal of Environmental Science and Technology ... This study aimed to compare Community Managed Projects (CMP) approach with the conventional approaches (Non-CMP) in the case of Ethiopia.

  15. Guidelines for the verification and validation of expert system software and conventional software. Volume 7, User's manual: Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    Reliable software is required for nuclear power industry applications. Verification and validation techniques applied during the software development process can help eliminate errors that could inhibit the proper operation of digital systems and cause availability and safety problems. Most of the techniques described in this report are valid for conventional software systems as well as for expert systems. The project resulted in a set of 16 V&V guideline packages and 11 sets of procedures based on the class, development phase, and system component being tested. These guideline packages and procedures help a utility define the level of V&V, which involves evaluating the complexity and type of software component along with the consequences of failure. In all, the project identified 153 V&V techniques for conventional software systems and demonstrated their application to all aspects of expert systems except for the knowledge base, which requires specially developed tools. Each of these conventional techniques covers anywhere from 2-52 total types of conventional software defects, and each defect is covered by 21-50 V&V techniques. The project also identified automated tools to support V&V activities.
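
    The many-to-many coverage noted above (each technique covering between 2 and 52 defect types, each defect type covered by 21 to 50 techniques) can be represented as a simple coverage map. The greedy sketch below, with an invented map far smaller than the report's 153-technique tabulation, picks a small set of techniques that together cover a target set of defect types.

        # Hedged sketch: greedy selection of V&V techniques to cover defect types.
        # The coverage map is invented; the report tabulates the real mapping.

        coverage = {
            "code inspection":  {"logic error", "interface error", "documentation error"},
            "branch testing":   {"logic error", "computation error"},
            "boundary testing": {"computation error", "data handling error"},
            "static analysis":  {"interface error", "data handling error"},
        }
        target = {"logic error", "interface error", "computation error", "data handling error"}

        selected, remaining = [], set(target)
        while remaining:
            # Pick the technique covering the most still-uncovered defect types.
            best = max(coverage, key=lambda t: len(coverage[t] & remaining))
            if not coverage[best] & remaining:
                break  # nothing covers what is left
            selected.append(best)
            remaining -= coverage[best]

        print(selected)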

  16. Estimating software development project size, using probabilistic ...

    African Journals Online (AJOL)

    Estimating software development project size, using probabilistic techniques. ... of managing the size of software development projects by Purchasers (Clients) and Vendors (Development ...

  17. Improving Software Engineering on NASA Projects

    Science.gov (United States)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: Reduces risk of software failure. Increases mission safety. More predictable software cost estimates and delivery schedules. Smarter buyer of contracted-out software. More defects found and removed earlier. Reduces duplication of efforts between projects. Increases ability to meet the challenges of evolving software technology.

  18. Implementing Large Projects in Software Engineering Courses

    Science.gov (United States)

    Coppit, David

    2006-01-01

    In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that…

  19. Measurement of software project management effectiveness

    OpenAIRE

    Demir, Kadir Alpaslan

    2008-01-01

    Approved for public release; distribution is unlimited. Evaluating, monitoring, and improving the effectiveness of project management can contribute to successful acquisition of software systems. In this dissertation, we introduce a quantitative metric for gauging the effectiveness of managing a software-development project. The metric may be used to evaluate and monitor project management effectiveness in software projects by project managers, technical managers, executive man...

  20. Software Maintenance Exercises for a Software Engineering Project Course

    Science.gov (United States)

    1989-02-01

    What is program style and how can it be measured? Program style has been defined as a "followed convention with respect to punctuation, capitalization, and typographic arrangement and display." DASC is a software tool that takes a syntactically ... Specifications: A Framework ... CM-12 Software Metrics; CM-13 Introduction to Software Verification and Validation; CM-14 Intellectual Property Protection for ...

  1. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  2. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  3. Halden project activities on software dependability

    International Nuclear Information System (INIS)

    Dahll, G.; Sivertsen.

    1994-01-01

    Since 1977, the OECD Halden Reactor Project has been working in the field of software dependability. Special emphasis has been put on the use of software in safety critical systems. All phases in software development, from specification through software development, verification, and validation have been covered and are discussed in this article

  4. Security Risk Assessment in Software Development Projects

    OpenAIRE

    Svendsen, Heidi

    2017-01-01

    Software security is increasing in importance, linearly with vulnerabilities caused by software flaws. It is not possible to spend all the project's resources on software security. To spend the resources given to security in an effective way, one should know what is most important to protect. By performing a risk analysis, the project knows which vulnerabilities it faces. A risk analysis will prioritise the vulnerabilities, and when the vulnerabilities are prioritised the project knows where th...

  5. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

    Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorcing of SPM from software engineering that can undermine any successful software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional method when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coup

  6. Project Success in Agile Development Software Projects

    Science.gov (United States)

    Farlik, John T.

    2016-01-01

    Project success has multiple definitions in the scholarly literature. Research has shown that some scholars and practitioners define project success as the completion of a project within schedule and within budget. Others consider a successful project as one in which the customer is satisfied with the product. This quantitative study was conducted…

  7. Software project management in a changing world

    CERN Document Server

    Ruhe, Günther

    2014-01-01

    By bringing together various current directions, Software Project Management in a Changing World focuses on how people and organizations can make their processes more change-adaptive. The selected chapters closely correspond to the project management knowledge areas introduced by the Project Management Body of Knowledge, including its extension for managing software projects. The contributions are grouped into four parts, preceded by a general introduction. Part I "Fundamentals" provides in-depth insights into fundamental topics including resource allocation, cost estimation and risk manage

  8. A review of software project testing

    Directory of Open Access Journals (Sweden)

    Jose Calvo-Manzano Villalón

    2016-03-01

    Full Text Available In this article, a review of software project testing based on a project taxonomy is established, allowing the development team or testing personnel to identify the tests to which the project must be subjected for validation. The taxonomy is focused on identifying software projects according to their technology. To establish the taxonomy, a development method comprising 5 phases was applied. The developed taxonomy comprises 10 categories and 35 subcategories and was validated by a group of information technology (IT) managers and professionals in the field of IT through the use of a survey. The results obtained from the survey are subjected to the Mann-Whitney U test, which indicates that the taxonomy is validated. The taxonomy can be implemented in development organizations with or without a testing team and provides a classification for technology projects.
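
    To make the validation step concrete, the sketch below runs a Mann-Whitney U test on two invented groups of survey ratings (say, IT managers versus other IT professionals scoring the taxonomy's usefulness); the real survey data appear in the article itself.

        # Hedged sketch of the Mann-Whitney U comparison mentioned above,
        # using invented Likert-scale ratings from two respondent groups.
        from scipy.stats import mannwhitneyu

        managers      = [4, 5, 4, 3, 5, 4, 4]   # hypothetical ratings
        professionals = [4, 4, 5, 4, 3, 5, 4]   # hypothetical ratings

        stat, p_value = mannwhitneyu(managers, professionals, alternative="two-sided")
        print(f"U = {stat}, p = {p_value:.3f}")
        # A large p-value indicates no significant difference between the groups,
        # consistent with the taxonomy being rated the same way by both.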

  9. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in its quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow for practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  10. Developing Project Duration Models in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Pierre Bourque; Serge Oligny; Alain Abran; Bertrand Fournier

    2007-01-01

    Based on the empirical analysis of data contained in the International Software Benchmarking Standards Group (ISBSG) repository, this paper presents software engineering project duration models based on project effort. Duration models are built for the entire dataset and for subsets of projects developed for personal computer, mid-range and mainframe platforms. Duration models are also constructed for projects requiring fewer than 400 person-hours of effort and for projects requiring more than 400 person-hours of effort. The usefulness of adding the maximum number of assigned resources as a second independent variable to explain duration is also analyzed. The opportunity to build duration models directly from project functional size in function points is investigated as well.
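
    A minimal sketch of the duration-versus-effort modelling described above, fit here by ordinary least squares on log-transformed data; the sample points and resulting coefficients are invented, since the ISBSG data are not reproduced in the abstract.

        # Hedged sketch: fit duration = a * effort^b on illustrative data points.
        # The real models in the paper are built from the ISBSG repository.
        import numpy as np

        effort_hours = np.array([120, 400, 950, 2400, 7000])   # person-hours (invented)
        duration_mo  = np.array([1.5, 3.0, 5.5, 8.0, 14.0])    # months (invented)

        # Log-log linear regression: log(d) = log(a) + b * log(e)
        b, log_a = np.polyfit(np.log(effort_hours), np.log(duration_mo), 1)
        a = np.exp(log_a)
        print(f"duration = {a:.2f} * effort^{b:.2f}")

        # Predicted duration for a 400 person-hour project (the split point used above).
        print(a * 400 ** b)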

  11. From conventional software based systems to knowledge based systems

    International Nuclear Information System (INIS)

    Bologna, S.

    1995-01-01

    Even if today's nuclear power plants have a very good safety record, there is a continuous search for still improving safety. One direction of this effort addresses operational safety, trying to improve the handling of disturbances and accidents partly by further automation, partly by creating a better control room environment, providing the operator with intelligent support systems to help in the decision making process. Introduction of intelligent computerised operator support systems has proved to be an efficient way of improving the operators' performance. A number of systems have been developed worldwide, assisting in tasks like process fault detection and diagnosis, selection and implementation of proper remedial actions. Unfortunately, the use of Knowledge Based Systems (KBSs) introduces a new dimension to the problem of the licensing process. KBSs, despite the different technology employed, are still nothing more than a computer program. Unfortunately, quite a few people building knowledge based systems seem to ignore the many good programming practices that have evolved over the years for producing traditional computer programs. In this paper the author will try to point out similarities and differences between conventional software based systems and knowledge based systems, introducing also the concept of model based reasoning. (orig.) (25 refs., 2 figs.)

  12. Survey and assessment of conventional software verification and validation methods

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.; Mirsky, S.M.

    1993-04-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 134 methods so identified were classified according to their appropriateness for various phases of a developmental lifecycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as the system as a whole.
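
    As a rough illustration of the rank-ordering idea, the sketch below combines hypothetical ease-of-use and defect-detection ratings into a single cost-benefit score and sorts methods by it. The weighting formula is a placeholder; the report defines the actual Cost-Benefit and Effectiveness metrics.

        # Hypothetical sketch of rank-ordering V&V methods by a cost-benefit score.
        # Four ease-of-use and four detection-power factors mirror the report's
        # structure, but the ratings and the combination rule are invented.

        methods = {
            # method: ([ease-of-use factors 1-5], [defect-detection factors 1-5])
            "code inspection": ([4, 4, 3, 4], [4, 3, 3, 4]),
            "branch testing":  ([3, 3, 3, 2], [4, 4, 3, 3]),
            "formal proof":    ([1, 2, 1, 2], [5, 5, 4, 5]),
        }

        def cost_benefit(ease, power):
            benefit = sum(power) / len(power)
            cost = 6 - sum(ease) / len(ease)   # higher ease-of-use means lower cost
            return benefit / cost

        ranked = sorted(methods, key=lambda m: cost_benefit(*methods[m]), reverse=True)
        print(ranked)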

  13. The advanced software development workstation project

    Science.gov (United States)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  14. Project Management Software for Distributed Industrial Companies

    Science.gov (United States)

    Dobrojević, M.; Medjo, B.; Rakin, M.; Sedmak, A.

    This paper gives an overview of the development of a new software solution for project management, intended mainly to use in industrial environment. The main concern of the proposed solution is application in everyday engineering practice in various, mainly distributed industrial companies. Having this in mind, special care has been devoted to development of appropriate tools for tracking, storing and analysis of the information about the project, and in-time delivering to the right team members or other responsible persons. The proposed solution is Internet-based and uses LAMP/WAMP (Linux or Windows - Apache - MySQL - PHP) platform, because of its stability, versatility, open source technology and simple maintenance. Modular structure of the software makes it easy for customization according to client specific needs, with a very short implementation period. Its main advantages are simple usage, quick implementation, easy system maintenance, short training and only basic computer skills needed for operators.

  15. Bottlenecks in Software Defect Prediction Implementation in Industrial Projects

    OpenAIRE

    Hryszko Jarosław; Madeyski Lech

    2015-01-01

    Case studies focused on software defect prediction in real, industrial software development projects are extremely rare. We report on a dedicated R&D project established in cooperation between Wroclaw University of Technology and one of the leading automotive software development companies to research the possibilities of introducing software defect prediction using an open source, extensible software measurement and defect prediction framework called DePress (Defect Prediction in Software Syst...

  16. A pilot project on non-conventional learning

    OpenAIRE

    Fernandes, Sara; Cerone, Antonio; Barbosa, L. S.

    2013-01-01

    This poster presents a pilot project on non-conventional learning strategies based on students’ active participation in real-life FLOSS projects. The aim of the project is to validate the hypothesis that the peer-production model, which underlies most FLOSS projects, can enhance the learning-teaching process based on extensive and systematic collaborative practices.

  17. Comparison of community managed projects and conventional ...

    African Journals Online (AJOL)

    Ejiro O. Taghwo

    This study aimed to compare Community Managed Projects (CMP) approach with the ... attention to repair and upgrade failed systems in ... In addition to low success in improving coverage, non-.

  18. Supersymmetry Parameter Analysis : SPA Convention and Project

    CERN Document Server

    Aguilar-Saavedra, J A; Allanach, Benjamin C; Arnowitt, R; Baer, H A; Bagger, J A; Balázs, C; Barger, V; Barnett, M; Bartl, Alfred; Battaglia, M; Bechtle, P; Belyaev, A; Berger, E L; Blair, G; Boos, E; Bélanger, G; Carena, M S; Choi, S Y; Deppisch, F; Desch, Klaus; Djouadi, A; Dutta, B; Dutta, S; Díaz, M A; Eberl, H; Ellis, Jonathan Richard; Erler, Jens; Fraas, H; Freitas, A; Fritzsche, T; Godbole, Rohini M; Gounaris, George J; Guasch, J; Gunion, J F; Haba, N; Haber, Howard E; Hagiwara, K; Han, L; Han, T; He, H J; Heinemeyer, S; Hesselbach, S; Hidaka, K; Hinchliffe, Ian; Hirsch, M; Hohenwarter-Sodek, K; Hollik, W; Hou, W S; Hurth, Tobias; Jack, I; Jiang, Y; Jones, D R T; Kalinowski, Jan; Kamon, T; Kane, G; Kang, S K; Kernreiter, T; Kilian, W; Kim, C S; King, S F; Kittel, O; Klasen, M; Kneur, J L; Kovarik, K; Kraml, Sabine; Krämer, M; Lafaye, R; Langacker, P; Logan, H E; Ma, W G; Majerotto, Walter; Martyn, H U; Matchev, K; Miller, D J; Mondragon, M; Moortgat-Pick, G; Moretti, S; Mori, T; Moultaka, G; Muanza, S; Mukhopadhyaya, B; Mühlleitner, M M; Nauenberg, U; Nojiri, M M; Nomura, D; Nowak, H; Okada, N; Olive, Keith A; Oller, W; Peskin, M; Plehn, T; Polesello, G; Porod, Werner; Quevedo, Fernando; Rainwater, D L; Reuter, J; Richardson, P; Rolbiecki, K; de Roeck, A; Weber, Ch.

    2006-01-01

    High-precision analyses of supersymmetry parameters aim at reconstructing the fundamental supersymmetric theory and its breaking mechanism. A well defined theoretical framework is needed when higher-order corrections are included. We propose such a scheme, Supersymmetry Parameter Analysis SPA, based on a consistent set of conventions and input parameters. A repository for computer programs is provided which connect parameters in different schemes and relate the Lagrangian parameters to physical observables at LHC and high energy e+e- linear collider experiments, i.e., masses, mixings, decay widths and production cross sections for supersymmetric particles. In addition, programs for calculating high-precision low energy observables, the density of cold dark matter (CDM) in the universe as well as the cross sections for CDM search experiments are included. The SPA scheme still requires extended efforts on both the theoretical and experimental side before data can be evaluated in the future at the level of the d...

  19. Requirements: Towards an understanding on why software projects fail

    Science.gov (United States)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.

    2016-08-01

    Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky and could also be life threatening. Projects that undermine requirements engineering suffer or are likely to suffer from failures, challenges and other attending risks. The estimated cost of project failures and overruns is very high. Furthermore, software project failures or overruns pose a challenge in today's competitive market environment. They affect the company's image, goodwill, and revenue drive and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering is discussed and its role in software project success is elaborated. The place of the software requirements process in relation to software project failure is explored and examined. Project success and failure factors are also discussed, with emphasis placed on requirements factors as they play a major role in software projects' challenges, successes and failures. The paper relied on secondary data and empirical statistics to explore and examine factors responsible for the successes, challenges and failures of software projects in large, medium and small scaled software companies.

  20. Survey on Projects at DLR Simulation and Software Technology with Focus on Software Engineering and HPC

    OpenAIRE

    Schreiber, Andreas; Basermann, Achim

    2013-01-01

    We introduce the DLR institute “Simulation and Software Technology” (SC) and present current activities regarding software engineering and high performance computing (HPC) in German or international projects. Software engineering at SC focusses on data and knowledge management as well as tools for studies and experiments. We discuss how we apply software configuration management, validation and verification in our projects. Concrete research topics are traceability of (software devel...

  1. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  2. Software Engineering Team Project - lessons learned

    Directory of Open Access Journals (Sweden)

    Bogumiła Hnatkowska

    2013-06-01

    Full Text Available In the 2010/11 academic year the Institute of Informatics at Wroclaw University of Technology issued ’Software Engineering Team Project’ as a course forming part of the final exam to earn a bachelor’s degree. The main assumption about the course was that it should simulate the real environment (a virtual IT company) for its participants. The course aimed to introduce issues regarding programming in the medium scale, project planning, and management. It was a real challenge as the course was offered to more than 140 students. The number of staff members involved in its preparation and performance was more than 15. The paper presents the lessons learned from the first course edition as well as a more detailed qualitative and quantitative course assessment.

  3. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as the system as a whole.

  4. Guidelines for the verification and validation of expert system software and conventional software: Survey and assessment of conventional software verification and validation methods. Volume 2

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Groundwater, E.H.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the identified components of knowledge-based and expert systems, as well as the system as a whole.

  5. A Framework for Effective Software Monitoring in Project Management

    African Journals Online (AJOL)

    A Framework for Effective Software Monitoring in Project Management. ... is shown to provide meaningful interpretation of collected metric data by embedding certain quality function. Key words: Project Management, Feedback, project control, metrics, process model, quantitative validity ...

  6. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  7. Software qualification for digital safety system in KNICS project

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Dong-Young; Choi, Jong-Gyun

    2012-01-01

    In order to achieve technical self-reliance in the area of nuclear instrumentation and control, the Korea Nuclear Instrumentation and Control System (KNICS) project had been running for seven years from 2001. The safety-grade Programmable Logic Controller (PLC) and the digital safety system were developed by the KNICS project. All the software of the PLC and digital safety system was developed and verified following the software development life cycle Verification and Validation (V and V) procedure. The main activities of the V and V process are preparation of software planning documentation, verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes, and testing of the software components, the integrated software, and the integrated system. In addition, a software safety analysis and software configuration management are included in the activities. For the software safety analysis at the SRS and SDS phases, a software Hazard and Operability (HAZOP) analysis was performed and then software fault tree analysis was applied. The software fault tree analysis was applied to a part of the software module with some critical defects identified by the software HAZOP in the SDS phase. The software configuration management was performed using the in-house tool developed in the KNICS project. (author)
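
    The software fault tree analysis mentioned above can be illustrated with a toy evaluation of AND/OR gates over basic-event probabilities. The events and numbers below are hypothetical and unrelated to the actual KNICS analysis; independence of events is assumed.

        # Hedged sketch of a fault tree evaluation (independent basic events assumed).
        # Event names and probabilities are invented for illustration.

        def or_gate(*probs):
            """P(at least one event occurs), assuming independence."""
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p

        def and_gate(*probs):
            """P(all events occur), assuming independence."""
            p = 1.0
            for q in probs:
                p *= q
            return p

        # Top event: "safety function fails to actuate" (illustrative only)
        sensor_fault   = 1e-4
        logic_defect   = 5e-5
        output_failure = and_gate(2e-4, 3e-4)   # two redundant output channels

        top_event = or_gate(sensor_fault, logic_defect, output_failure)
        print(f"P(top event) = {top_event:.2e}")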

  8. EPRI's POWERCOACH trademark software development project

    International Nuclear Information System (INIS)

    Rost, S.; Leu, Kehshiou

    1993-01-01

    Today's complex bulk power market accounts for an estimated $35 billion in transactions a year, significantly more than a decade ago. With the increased levels of non-utility generation and changing strategies in the utility industry, it is anticipated that the trend toward rapid growth in the bulk power market will continue. This market has evolved from an ad hoc residual market to one that in some respects stands at par with the retail market in the plans of many utilities. The bulk power market is not based on the obligation to serve to the same extent as retail markets. Utility participation in this market is therefore purely voluntary. This freedom of action or inaction in the bulk power market actually renders corporate decision-making, investment related or operational, more complicated in many respects than in retail markets. Examples of the burgeoning uncertainties affecting the bulk power market include the rapid expansion of transactions undertaken through power pools, and the impact on utility planning and operations brought about by the abundance and price attractiveness of power available for flexible periods. These uncertainties present an ideal opportunity to employ state-of-the-art analytical models to facilitate the effective use of utility assets to foster the efficient functioning of the entire bulk power market. This paper will focus on the POWERCOACH methodology for short-term bulk power transaction analysis under conditions of uncertainty. In August 1992, UPMP began a seventeen month project to convert POWERCOACH from a methodology to a fully functional, commercial software package. UPMP is developing the POWERCOACH software with the extensive, direct involvement of thirty EPRI member utilities. A synopsis of POWERCOACH is presented

  9. Predicting Software Projects Cost Estimation Based on Mining Historical Data

    OpenAIRE

    Najadat, Hassan; Alsmadi, Izzat; Shboul, Yazan

    2012-01-01

    In this research, a hybrid cost estimation model is proposed to produce a realistic prediction model that takes into consideration software project, product, process, and environmental elements. A cost estimation dataset is built from a large number of open source projects. Those projects are divided into three domains: communication, finance, and game projects. Several data mining techniques are used to classify software projects in terms of their development complexity. Data mining techniqu...
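
    As a rough sketch of the classification step, the example below trains a decision tree on invented project features (size, team size, domain) to predict a complexity class. The features, labels, and data are hypothetical; the study itself mines real open source projects.

        # Hedged sketch of classifying projects by development complexity.
        # Feature values and labels are invented for illustration.
        from sklearn.tree import DecisionTreeClassifier

        # Features: [KLOC, team size, domain code (0=communication, 1=finance, 2=game)]
        X = [[12, 3, 0], [250, 20, 1], [40, 6, 2], [600, 35, 1], [8, 2, 2]]
        y = ["low", "high", "medium", "high", "low"]   # complexity labels

        clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
        print(clf.predict([[100, 10, 1]]))   # predicted class for a new project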

  10. Guidance and Control Software Project Data - Volume 2: Development Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  11. Large-scale visualization projects for teaching software engineering.

    Science.gov (United States)

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  12. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals.
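
    A minimal sketch of the pro forma analysis described above: an expected net present value computed from a development cost, annual support costs, an estimated exploration benefit, and a probability of successful development and implementation. All figures, the horizon, and the discount rate are hypothetical.

        # Hedged sketch of a pro forma cash flow comparison for an internal software project.
        # All dollar figures, rates, and probabilities are invented for illustration.

        def npv(cash_flows, rate):
            """Net present value of year-indexed cash flows (year 0 = today)."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

        development_cost = 400_000   # year 0
        support_cost     = 50_000    # per year, years 1-5
        annual_benefit   = 250_000   # exploration benefit if the tool succeeds
        p_success        = 0.7       # probability of successful development and rollout
        discount_rate    = 0.10

        flows = [-development_cost] + [p_success * annual_benefit - support_cost] * 5
        print(f"Expected NPV = {npv(flows, discount_rate):,.0f}")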

  13. Guidance and Control Software Project Data - Volume 3: Verification Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  14. Automating the management of software projects in a developing it ...

    African Journals Online (AJOL)

    Software project management is the control of the transformation of users' ... Model from The American Systems Corporation (ASC) was used for risk management. ... Multi-site development approach facilitates large projects by using simple ...

  15. FY95 software project management plan: TMACS, CASS computer systems

    International Nuclear Information System (INIS)

    Spurling, D.G.

    1994-01-01

    The FY95 Work Plan for TMACS and CASS Software Projects describes the activities planned for the current fiscal year. This plan replaces WHC-SD-WM-SDP-008. The TMACS project schedule is included in the TWRS Integrated Schedule

  16. Automating the management of software projects in a developing IT ...

    African Journals Online (AJOL)

    The resultant network-based software tool was developed on object-oriented technology using Java. The study established that good management practices may still be applied by the Nigerian software industry that lacks expertise in software management. Multi-site development approach facilitates large projects by using ...

  17. CVSgrab : Mining the History of Large Software Projects

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.

    2006-01-01

    Many software projects use Software Configuration Management systems to support their development process. Such systems accumulate in time large amounts of information useful for process accounting and auditing. We study how software developers can get insight in this information in order to

  18. Good practices for educational software engineering projects

    NARCIS (Netherlands)

    van der Duim, Louwarnoud; Andersson, Jesper; Sinnema, Marco

    2007-01-01

    Recent publications indicate the importance of software engineering in the computer science curriculum. In this paper, we present the final part of software engineering education at University of Groningen in the Netherlands and Vaxjo University in Sweden, where student teams perform an industrial

  19. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems

  20. Guidelines for the verification and validation of expert system software and conventional software: Bibliography. Volume 8

    Energy Technology Data Exchange (ETDEWEB)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This volume contains all of the technical references found in Volumes 1-7 concerning the development of guidelines for the verification and validation of expert systems, knowledge-based systems, other AI systems, object-oriented systems, and conventional systems.

  1. Automated transportation management system (ATMS) software project management plan (SPMP)

    Energy Technology Data Exchange (ETDEWEB)

    Weidert, R.S., Westinghouse Hanford

    1996-05-20

    The Automated Transportation Management System (ATMS) Software Project Management Plan (SPMP) is the lead planning document governing the life cycle of the ATMS and its integration into the Transportation Information Network (TIN). This SPMP defines the project tasks, deliverables, and high level schedules involved in developing the client/server ATMS software.

  2. Designing Project Management for Global Software Development

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; B. Balogh, Maria; Iversen, Cathrine

    2014-01-01

    Software development in distributed teams remains challenging despite rapid technical improvement in tools for communication and collaboration across distance. The challenges stem from geographical, temporal and sociocultural distance and manifest themselves in a variety of difficulties... of distributed software teams, based on a practice study and informed by well-known theories. Our work pinpoints the difficulties of handling the vital informal processes in distributed collaboration that are so vulnerable because the distances risk detaining their growth and increasing their decay rate...

  3. Identifying Coordination Problems in Software Development : Finding Mismatches between Software and Project Team Structures

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos; Kumar, Kuldeep

    2012-01-01

    Today’s dynamic and iterative development environment brings significant challenges for software project management. In distributed project settings, “management by walking around” is no longer an option and project managers may miss out on key project insights. The TESNA (TEchnical Social Network

  4. Turbine Aeration Design Software for Mitigating Adverse Environmental Impacts Resulting From Conventional Hydropower Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Gulliver, John S. [Univ. of Minnesota, Minneapolis, MN (United States)

    2015-03-01

    The project provided a conventional hydropower turbine aeration test-bed for computational routines and software tools for improving environmental mitigation technologies for conventional hydropower systems. In achieving this goal, we have partnered with Alstom, a global leader in energy technology development and United States power generation, with additional funding from the Initiative for Renewable Energy and the Environment (IREE) and the College of Science and Engineering (CSE) at the University of Minnesota (UMN).

  5. OPTIMALISASI GITHUB UNTUK SOFTWARE PROJECT MANAGEMENT DENGAN MEMANFAATKAN NOTIFIKASI SMS

    Directory of Open Access Journals (Sweden)

    Syarif Hidayatulloh

    2016-03-01

    Full Text Available Abstract - Software project management is the art and science of planning and supervising software projects. In software project management, many constraints emerge that can prolong the construction and further development of the software. One of the key points in a software project repository is the time it takes to comment on, add, and merge source code. Rapid feedback makes the members of a software project team glad to contribute to the project. The problem with the GitHub repository is that notifications delivered via the web and email are not acted upon immediately, because members of the software project team rarely check their email. The methods used in this study are a literature review and experiments on different cases from articles, books, and papers that discuss how to implement software project management well so that the project goals can be achieved. The conclusion of the research is that adding SMS notifications to GitHub is expected to further speed up interaction and communication between members of a software project and to make it easier for an IT manager to manage a project on GitHub. Keywords: Software project management, GitHub, notification, SMS. Abstrak - Software project management is the art and science of planning and guiding software projects. In software project management many obstacles arise that can prolong the construction and development of software. One important point in a software project repository is the time needed to comment on, add and merge source code. Fast feedback makes software project team members happy to contribute to a software project. The problem with the GitHub repository is that notifications delivered via the web and email are not responded to immediately because the project members
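    The abstract does not describe an implementation, but the mechanism it proposes can be sketched as a small webhook receiver that forwards GitHub push and pull-request events as text messages. Flask and Twilio are used here only as example components; the endpoint name, credentials, and phone numbers are placeholders.

```python
# Minimal sketch of forwarding GitHub webhook events as SMS notifications.
# Flask receives the webhook; Twilio is used purely as an example SMS
# gateway. Credentials, phone numbers and the endpoint URL are placeholders.
from flask import Flask, request
from twilio.rest import Client

app = Flask(__name__)
sms = Client("ACCOUNT_SID", "AUTH_TOKEN")          # placeholder credentials
TEAM_PHONE_NUMBERS = ["+10000000000"]              # placeholder recipients

@app.route("/github-webhook", methods=["POST"])
def github_webhook():
    event = request.headers.get("X-GitHub-Event", "unknown")
    payload = request.get_json(silent=True) or {}
    repo = payload.get("repository", {}).get("full_name", "unknown repo")

    if event == "push":
        text = f"[{repo}] new push by {payload.get('pusher', {}).get('name')}"
    elif event == "pull_request":
        pr = payload.get("pull_request", {})
        text = f"[{repo}] PR {payload.get('action')}: {pr.get('title')}"
    else:
        return "ignored", 200

    # Forward the event summary to every team member's phone.
    for number in TEAM_PHONE_NUMBERS:
        sms.messages.create(body=text, from_="+10000000001", to=number)
    return "ok", 200

if __name__ == "__main__":
    app.run(port=8080)
```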

  6. Study on Risk Approaches in Software Development Projects

    Directory of Open Access Journals (Sweden)

    Claudiu BRANDAS

    2012-01-01

    Full Text Available Risk approaches in project development have led to the integration of risk management activities and processes into IT project management methodologies and software development. The diversity and advanced level of the technologies used in IT projects of increasing complexity lead to an exponential diversification of risk factors. The purpose of this research is to identify the level of the risk approach in IT projects, both at the level of IT project management and software development methodologies and at the level of the perception of IT project managers, IT managers and IT analysts in Romanian IT companies. Thus, we want to determine the correlation between the use of a project management or software development methodology and the overall level of risk perceived by the project managers using these methodologies.

  7. Incorporating Gaming in Software Engineering Projects: Case of RMU Monopoly

    Directory of Open Access Journals (Sweden)

    Sushil Acharya

    2009-02-01

    Full Text Available A major challenge in engineering education is retaining student interest in the engineering discipline. Active student involvement in engineering projects is one way of retaining student interest. Such involvement can only be realized if project inception comes entirely from the student. This paper presents a software game, RMU Monopoly, developed as a project requirement for a software engineering course, and describes the challenges and gains of implementing such a project. RMU Monopoly was proposed by three junior software engineering students. The game is a multi-platform software program that allows up to eight players and implements the rules of the Monopoly board game. To ensure agility, the game was developed using the spiral software development model. The Software Requirements Specification (SRS) document was finalized through an iterative procedure. Standard Unified Modeling Language (UML) diagrams were used for product design. A Risk Mitigation, Monitoring, and Management Plan (RMMM) was developed to ensure proactive risk management. A Gantt chart, weekly progress meetings and weekly scrum meetings were used to track project progress. C# and Subversion were used in a client-server architecture to develop the software. The project was successful in retaining student interest in the software engineering discipline.

  8. Dynamic Capabilities and Project Management in Small Software Companies

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob; Nielsen, Peter Axel; Persson, John Stouby

    2017-01-01

    A small software company depends on its capability to adapt to rapid technological and other changes in its environment, that is, its dynamic capabilities. In this paper, we argue that to evolve and maintain its dynamic capabilities a small software company must pay attention to the interaction between dynamic capabilities at different levels of the company, particularly between the project management and the company levels. We present a case study of a small software company and show how successful dynamic capabilities at the company level can affect project management in small software companies...

  9. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

    Full Text Available Software developing companies work in a competitive market and are often challenged to make business decisions that affect their competitiveness. Models assessing the maturity of software development process quality, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable for supporting business decisions, nor for achieving strategic goals. The objective of this work is to analyze how the PMS of software development projects can support business strategies in software developing companies. The results of this work show that PMS outputs from software process maturity models can be adapted to help evaluate operating capabilities and support strategic business decisions.

  10. Software-based annunciator replacement: a tale of two projects

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, G.T., E-mail: simmongt@westinghouse.com [Westinghouse Electric Company LLC, Cranberry Township, PA (United States)

    2015-07-01

    Annunciator upgrade projects are often included as parts of operating plant life extension projects as the systems are old and replacement parts are difficult to source. This paper contains case studies of the software-based annunciator replacement projects at the Westinghouse SNUPPS training simulator in Pennsylvania and the Axpo Beznau nuclear power plant in Switzerland. Software-based annunciator systems can offer a number of feature enhancements including improved readability and operator awareness, easy configuration, alarm suppression features, and alarm management at operator workstations. This paper provides an overview of each project and discusses advantages, challenges, and lessons learned from both annunciator-replacement projects. (author)

  11. Software-based annunciator replacement: a tale of two projects

    International Nuclear Information System (INIS)

    Simmons, G.T.

    2015-01-01

    Annunciator upgrade projects are often included as parts of operating plant life extension projects as the systems are old and replacement parts are difficult to source. This paper contains case studies of the software-based annunciator replacement projects at the Westinghouse SNUPPS training simulator in Pennsylvania and the Axpo Beznau nuclear power plant in Switzerland. Software-based annunciator systems can offer a number of feature enhancements including improved readability and operator awareness, easy configuration, alarm suppression features, and alarm management at operator workstations. This paper provides an overview of each project and discusses advantages, challenges, and lessons learned from both annunciator-replacement projects. (author)

  12. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  13. Software Schedules Missions, Aids Project Management

    Science.gov (United States)

    2008-01-01

    NASA missions require advanced planning, scheduling, and management, and the Space Agency has worked extensively to develop the programs and software suites necessary to facilitate these complex missions. These enormously intricate undertakings have hundreds of active components that need constant management and monitoring. It is no surprise, then, that the software developed for these tasks is often applicable in other high-stress, complex environments, like in government or industrial settings. NASA work over the past few years has resulted in a handful of new scheduling, knowledge-management, and research tools developed under contract with one of NASA s partners. These tools have the unique responsibility of supporting NASA missions, but they are also finding uses outside of the Space Program.

  14. The Fox Project: Advanced Development of Systems Software

    National Research Council Canada - National Science Library

    1999-01-01

    The long-term objectives of the Carnegie Mellon Fox Project are to improve the design and construction of systems software and to further the development of advanced programming language technology...

  15. The Fox Project: Advanced Development of Systems Software

    National Research Council Canada - National Science Library

    2000-01-01

    The long-term objectives of the Carnegie Mellon Fox Project are to improve the design and construction of systems software and to further the development of advanced programming language technology...

  16. Software metrics: The key to quality software on the NCC project

    Science.gov (United States)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  17. Exploring the Role of Social Software in Global Software Development Projects

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Y.

    2011-01-01

    We present a PhD project that investigates the use of Social Software (SoSo) in Global Software Development (GSD) teams. Since SoSo is unstructured and informal by its very nature, we explore how informal communication, which is challenging in GSD, is supported by SoSo in distributed teams and how

  18. Measurement of Software Project Management Effectiveness

    Science.gov (United States)

    2008-12-01

    factors such as advertisement of project mission, top management support, client consultation, personnel issues, client acceptance, etc. Trouble...and PERT (Program/Project Evaluation and Review Technique) and CPM (Critical Path Method) charts are process models, and the development of Gantt...models (such as Gantt, PERT and CPM) gained wide acceptance in industry; as Fuggetta (2000) pointed out, few (if any) of the proposed PMLs and related

  19. Coordinating Management Activities in Distributed Software Development Projects

    OpenAIRE

    Bendeck, Fawsy; Goldmann, Sigrid; Holz, Harald; Kötting, Boris

    1999-01-01

    Coordinating distributed processes, especially engineering and software design processes, has been a research topic for some time now. Several approaches have been published that aim at coordinating large projects in general, and large software development processes in specific. However, most of these approaches focus on the technical part of the design process and omit management activities like planning and scheduling the project, or monitoring it during execution. In this paper, we focus o...

  20. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  1. The Dark Side of Software Engineering Evil on Computing Projects

    CERN Document Server

    Rost, Johann

    2010-01-01

    Betrayal! Corruption! Software engineering? Industry experts Johann Rost and Robert L. Glass explore the seamy underbelly of software engineering in this timely report on and analysis of the prevalence of subversion, lying, hacking, and espionage on every level of software project management. Based on the authors' original research and augmented by frank discussion and insights from other well-respected figures, The Dark Side of Software Engineering goes where other management studies fear to tread -- a corporate environment where schedules are fabricated, trust is betrayed, millions of dollar

  2. Clinical software development for the Web: lessons learned from the BOADICEA project.

    Science.gov (United States)

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web

  3. Methods for cost estimation in software project management

    Science.gov (United States)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in the software development field have changed makes it very difficult to forecast the overall costs of a software project. Many researchers have considered this task unachievable, but there is a group of scientists for whom it can be solved using already known mathematical methods (e.g. multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building a cost estimation model for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of finding a model for estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic proposal for a mathematical model based on genetic programming is made, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking as a basis the software product life cycle and the current challenges and innovations in the software development area. Based on the author's experience and the analysis of the existing models and the product life cycle, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
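    As a rough illustration of the approach the abstract outlines, the sketch below evolves the two coefficients of the basic COCOMO-style effort equation, effort = a * KLOC^b, with a simple genetic algorithm. The data points, the MMRE fitness function, and all GA parameters are invented for illustration and are not taken from the paper or the PROMISE datasets.

```python
# Minimal sketch of fitting the basic COCOMO-style effort equation
#   effort = a * KLOC ** b
# with a simple genetic algorithm. The data points, the MMRE fitness and all
# GA parameters are illustrative assumptions, not taken from the paper.
import random

DATA = [(10, 24), (25, 70), (50, 160), (100, 360), (200, 800)]  # (KLOC, person-months), made up

def mmre(chrom):
    """Mean magnitude of relative error of a chromosome (a, b) over DATA."""
    a, b = chrom
    return sum(abs(a * kloc ** b - eff) / eff for kloc, eff in DATA) / len(DATA)

def mutate(chrom, sigma=0.1):
    """Gaussian mutation of each gene, kept strictly positive."""
    return tuple(max(0.01, g + random.gauss(0, sigma)) for g in chrom)

def crossover(p1, p2):
    """Uniform crossover: pick each gene from either parent."""
    return tuple(random.choice(genes) for genes in zip(p1, p2))

random.seed(1)
population = [(random.uniform(1, 4), random.uniform(0.8, 1.3)) for _ in range(40)]

for generation in range(200):
    population.sort(key=mmre)                      # lower MMRE = fitter
    parents = population[:10]                      # simple truncation selection
    population = parents + [
        mutate(crossover(random.choice(parents), random.choice(parents)))
        for _ in range(30)
    ]

best = min(population, key=mmre)
print(f"a = {best[0]:.2f}, b = {best[1]:.2f}, MMRE = {mmre(best):.3f}")
```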

  4. IT Project Management in Very Small Software Companies

    DEFF Research Database (Denmark)

    Shakir, Shahid Nadeem; Nørbjerg, Jacob

    2013-01-01

    In developing countries very small software companies (VSSCs) with only 1-10 employees play an important role both in the local economy and as providers of software and services to customers in other parts of the world. Understanding and improving their IT project management (ITPM) practices and challenges are, therefore, important in the local as well as the larger context of globalized software development. There is, however, very little research into small shop software practices in developing countries. The current paper explores actual ITPM practices in Pakistani VSSCs based on a qualitative study of seven Pakistani VSSCs. We find that some Pakistani ITPM practices are similar to what is reported from VSSCs in other parts of the world, while others seem to be related to the companies' position in the global software development chain. This paper is part of a larger research project aiming...

  5. Software Support for the Classical, Contemporary and Future Project Management

    Directory of Open Access Journals (Sweden)

    Jakov Crnkovic

    2006-04-01

    Full Text Available The volume and complexity of Project Management (PM) raises many questions for managers. What exactly are we managing? People? Performance? Efficiency? Effectiveness? Cost? Time? At what levels do projects become challenging and worthy of significant management attention? Can some projects be left on auto-pilot? Must others be managed more aggressively? What metrics are useful in Project Management? How can they be integrated with normal performance metrics in the organization? How can metrics be built into assessment programs that work? How can projects be monitored and re-planned to stay within the original budget and schedule deadlines? How good is the PM software support? Do we really need PM software packages, or should they be an integral part of the company's information system (IS)? Where is the knowledge about the company's previous projects and performance? Are we able to establish company- or even industry-wide standards for project management? Can we (or should we) move from the PMBOK® guidelines and use other approaches? We discuss important questions in PM: software products, responsibilities for concurrently executing several projects (multi-projects) with multiple objectives and multiple deadlines, the need for initiation, design, execution, and control using virtual project management, and application of the organizational project maturity model.

  6. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with any high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
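    The paper's process simulation is far richer than this, but the core idea of reporting an estimate as a distribution rather than a point value can be sketched with a simple Monte Carlo run over uncertain cost drivers. The triangular distributions and all figures below are illustrative assumptions, not SEL or NASA data.

```python
# Minimal Monte Carlo sketch of conveying uncertainty in an early software
# cost estimate: each cost driver is a triangular (low, likely, high)
# distribution rather than a point value. All figures are illustrative
# assumptions, not data from the paper or from the Software Engineering Lab.
import random

COST_DRIVERS = {                    # (low, most likely, high) in person-months
    "requirements": (4, 6, 10),
    "design":       (8, 12, 20),
    "coding":       (15, 22, 40),
    "testing":      (10, 18, 35),
}

def one_trial():
    """Draw one total-effort sample across all cost drivers."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in COST_DRIVERS.values())

random.seed(0)
trials = sorted(one_trial() for _ in range(10_000))

def percentile(p):
    return trials[int(p / 100 * (len(trials) - 1))]

print(f"P10 effort: {percentile(10):.0f} person-months")
print(f"P50 effort: {percentile(50):.0f} person-months")
print(f"P90 effort: {percentile(90):.0f} person-months")
```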

  7. Support for Different Roles in Software Engineering Master's Thesis Projects

    Science.gov (United States)

    Host, M.; Feldt, R.; Luders, F.

    2010-01-01

    Like many engineering programs in Europe, the final part of most Swedish software engineering programs is a longer project in which the students write a Master's thesis. These projects are often conducted in cooperation between a university and industry, and the students often have two supervisors, one at the university and one in industry. In…

  8. A Team Building Model for Software Engineering Courses Term Projects

    Science.gov (United States)

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  9. Open Crowdsourcing: Leveraging Community Software Developers for IT Projects

    Science.gov (United States)

    Phair, Derek

    2012-01-01

    This qualitative exploratory single-case study was designed to examine and understand the use of volunteer community participants as software developers and other project related roles, such as testers, in completing a web-based application project by a non-profit organization. This study analyzed the strategic decision to engage crowd…

  10. Metrics-based control in outsourced software development projects

    NARCIS (Netherlands)

    Ponisio, Laura; van Eck, Pascal

    2012-01-01

    Measurements have been recognised as vital instruments to improve control in outsourced software development projects. However, project managers are still struggling with the design and implementation of effective measurement programs. One reason for this is that although there is a large body of

  11. Open Source Projects in Software Engineering Education: A Mapping Study

    Science.gov (United States)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  12. Introduction to Financial Projection Models. Business Management Instructional Software.

    Science.gov (United States)

    Pomeroy, Robert W., III

    This guidebook and teacher's guide accompany a personal computer software program and introduce the key elements of financial projection modeling to project the financial statements of an industrial enterprise. The student will then build a model on an electronic spreadsheet. The guidebook teaches the purpose of a financial model and the steps…

  13. QFD Application to a Software - Intensive System Development Project

    Science.gov (United States)

    Tran, T. L.

    1996-01-01

    This paper describes the use of Quality Function Deployment (QFD), adapted to requirements engineering for a software-intensive system development project, and synthesizes the lessons learned from the application of QFD to the Network Control System (NCS) pre-project of the Deep Space Network.

  14. Integrating HCI Specialists into Open Source Software Development Projects

    Science.gov (United States)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have also encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority or responsibility for them. In this paper, based on the HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  15. How Social Software Supports Cooperative Practices in a Globally Distributed Software Project

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2014-01-01

    In Global Software Development (GSD), the lack of face-to-face communication is a major challenge and effective computer-mediated practices are necessary. This paper analyzes cooperative practices supported by Social Software (SoSo) in a GSD student project. The empirical results show that the role of SoSo is to support informal communication, enabling social talk and metawork, both necessary for establishing and for maintaining effective coordination mechanisms, and thus successful cooperation.

  16. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  17. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers worldwide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  18. A comparative evaluation of V and V procedures for conventional software and expert systems

    International Nuclear Information System (INIS)

    Saglietti, F.

    1992-01-01

    This paper presents some initial considerations on the particular features of artificial intelligence that differ from conventional programs and render the V and V procedures simpler in certain respects, but more complex in others. On the basis of the characteristics identified, some existing testing strategies are presented and analyzed; an additional adequacy criterion is suggested in order to increase error detectability. The aim of this work is to extend the existing (though so far still incomplete) testing theories for conventional programs so as to allow their application also to artificially intelligent systems. The ultimate goal of the present investigations is thus a unified problem representation that permits adequate V and V procedures to be defined by adapting them to the specific software category considered. In particular, special criteria for AI testing should be added to the usual coverage requirements in order to also capture those faults which seem to be typical of expert reasoning

  19. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  20. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    International Nuclear Information System (INIS)

    Fishler, B.

    2011-01-01

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  1. REVIEW ON CONSTRUCTION PROJECT MANAGEMENT SOFTWARE PRIMAVERA P6

    OpenAIRE

    Piyush Pramod Bagade* & Prof. Abhijit Bhirud

    2018-01-01

    Planning, scheduling and resource levelling play an important role in any construction project, whether it is the construction of a building or of a road. In the absence of proper planning, scheduling and resource levelling, the construction industry does not profit from the project. For this purpose, proper software and techniques must be utilized. This paper focuses on the advantages of the Oracle Primavera P6 software. The latest version of Primavera is P6-17. In any construction work huge...

  2. An Assessment of risk response strategies practiced in software projects

    Directory of Open Access Journals (Sweden)

    Vanita Bhoola

    2014-11-01

    Full Text Available Risk management and success in projects are highly intertwined – better approaches to project risk management tend to increase the chances of project success in terms of achieving scope & quality, schedule and cost targets. The process of responding to risk factors during a project's life cycle is a crucial aspect of risk management, referred to in this paper as risk response strategies. The current research explores the status of risk response strategies applied in software development projects in India. India provides a young, IT-savvy, English-speaking population, which is also cost effective. Beyond the workforce, the environment for implementing software projects in India differs from that of mature economies. The risk management process is a commonly discussed theme, though its implementation in practice has huge scope for improvement in India. The paper discusses four fundamental treatments of risk response – Avoidance, Transference, Mitigation and Acceptance (ATMA). From primary data on 302 project managers, the paper attempts to identify the risk response factors that lead to successful achievement of project scope & quality, schedule and cost targets, using a series of regressions followed by Seemingly Unrelated Regression Equations (SURE) modelling. Mitigation emerged as the most significant risk response strategy for achieving project targets. Acceptance, transference, and avoidance of risk were mostly manifested in the forms of transparency in communication across stakeholders, careful study of the nature of risks, and close coordination between the project team, customers/end-users and top management.
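    As a rough sketch of the per-target regressions the study reports, the code below regresses three project targets on the four ATMA response scores using ordinary least squares over synthetic data. The data and coefficients are invented; the study's SURE modelling would additionally estimate the three equations jointly to exploit correlated errors.

```python
# Minimal sketch of regressing each project target (scope & quality, schedule,
# cost) on the four ATMA risk-response scores with ordinary least squares.
# The data are synthetic and illustrative; SURE modelling would estimate the
# three equations jointly rather than one at a time.
import numpy as np

rng = np.random.default_rng(42)
n = 302                                  # survey size reported in the abstract
X = rng.uniform(1, 5, size=(n, 4))       # Avoidance, Transference, Mitigation, Acceptance scores
X = np.column_stack([np.ones(n), X])     # add intercept column

# Synthetic outcomes in which mitigation (fourth predictor) has the strongest effect.
betas_true = {"scope_quality": [1.0, 0.1, 0.1, 0.6, 0.2],
              "schedule":      [0.8, 0.1, 0.2, 0.5, 0.1],
              "cost":          [0.9, 0.0, 0.1, 0.4, 0.2]}
targets = {name: X @ np.array(b) + rng.normal(0, 0.5, n) for name, b in betas_true.items()}

labels = ["intercept", "avoidance", "transference", "mitigation", "acceptance"]
for name, y in targets.items():
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)   # per-equation OLS fit
    coefs = ", ".join(f"{l}={c:.2f}" for l, c in zip(labels, beta_hat))
    print(f"{name}: {coefs}")
```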

  3. The Company Approach to Software Engineering Project Courses

    Science.gov (United States)

    Broman, D.; Sandahl, K.; Abu Baker, M.

    2012-01-01

    Teaching larger software engineering project courses at the end of a computing curriculum is a way for students to learn some aspects of real-world jobs in industry. Such courses, often referred to as capstone courses, are effective for learning how to apply the skills they have acquired in, for example, design, test, and configuration management.…

  4. Enhancing the Student Learning Experience in Software Engineering Project Courses

    Science.gov (United States)

    Marques, Maira; Ochoa, Sergio F.; Bastarrica, Maria Cecilia; Gutierrez, Francisco J.

    2018-01-01

    Carrying out real-world software projects in their academic studies helps students to understand what they will face in industry, and to experience first-hand the challenges involved when working collaboratively. Most of the instructional strategies used to help students take advantage of these activities focus on supporting agile programming,…

  5. Coordination and Control of Globally Distributed Software Projects

    NARCIS (Netherlands)

    P.C. van Fenema (Paul)

    2002-01-01

    textabstractRecently, software development and implementation projects have globalized at a rapid pace. Companies in North America, Europe, and the Far East are beginning to integrate international Information Technology (IT) resources to support operations across the globe. Offshore IT services

  6. Software quality assurance on the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Matras, J.R.

    1993-01-01

    The Yucca Mountain Site Characterization Project (YMP) has been involved over the years in the continuing struggle with establishing acceptable Software Quality Assurance (SQA) requirements for the development, modification, and acquisition of computer programs used to support the Mined Geologic Disposal System. These computer programs will be used to produce or manipulate data used directly in site characterization, design, analysis, performance assessment, and operation of repository structures, systems, and components. Scientists and engineers working on the project have claimed that the SQA requirements adopted by the project are too restrictive to allow them to perform their work. This paper will identify the source of the original SQA requirements adopted by the project. It will delineate the approach used by the project to identify concerns voiced by project engineers and scientists regarding the original SQA requirements. It will conclude with a discussion of methods used to address these problems in the rewrite of the original SQA requirements

  7. Sustainable Software Decisions for Long-term Projects (Invited)

    Science.gov (United States)

    Shepherd, A.; Groman, R. C.; Chandler, C. L.; Gaylord, D.; Sun, M.

    2013-12-01

    Adopting new, emerging technologies can be difficult for established projects that are positioned to exist for years to come. In some cases the challenge lies in the pre-existing software architecture. In others, the challenge lies in the fluctuation of resources like people, time and funding. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the data management offices for the U.S. GLOBEC and U.S. JGOFS programs to publish data for researchers funded by the National Science Foundation (NSF). Since its inception, BCO-DMO has been supporting access and discovery of these data through web-accessible software systems, and the office has worked through many of the challenges of incorporating new technologies into its software systems. From migrating human readable, flat file metadata storage into a relational database, and now, into a content management system (Drupal) to incorporating controlled vocabularies, new technologies can radically affect the existing software architecture. However, through the use of science-driven use cases, effective resource management, and loosely coupled software components, BCO-DMO has been able to adapt its existing software architecture to adopt new technologies. One of the latest efforts at BCO-DMO revolves around applying metadata semantics for publishing linked data in support of data discovery. This effort primarily affects the metadata web interface software at http://bco-dmo.org and the geospatial interface software at http://mapservice.bco-dmo.org/. With guidance from science-driven use cases and consideration of our resources, implementation decisions are made using a strategy to loosely couple the existing software systems to the new technologies. The results of this process led to the use of REST web services and a combination of contributed and custom Drupal modules for publishing BCO-DMO's content using the Resource Description Framework (RDF) via an instance of

  8. Guidelines for the verification and validation of expert system software and conventional software: User`s manual. Volume 7

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V&V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V&V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V&V methods is most appropriate for those conditions. The V&V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V&V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately.
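    The selection step in Section 1 combines three inputs into a choice of guideline package, which can be pictured as a simple lookup. The sketch below uses hypothetical package identifiers and category names; the actual packages, methods, and criteria are those defined in Section 2 of the manual.

```python
# Sketch of the three-factor lookup the user manual describes: the stringency
# class, the life-cycle stage, and the component under test jointly select a
# V&V guideline package. The package identifiers and category spellings here
# are hypothetical placeholders, not the manual's actual packages.
STRINGENCY_CLASSES = {1, 2, 3}                       # 1 = most stringent
STAGES = {"requirements", "design", "implementation"}
COMPONENTS = {"knowledge_base", "inference_engine", "conventional_software", "overall_system"}

def select_guideline_package(stringency, stage, component):
    """Return a package identifier for one (class, stage, component) combination."""
    if stringency not in STRINGENCY_CLASSES or stage not in STAGES or component not in COMPONENTS:
        raise ValueError("unknown stringency class, life-cycle stage, or component")
    return f"P{stringency}-{stage}-{component}"       # e.g. "P1-design-knowledge_base"

print(select_guideline_package(1, "design", "knowledge_base"))
```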

  9. Guidelines for the verification and validation of expert system software and conventional software: User's manual. Volume 7

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report provides a step-by-step guide, or user manual, for personnel responsible for the planning and execution of the verification and validation (V ampersand V), and developmental testing, of expert systems, conventional software systems, and various other types of artificial intelligence systems. While the guide was developed primarily for applications in the utility industry, it applies well to all industries. The user manual has three sections. In Section 1 the user assesses the stringency of V ampersand V needed for the system under consideration, identifies the development stage the system is in, and identifies the component(s) of the system to be tested next. These three pieces of information determine which Guideline Package of V ampersand V methods is most appropriate for those conditions. The V ampersand V Guideline Packages are provided in Section 2. Each package consists of an ordered set of V ampersand V techniques to be applied to the system, guides on choosing the review/evaluation team, measurement criteria, and references to a book or report which describes the application of the method. Section 3 presents details of 11 of the most important (or least well-explained in the literature) methods to assist the user in applying these techniques accurately

  10. Dedicated OO expertise applied to Run II software projects

    International Nuclear Information System (INIS)

    Amidei, D.

    2000-01-01

    The change in software language and methodology by CDF and D0 to object-oriented from procedural Fortran is significant. Both experiments requested dedicated expertise that could be applied to software design, coding, advice and review. The Fermilab Run II offline computing outside review panel agreed strongly with the request and recommended that the Fermilab Computing Division hire dedicated OO expertise for the CDF/D0/Computing Division joint project effort. This was done and the two experts have been an invaluable addition to the CDF and D0 upgrade software projects and to the Computing Division in general. These experts have encouraged common approaches and increased the overall quality of the upgrade software. Advice on OO techniques and specific advice on C++ coding has been used. Recently a set of software reviews has been accomplished. This has been a very successful instance of a targeted application of computing expertise, and constitutes a very interesting study of how to move toward modern computing methodologies in HEP

  11. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support, and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  12. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    International Nuclear Information System (INIS)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M.

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V ampersand V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V ampersand V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V ampersand V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V ampersand V of expert systems is not nearly as established or prevalent as V ampersand V of conventional software systems. When V ampersand V was used for expert systems, it was almost always at the system validation stage after full implementation and integration usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V ampersand V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V ampersand V methods in an earlier task

  13. Guidelines for the verification and validation of expert system software and conventional software: Survey and documentation of expert system verification and validation methodologies. Volume 3

    Energy Technology Data Exchange (ETDEWEB)

    Groundwater, E.H.; Miller, L.A.; Mirsky, S.M. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the third volume in the final report for the Expert System Verification and Validation (V&V) project which was jointly sponsored by the Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. The purpose of this activity was to survey and document techniques presently in use for expert system V&V. The survey effort included an extensive telephone interviewing program, site visits, and a thorough bibliographic search and compilation. The major finding was that V&V of expert systems is not nearly as established or prevalent as V&V of conventional software systems. When V&V was used for expert systems, it was almost always at the system validation stage after full implementation and integration usually employing the non-systematic dynamic method of "ad hoc testing." There were few examples of employing V&V in the early phases of development and only weak sporadic mention of the possibilities in the literature. There is, however, a very active research area concerning the development of methods and tools to detect problems with, particularly, rule-based expert systems. Four such static-testing methods were identified which were not discovered in a comprehensive review of conventional V&V methods in an earlier task.

  14. Computing with words to feasibility study of software projects

    Directory of Open Access Journals (Sweden)

    Marieta Peña Abreu

    2017-02-01

    Full Text Available Objective: This paper proposes a method to analyze the technical, commercial and social feasibility of software projects in environments of uncertainty. It allows working with multiple experts and multiple criteria and facilitates decision-making. Method: The proposal contains two phases: first, the necessary information is collected; second, projects are evaluated using the 2-tuple linguistic representation model. The experts are selected by analyzing their curricular synthesis. The evaluation criteria are defined using the Focus Group technique and weighted in the interval (0,1) according to their importance. Three domains are offered to express the preferences: numeric, interval-valued and linguistic. For aggregation, the extended arithmetic mean and the extended weighted average are used, preventing the loss of information. A 2-tuple (feasibility, precision) is obtained as a result for each project. Results: The evaluation of the P1 project was a very high feasibility with -0.33 of precision. The P2 project obtained a high feasibility with 0.38 of precision and the P3 project achieved a medium feasibility with -0.21 of precision. Conclusions: This method is favorable for software project feasibility analysis in the presence of multiple experts and criteria, in environments of uncertainty. It handles heterogeneous assessments without loss of information. Its results are consistent and useful for decision makers.
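
    The abstract above does not reproduce the aggregation formulas, so the following is only a minimal sketch of how a 2-tuple result such as ("high", 0.38) can be produced from weighted expert scores, following the commonly used Herrera-Martinez style of 2-tuple linguistic representation; the label set, scores and weights are invented for illustration and are not taken from the paper.

```python
# Minimal sketch (not the paper's code): 2-tuple linguistic aggregation.
# Labels, scores and weights below are invented for illustration.

LABELS = ["very low", "low", "medium", "high", "very high"]  # s_0 .. s_4

def to_two_tuple(beta):
    """Convert beta in [0, len(LABELS)-1] to a 2-tuple (label, alpha)."""
    i = int(round(beta))
    alpha = round(beta - i, 2)        # symbolic translation in [-0.5, 0.5)
    return LABELS[i], alpha

def extended_weighted_average(scores, weights):
    """Aggregate numeric label indices without rounding until the very end."""
    beta = sum(s * w for s, w in zip(scores, weights)) / sum(weights)
    return to_two_tuple(beta)

# Three expert assessments for one project, expressed on the label scale.
print(extended_weighted_average([3, 4, 3.5], [0.5, 0.3, 0.2]))  # ('high', 0.4)
```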

  15. Guidelines for the verification and validation of expert system software and conventional software: Volume 5, Rationale and description of verification and validation guideline packages and procedures. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US NRC and EPRI toward formulating guidelines for V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves (and the accompanying 11 step-by-step procedures) are presented in Volume 7, User's Manual. Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation), (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software), and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, which together form three Classes). A V&V guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  16. Guidelines for the verification and validation of expert system software and conventional software: Rationale and description of V ampersand V guideline packages and procedures. Volume 5

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the fifth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the U.S. Nuclear Regulatory Commission and the Electric Power Research Institute toward the objective of formulating guidelines for the V&V of expert systems for use in nuclear power applications. This report provides the rationale for and description of those guidelines. The actual guidelines themselves are presented in Volume 7, "User's Manual." Three factors determine what V&V is needed: (1) the stage of the development life cycle (requirements, design, or implementation); (2) whether the overall system or a specialized component needs to be tested (knowledge base component, inference engine or other highly reusable element, or a component involving conventional software); and (3) the stringency of V&V that is needed (as judged from an assessment of the system's complexity and the requirement for its integrity, which together form three Classes). A V&V Guideline package is provided for each of the combinations of these three variables. The package specifies the V&V methods recommended and the order in which they should be administered, the assurances each method provides, the qualifications needed by the V&V team to employ each particular method, the degree to which the methods should be applied, the performance measures that should be taken, and the decision criteria for accepting, conditionally accepting, or rejecting an evaluated system. In addition to the Guideline packages, highly detailed step-by-step procedures are provided for 11 of the more important methods, to ensure that they can be implemented correctly. The Guidelines can apply to conventional procedural software systems as well as all kinds of AI systems.

  17. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    Science.gov (United States)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as a free and open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a number of referenced results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of every single change, the package is published on its own GitHub repository geotopmodel.github.io/geotop/ under the GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the GitHub repository for the master and main development branches. The usage of CMake configuration tool

  18. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  19. Experience Report: Introducing Kanban Into Automotive Software Project

    Directory of Open Access Journals (Sweden)

    Marek Majchrzak

    2017-03-01

    Full Text Available The boundaries between traditional and agile approach methods are disappearing. A significant number of software projects require a continuous implementation of tasks without dividing them into sprints or strict project phases. Customers expect more flexibility and responsiveness from software vendors in response to the ever-changing business environment. To achieve better results in this field, Capgemini has begun using the Lean philosophy and Kanban techniques. The following article illustrates examples of different uses of Kanban and the main stakeholders of the process. The article presents the main advantages of transparency and ways to improve customer co-operation as well as stakeholder relationships. The authors try to visualise all of the elements in the context of the project. There is also a discussion of different approaches in two software projects. The article focuses on the main challenges and the evolutionary approach used. An attempt is made to answer the question of how to convince both the team and the customer, and how to optimise ways of achieving great results.

  20. Management Guidelines for Database Developers' Teams in Software Development Projects

    Science.gov (United States)

    Rusu, Lazar; Lin, Yifeng; Hodosi, Georg

    The worldwide job market for database developers (DBDs) has been growing continually over the last several years. In some companies, DBDs are organized as a special team (DBDs team) to support other projects and roles. As a new role, the DBDs team faces a major problem: there are no management guidelines for it. The team manager does not know which kinds of tasks should be assigned to this team and what practices should be used during DBDs' work. Therefore, in this paper we have developed a set of management guidelines, which includes 8 fundamental tasks and 17 practices from the software development process, by using two methodologies, the Capability Maturity Model (CMM) and agile software development (in particular Scrum), in order to improve the DBDs team's work. Moreover, the management guidelines developed here have been complemented with practices from the authors' experience in this area and have been evaluated in the case of a software company. The management guidelines for DBD teams presented in this paper could also be very useful for other companies that use a DBDs team and could contribute towards an increase in the efficiency of these teams in their work on software development projects.

  1. Self-service for software development projects and HPC activities

    International Nuclear Information System (INIS)

    Husejko, M; Høimyr, N; Gonzalez, A; Koloventzos, G; Asbury, D; Trzcinska, A; Agtzidis, I; Botrel, G; Otto, J

    2014-01-01

    This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with the server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as SourceForge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.

  2. SNL software manual for the ACS Data Analytics Project.

    Energy Technology Data Exchange (ETDEWEB)

    Stearley, Jon R.; McLendon, William Clarence, III; Rodrigues, Arun F.; Williams, Aaron S.; Hooper, Russell Warren; Robinson, David Gerald; Stickland, Michael G.

    2011-10-01

    In the ACS Data Analytics Project (also known as 'YumYum'), a supercomputer is modeled as a graph of components and dependencies, jobs and faults are simulated, and component fault rates are estimated using the graph structure and job pass/fail outcomes. This report documents the successful completion of all SNL deliverables and tasks, describes the software written by SNL for the project, and presents the data it generates. Readers should understand what the software tools are, how they fit together, and how to use them to reproduce the presented data and additional experiments as desired. The SNL YumYum tools provide the novel simulation and inference capabilities desired by ACS. SNL also developed and implemented a new algorithm, which provides faster estimates, at finer component granularity, on arbitrary directed acyclic graphs.
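
    The report excerpt does not describe SNL's estimation algorithm in detail, so the sketch below only illustrates the general idea of inferring per-component fault rates from job pass/fail outcomes. The failure model (a job fails if at least one component it exercised faults) and the soft-attribution update are assumptions made for illustration, not the project's published method.

```python
# Hedged sketch: estimate per-component fault rates from job pass/fail records.
# Assumed failure model (not from the report): a job fails if at least one of
# the components it exercised faults, independently across components.
from collections import defaultdict
from math import prod

def estimate_fault_rates(jobs, n_iter=50):
    """jobs: iterable of (set_of_components, passed) pairs -> {component: rate}."""
    rates = defaultdict(lambda: 0.01)              # small initial guess
    for _ in range(n_iter):
        blame = defaultdict(float)                 # soft fault attributions
        exposure = defaultdict(float)              # how often each component ran
        for comps, passed in jobs:
            for c in comps:
                exposure[c] += 1.0
            if not passed:
                p_fail = 1.0 - prod(1.0 - rates[c] for c in comps)
                if p_fail > 0.0:
                    for c in comps:                # share the blame proportionally
                        blame[c] += rates[c] / p_fail
        for c in exposure:
            rates[c] = min(blame[c] / exposure[c], 0.999)
    return dict(rates)

jobs = [({"cpu0", "nic3"}, True), ({"cpu0", "disk7"}, False),
        ({"disk7", "nic3"}, False), ({"cpu0", "nic3"}, True)]
print(estimate_fault_rates(jobs))
```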

  3. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    International Nuclear Information System (INIS)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A.

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  4. Guidelines for the verification and validation of expert system software and conventional software: Validation scenarios. Volume 6

    Energy Technology Data Exchange (ETDEWEB)

    Mirsky, S.M.; Hayes, J.E.; Miller, L.A. [Science Applications International Corp., McLean, VA (United States)

    1995-03-01

    This report is the sixth volume in a series of reports describing the results of the Expert System Verification and Validation (V&V) project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V&V of expert systems for use in nuclear power applications. This activity was concerned with the development of a methodology for selecting validation scenarios and subsequently applying it to two expert systems used for nuclear utility applications. Validation scenarios were defined and classified into five categories: PLANT, TEST, BASICS, CODE, and LICENSING. A sixth type, REGRESSION, is a composite of the others and refers to the practice of using trusted scenarios to ensure that modifications to software did not change unmodified functions. Rationale was developed for preferring scenarios selected from the categories in the order listed and for determining under what conditions to select scenarios from other types. A procedure incorporating all of the recommendations was developed as a generalized method for generating validation scenarios. The procedure was subsequently applied to two expert systems used in the nuclear industry and was found to be effective, given that an experienced nuclear engineer made the final scenario selections. A method for generating scenarios directly from the knowledge base component was suggested.

  5. A Data Specification for Software Project Performance Measures: Results of a Collaboration on Performance Measurement

    National Research Council Canada - National Science Library

    Kasunic, Mark

    2008-01-01

    ... between completed projects. These terms and definitions were developed using a collaborative, consensus-based approach involving the Software Engineering Institute's Software Engineering Process Management program and service...

  6. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil
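
    The report's methodology for estimating required software copies is not spelled out in the abstract; the snippet below is just one plausible, hedged way to turn current use and estimated growth into a license count, with made-up example numbers.

```python
# Hedged sketch of a copy-count estimate: not the methodology used in the
# report (which is not described above), just one plausible model.
import math

def copies_needed(current_users, peak_concurrency, annual_growth, years):
    """Project concurrent users forward and round up to whole licenses."""
    projected_users = current_users * (1.0 + annual_growth) ** years
    return math.ceil(projected_users * peak_concurrency)

# Example: 40 surveyed users, 30% using the program at the same time,
# 10% annual staff growth, 3-year planning horizon.
print(copies_needed(40, 0.30, 0.10, 3))  # -> 16
```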

  7. Software project estimation the fundamentals for providing high quality information to decision makers

    CERN Document Server

    Abran, Alain

    2015-01-01

    Software projects are often late and over-budget, and this leads to major problems for software customers. Clearly, there is a serious issue in estimating a realistic software project budget. Furthermore, generic estimation models cannot be trusted to provide credible estimates for projects as complex as software projects. This book presents a number of examples using data collected over the years from various organizations building software. It also presents an overview of the not-for-profit organization which collects data on software projects, the International Software Benchmarking Stan

  8. The Role of Requirements in the Success or Failure of Software Projects

    OpenAIRE

    Hussain, Azham; Mkpojiogu, Emmanuel O.C.; Kamal, Fazillah Mohmad

    2016-01-01

    Requirements engineering is pivotal and central to every successful software development project. There are several reasons why software projects fail; however, poorly elicited, documented, validated and managed requirements contribute grossly to software project failure. Software project failures are normally very costly and risky, and can at times even be life threatening. Projects that overlook requirements engineering processes often suffer or are most likely to suffer from fa...

  9. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project whose main purpose is the realization of small size telescopes (SST) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M) equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI mini-array operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  10. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still only a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information that has been gathered in the publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.
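
    As a hedged illustration of the power-law finding, the snippet below applies the standard continuous maximum-likelihood estimate of a power-law exponent to per-contributor commit counts; the counts are invented and the estimator is not necessarily the one used in the article.

```python
# Hedged sketch: continuous MLE of a power-law exponent for per-contributor
# commit counts, alpha_hat = 1 + n / sum(ln(x_i / x_min)).
# The commit counts below are invented; the paper's data are not reproduced.
import math

def powerlaw_alpha(counts, x_min=1.0):
    xs = [c for c in counts if c >= x_min]
    n = len(xs)
    return 1.0 + n / sum(math.log(x / x_min) for x in xs)

commits = [1, 1, 2, 2, 3, 5, 8, 13, 40, 120, 800]   # toy distribution
print(round(powerlaw_alpha(commits, x_min=1.0), 2))
```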

  11. PROJECT-DRIVEN SOFTWARE BUSINESS IN TRANSILVANIA - A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Radu Marius

    2015-07-01

    Full Text Available The fairly low salaries of IT workers compared to Western countries, together with the skills and the location, have helped outsourcing become one of the most competitive Romanian sectors. The IT sector in Romania maintains steady growth favoured by outsourcing companies. Moreover, Romania is highly competitive when you take into account the level of technical proficiency and soft skills in the country. The Romanian labour force can drive relevant projects even in small teams. This case study explores the reality of Romanian IT company profiles. It compares two companies along organizational and strategic dimensions: project approach orientation, leadership, project value focus, and social responsibility. The corporate goal of the first company presented in the case study - Fortech - is to achieve the best adaptive organizational structure which can sustain its competitive advantage. This advantage results from the combination of three main ingredients: scaled-up human resource capital, versatile knowledge management and adaptability to customer needs. Fortech manages, administers and executes its business activities using project management methodologies and practices in order to achieve its strategic goals. On the other hand, Dolphin Kiss Company is a “Python boutique agency” created around a single contract and organized on a single project. The project was contracted with a top company from the telecommunication industry. The company is a small team of creative software engineers focused on developing a very innovative software business solution. This case study is an empirical qualitative research intended to depict the main differences between two relevant company profiles present in the actual economic context: small team – results oriented – highly skilled vs. large structure of outsourcing teams – matrix organized – customer oriented. The case study constructs a space for debates regarding the potential evolution of the

  12. Hierarchy Software Development Framework (h-dp-fwk) project

    International Nuclear Information System (INIS)

    Zaytsev, A

    2010-01-01

    Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005, and from the very beginning they targeted the case of building experimental data processing applications for the CMD-3 experiment which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude and XML schema enabled XML configuration management tools, dedicated log management tools, internal debugging tools, support for both dynamic and static module chains, internal DSO version and consistency checking, and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing scheme for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.
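
    h-dp-fwk itself is not written in Python and its API is not given in the abstract; under those caveats, the sketch below only illustrates the general idea of a module chain assembled from an XML configuration, in which each module consumes the previous module's output.

```python
# Hedged illustration (not h-dp-fwk code): the general idea of an XML-configured
# module chain where each module receives the previous module's output.
import xml.etree.ElementTree as ET

CONFIG = """
<chain>
  <module name="read"   param="events.dat"/>
  <module name="filter" param="ptmin=2.0"/>
  <module name="write"  param="out.dat"/>
</chain>
"""

# Toy module registry; a real framework would load shared objects dynamically.
MODULES = {
    "read":   lambda data, p: f"data from {p}",
    "filter": lambda data, p: f"{data} | filtered ({p})",
    "write":  lambda data, p: f"{data} -> {p}",
}

def run_chain(xml_text):
    data = None
    for node in ET.fromstring(xml_text).findall("module"):
        data = MODULES[node.get("name")](data, node.get("param"))
    return data

print(run_chain(CONFIG))
```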

  13. Hierarchy Software Development Framework (h-dp-fwk) project

    Energy Technology Data Exchange (ETDEWEB)

    Zaytsev, A, E-mail: Alexander.S.Zaytsev@gmail.co [Budker Institute of Nuclear Physics, Novosibirsk (Russian Federation)

    2010-04-01

    Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005, and from the very beginning they targeted the case of building experimental data processing applications for the CMD-3 experiment which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude and XML schema enabled XML configuration management tools, dedicated log management tools, internal debugging tools, support for both dynamic and static module chains, internal DSO version and consistency checking, and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing scheme for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.

  14. Software for project-based learning of robot motion planning

    Science.gov (United States)

    Moll, Mark; Bordeaux, Janice; Kavraki, Lydia E.

    2013-12-01

    Motion planning is a core problem in robotics concerned with finding feasible paths for a given robot. Motion planning algorithms perform a search in the high-dimensional continuous space of robot configurations and exemplify many of the core algorithmic concepts of search algorithms and associated data structures. Motion planning algorithms can be explained in a simplified two-dimensional setting, but this masks many of the subtleties and complexities of the underlying problem. We have developed software for project-based learning of motion planning that enables deep learning. The projects that we have developed allow advanced undergraduate students and graduate students to reflect on the performance of existing textbook algorithms and their own variations on such algorithms. Formative assessment has been conducted at three institutions. The core of the software used for this teaching module is also used within the Robot Operating System, a platform widely adopted by the robotics research community. This allows for the transfer of knowledge and skills to robotics research projects involving a large variety of robot hardware platforms.
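
    As a hedged illustration of the kind of sampling-based planner such projects study in a simplified two-dimensional setting, the sketch below implements a minimal 2-D rapidly-exploring random tree (RRT). It is illustrative code only and is not part of the teaching software described in the article.

```python
# Hedged sketch: a minimal 2-D rapidly-exploring random tree (RRT).
import math
import random

random.seed(1)  # reproducible toy run

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def steer(a, b, step):
    d = dist(a, b)
    t = min(1.0, step / d) if d > 0 else 0.0
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def extract_path(nodes, parent, i):
    out = []
    while i is not None:
        out.append(nodes[i])
        i = parent[i]
    return list(reversed(out))

def rrt(start, goal, is_free, bounds, step=0.5, iters=5000, goal_tol=0.5):
    """Return a list of points from start towards goal, or None if not found."""
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        sample = (random.uniform(*bounds[0]), random.uniform(*bounds[1]))
        i_near = min(range(len(nodes)), key=lambda i: dist(nodes[i], sample))
        new = steer(nodes[i_near], sample, step)
        if not is_free(new):
            continue                      # skip samples inside obstacles
        nodes.append(new)
        parent[len(nodes) - 1] = i_near
        if dist(new, goal) < goal_tol:
            return extract_path(nodes, parent, len(nodes) - 1)
    return None

# Free space: everything outside a circular obstacle of radius 2 at (5, 5).
free = lambda p: dist(p, (5.0, 5.0)) > 2.0
print(rrt((1.0, 1.0), (9.0, 9.0), free, bounds=((0, 10), (0, 10))))
```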

  15. Incorporation of the KERN ECDS-PC software into a project oriented software environment

    International Nuclear Information System (INIS)

    Oren, W.; Pushor, R.; Ruland, R.

    1986-11-01

    The Stanford Linear Accelerator Center (SLAC) is in the process of building a new particle collider, the Stanford Linear Collider (SLC). The tunnel which houses the SLC is about 3 km long and contains approximately 1000 magnets. Besides a very precise absolute positioning of these magnets, the alignment of adjacent magnet ends is of particular importance to the success of the whole project. Because of this and the limited time frame, a survey method which was not only reliable and self-checking but also fast had to be developed. Therefore, the concept of MAS (Magnet Alignment System) was developed. This system utilizes the on-line data collection and the rigorous least-squares bundle adjustment of the KERN ECDS-PC system to fulfill these requirements. The ECDS software is embedded in a project tailored software system with modules which take care of: fixture and magnet calibration corrections, the calculation of ideal coordinates and their comparison to measured coordinates, the translation of detected misalignments into the coordinate system of the mechanical adjustments and the control of the adjustments with on-line electronic dial-gauges. This paper gives a brief introduction to the SLC project and some of the survey problems which are unique to this machine. The basic ideas of the KERN ECDS-PC system are explained and a discussion of the practical aspects, such as targeting and set-ups, are given. MAS and its modules are explained in detail

  16. The Impact of Organization, Project and Governance Variables on Software Quality and Project Success

    OpenAIRE

    Abbas, Noura; Gravell, Andy; Wills, Gary

    2010-01-01

    In this paper we present statistically tested evidence about how quality and success rate are correlated with variables reflecting the organization and aspects of its project's governance, namely retrospectives and metrics. The results presented in this paper are based on the Agile Projects Governance Survey, which collected 129 responses. This paper discusses the in-depth analysis of this survey, and the main findings suggest that when applying agile software development, the quality of software i...

  17. SAGA: A project to automate the management of software production systems

    Science.gov (United States)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  18. Energy efficiency enhancements for semiconductors, communications, sensors and software achieved in cool silicon cluster project

    Science.gov (United States)

    Ellinger, Frank; Mikolajick, Thomas; Fettweis, Gerhard; Hentschel, Dieter; Kolodinski, Sabine; Warnecke, Helmut; Reppe, Thomas; Tzschoppe, Christoph; Dohl, Jan; Carta, Corrado; Fritsche, David; Tretter, Gregor; Wiatr, Maciej; Detlef Kronholz, Stefan; Mikalo, Ricardo Pablo; Heinrich, Harald; Paulo, Robert; Wolf, Robert; Hübner, Johannes; Waltsgott, Johannes; Meißner, Klaus; Richter, Robert; Michler, Oliver; Bausinger, Markus; Mehlich, Heiko; Hahmann, Martin; Möller, Henning; Wiemer, Maik; Holland, Hans-Jürgen; Gärtner, Roberto; Schubert, Stefan; Richter, Alexander; Strobel, Axel; Fehske, Albrecht; Cech, Sebastian; Aßmann, Uwe; Pawlak, Andreas; Schröter, Michael; Finger, Wolfgang; Schumann, Stefan; Höppner, Sebastian; Walter, Dennis; Eisenreich, Holger; Schüffny, René

    2013-07-01

    An overview of the German cluster project Cool Silicon, which aims at increasing the energy efficiency of semiconductors, communications, sensors and software, is presented. Examples of achievements are: 1000 times reduced gate leakage in transistors using high-k metal gate (HKMG) materials compared to conventional poly-gate (SiON) devices at the same technology node; 700 V transistors integrated in standard 0.35 μm CMOS; and solar cell efficiencies above 19% at cars. (Contribution to the Topical Issue "International Semiconductor Conference Dresden-Grenoble - ISCDG 2012", edited by Gérard Ghibaudo, Francis Balestra and Simon Deleonibus.)

  19. Naming Conventions for the Large Hadron Collider Project

    CERN Document Server

    Faugeras, Paul E

    1997-01-01

    This report gives the procedures for defining standard abbreviations for the various machine components of the Large Hadron Collider (LHC) Project, as well as for the surface buildings and the underground Civil Engineering works of the LHC. The content of this report has been approved by the LHC Project Leader and is published in the form of a Project Report in order to allow its immediate implementation. It will be incorporated later in the Quality Assurance Plan of the LHC Project, which is under preparation.

  20. Study on Top-Down Estimation Method of Software Project Planning

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jun-guang; L(U) Ting-jie; ZHAO Yu-mei

    2006-01-01

    This paper studies a new software project planning method using actual project data in order to make software project plans more effective. From the perspective of systems theory, our new method regards a software project plan as an associative unit for study. During top-down estimation of a software project, the Program Evaluation and Review Technique (PERT) and the analogy method are combined to estimate its size; effort estimates and specific schedules are then obtained according to the distribution of effort across phases. This allows a set of practical and feasible planning methods to be constructed. Actual data indicate that this set of methods can lead to effective software project planning.
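
    The paper's actual coefficients and its analogy step are not reproduced in the abstract; the sketch below only illustrates the top-down idea of combining a PERT three-point estimate with a phase-effort distribution, using an assumed productivity figure and phase split rather than values from the paper.

```python
# Hedged sketch: top-down estimation combining a PERT three-point size estimate
# with a phase-effort distribution. Productivity and phase split are assumptions.

def pert(optimistic, most_likely, pessimistic):
    """Classic PERT expected value: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6.0

def plan(o_kloc, m_kloc, p_kloc, productivity_kloc_per_pm=0.9,
         phase_split=(("requirements", 0.15), ("design", 0.25),
                      ("coding", 0.35), ("testing", 0.25))):
    size = pert(o_kloc, m_kloc, p_kloc)                  # size in KLOC
    effort = size / productivity_kloc_per_pm             # person-months
    phases = [(name, round(effort * frac, 1)) for name, frac in phase_split]
    return size, effort, phases

size, effort, phases = plan(8, 12, 20)
print(f"size={size:.1f} KLOC, effort={effort:.1f} PM, phases={phases}")
```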

  1. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
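
    As a hedged illustration of the distributed-likelihood idea described above, the sketch below has each "site" return only its local gradient and Hessian, which a coordinator aggregates for a Newton step. Plain logistic regression is used for brevity; the project itself targets the site-stratified Cox model and a distributed singular value decomposition, which are not reproduced here.

```python
# Hedged sketch of distributed likelihood maximization: each site returns only
# aggregate quantities (gradient and Hessian of its local log-likelihood),
# never raw records. Logistic regression stands in for the Cox model here.
import numpy as np

def local_contribution(X, y, beta):
    """One site's gradient and Hessian of the logistic log-likelihood."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    grad = X.T @ (y - p)
    hess = -(X.T * (p * (1 - p))) @ X
    return grad, hess

def distributed_newton(sites, dim, n_iter=10):
    beta = np.zeros(dim)
    for _ in range(n_iter):
        grad = np.zeros(dim)
        hess = np.zeros((dim, dim))
        for X, y in sites:                       # only summaries cross sites
            g, h = local_contribution(X, y, beta)
            grad += g
            hess += h
        beta -= np.linalg.solve(hess, grad)      # Newton-Raphson update
    return beta

rng = np.random.default_rng(0)
true_beta = np.array([1.0, -2.0])

def make_site(n):
    X = rng.normal(size=(n, 2))
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
    return X, y

sites = [make_site(200), make_site(300), make_site(150)]
print(distributed_newton(sites, dim=2))   # should approach [1, -2]
```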

  2. Building Software, Building Community: Lessons from the Ropensci Project

    Science.gov (United States)

    Boettiger, C.

    2014-12-01

    rOpenSci is a developer collective originally formed in 2011 by graduate students and post-docs from ecology and evolutionary biology to collaborate on building software tools to facilitate a more open and synthetic approach in the face of the transformative rise of large and heterogeneous data. Born on the internet (the collective only began through chance discussions over social media), we have grown into a widely recognized effort that supports an ecosystem of some 45 software packages, engages scores of collaborators, has taught dozens of workshops around the world, and has secured over $480,000 in grant support. As young scientists working in an academic context largely without direct support for our efforts, we have first-hand experience with most of the technical and social challenges in developing sustainable scientific software. I will summarize our experiences, the challenges we have faced, and describe our approach and success in building an effective and diverse community around the rOpenSci project.

  3. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  4. Experiment Software and Projects on the Web with VISPA

    Science.gov (United States)

    Erdmann, M.; Fischer, B.; Fischer, R.; Geiser, E.; Glaser, C.; Müller, G.; Rieger, M.; Urban, M.; von Cube, R. F.; Welling, C.

    2017-10-01

    The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that makes it possible to interface a wide range of applications. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that is capable of browsing interactively through the event content of several data formats, e.g., MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users and role-based permissions. Thereby, tailored projects are enabled, e.g. for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible to members of the collaboration. We present the extension mechanism including corresponding applications and give an outlook onto the new permission system.

  5. CAVEAT: an assistance project for software validation using formal techniques

    International Nuclear Information System (INIS)

    Trotin, A.; Antoine, C.; Baudin, P.; Collart, J.M.; Raguideau, J.; Zylberajch, C.

    1995-01-01

    The aim of the CAVEAT project is to provide a tool for the validation of industrial C-language software. It allows the user to go inside the program and gain a good comprehension of it. It also makes it possible to perform refined verification of the consistency between the specifications and the program by translating the properties into a more suitable language. It automatically calculates the conditions to be demonstrated and offers assistance in performing interactive demonstrations. The principal application of this tool is the safety of systems during the verification/certification phase or during the development phase, where it can work as an intelligent debugging system. (J.S.). 5 refs., 1 fig

  6. Vague project start makes project success of outsourced software development projects uncertain

    OpenAIRE

    Savolainen, Paula

    2010-01-01

    peer-reviewed A definition of project success includes at least three criteria: 1) meeting planning goals, 2) customer benefits, and 3) supplier benefits. This study aims to point out the importance of the definition of the project start, the project start date, and what work should be included in the project effort in order to ensure the supplier's benefits. The ambiguity of the project start risks the profitability of the project and therefore makes project success at least from suppli...

  7. The Generalizability of Private Sector Research on Software Project Management in Two USAF Organizations: An Exploratory Study

    National Research Council Canada - National Science Library

    Garman, Michael

    2003-01-01

    ... budget. However, software projects frequently fail to meet these criteria. Software engineers, acquisition officers, and project managers have all studied this issue and made recommendations for achieving success...

  8. Guidelines for the verification and validation of expert system software and conventional software: Volume 2, Survey and assessment of conventional software verification and validation methods Revision 1, Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Groundwater, E.H.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    By means of a literature survey, a comprehensive set of methods was identified for the verification and validation of conventional software. The 153 methods so identified were classified according to their appropriateness for various phases of a developmental life-cycle -- requirements, design, and implementation; the last category was subdivided into two, static testing and dynamic testing methods. The methods were then characterized in terms of eight rating factors, four concerning ease-of-use of the methods and four concerning the methods' power to detect defects. Based on these factors, two measurements were developed to permit quantitative comparisons among methods, a Cost-Benefit Metric and an Effectiveness Metric. The Effectiveness Metric was further refined to provide three different estimates for each method, depending on three classes of needed stringency of V&V (determined by ratings of a system's complexity and required integrity). Methods were then rank-ordered for each of the three classes in terms of their overall cost-benefits and effectiveness. The applicability of each method was then assessed for the four identified components of knowledge-based and expert systems, as well as the system as a whole.

  9. Coordination Implications of Software Coupling in Open Source Projects

    NARCIS (Netherlands)

    Amrit, Chintan Amrit; van Hillegersberg, Jos; Ågerfalk, Pär

    2010-01-01

    The effect of software coupling on the quality of software has been studied quite widely since the seminal paper on software modularity by Parnas [1]. However, the effect of the increase in software coupling on the coordination of the developers has not been researched as much. In commercial

  10. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    Science.gov (United States)

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  11. Optimizing Seismic Monitoring Networks for EGS and Conventional Geothermal Projects

    Science.gov (United States)

    Kraft, Toni; Herrmann, Marcus; Bethmann, Falko; Wiemer, Stefan

    2013-04-01

    In the past several years, geological energy technologies have received growing attention and have been initiated in or close to urban areas. Some of these technologies involve injecting fluids into the subsurface (e.g., oil and gas development, waste disposal, and geothermal energy development) and have been found or suspected to cause small to moderate sized earthquakes. These earthquakes, which may have gone unnoticed in the past when they occurred in remote, sparsely populated areas, are now posing a considerable risk for the public acceptance of these technologies in urban areas. The permanent termination of the EGS project in Basel, Switzerland after a number of induced ML~3 (minor) earthquakes in 2006 is one prominent example. It is therefore essential for the future development and success of these geological energy technologies to develop strategies for managing induced seismicity and keeping the size of induced earthquakes at a level that is acceptable to all stakeholders. Most guidelines and recommendations on induced seismicity published since the 1970s conclude that an indispensable component of such a strategy is the establishment of seismic monitoring at an early stage of a project. This is because appropriate seismic monitoring is the only way to detect and locate induced microearthquakes with sufficient certainty to develop an understanding of the seismic and geomechanical response of the reservoir to the geotechnical operation. In addition, seismic monitoring lays the foundation for the establishment of advanced traffic light systems and is therefore an important confidence building measure towards the local population and authorities. We have developed an optimization algorithm for seismic monitoring networks in urban areas that makes it possible to design and evaluate seismic network geometries for arbitrary geotechnical operation layouts. The algorithm is based on D-optimal experimental design, which aims to minimize the error ellipsoid of the linearized
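
    As a hedged illustration of the D-optimality criterion mentioned above, the sketch below scores a candidate station geometry by the log-determinant of G^T G for a strongly simplified, linearized location problem (epicenter only, homogeneous velocity); the velocity value and geometries are invented, and this is not the published algorithm.

```python
# Hedged sketch of the D-optimality idea for a seismic network: score a station
# geometry by det(G^T G), where G holds travel-time partial derivatives of a
# linearized (epicenter-only, homogeneous half-space) location problem.
import numpy as np

V_P = 5.8  # assumed P-wave velocity, km/s

def design_matrix(stations, source):
    """Rows: d(traveltime)/dx, d(traveltime)/dy for each station."""
    rows = []
    for sx, sy in stations:
        dx, dy = source[0] - sx, source[1] - sy
        r = np.hypot(dx, dy)
        rows.append([dx / (r * V_P), dy / (r * V_P)])
    return np.array(rows)

def d_optimality(stations, sources):
    """Average log-determinant of G^T G over candidate source locations."""
    scores = []
    for src in sources:
        G = design_matrix(stations, src)
        scores.append(np.linalg.slogdet(G.T @ G)[1])
    return float(np.mean(scores))

reservoir = [(0.0, 0.0), (1.0, 0.5), (-0.5, 1.0)]   # km, candidate sources
net_a = [(-3, -3), (3, -3), (0, 4)]                 # well-spread triangular net
net_b = [(-3, -3), (-2, -3), (-1, -3)]              # poorly spread (nearly linear)
print("net A:", d_optimality(net_a, reservoir))
print("net B:", d_optimality(net_b, reservoir))     # expect a lower score
```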

  12. Grounded Theory Study of Conflicts in Norwegian Agile Software Projects: The Project Managers’ Perspective

    Directory of Open Access Journals (Sweden)

    Lubna Siddique

    2016-07-01

    Full Text Available This paper aims to explore the process of conflicts in agile software projects. The purpose was to investigate the causes and consequences of these conflicts. For this purpose, we conducted a qualitative study involving agile software projects in Norway. Grounded theory was used to analyze the data, and the interview findings are presented using Glaser's Six C model (context, condition, causes, consequences, contingencies, and covariance). The research findings suggest that there are several causes of conflicts. These include: the role of the product owner, an inexperienced project manager, the customer's lack of knowledge about the methodology, organizational hierarchy in public companies, contracting, personal egos, financial issues, and not getting the right team. Consequences of conflicts include: decreased productivity, wastage of time and resources, diverted attention from project objectives, loss of motivation, poor decision making, and loss of communication. Based on the interview data, different conflict strategies are suggested; these include an appropriately skilled project manager, communication and negotiation, defining clear roles, stakeholder analysis, managing stakeholders' expectations, discussion, and finding the root cause of the conflict. Project managers are using these strategies to avoid or resolve conflicts. The competencies required to handle these kinds of conflicts are also discussed in the paper, while the implications for the theory and practice of conflict management are also presented.

  13. Advanced software development workstation project: Engineering scripting language. Graphical editor

    Science.gov (United States)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.

  14. A comparison of conventional and computer-assisted semen analysis (CRISMAS software) using samples from 166 young Danish men

    DEFF Research Database (Denmark)

    Vested, Anne; Ramlau-Hansen, Cecilia; Bonde, Jens P

    2011-01-01

    The aim of the present study was to compare assessments of sperm concentration and sperm motility analysed by conventional semen analysis with those obtained by computer-assisted semen analysis (CASA) (Copenhagen Rigshospitalet Image House Sperm Motility Analysis System (CRISMAS) 4.6 software......) using semen samples from 166 young Danish men. The CRISMAS software identifies sperm concentration and classifies spermatozoa into three motility categories. To enable comparison of the two methods, the four motility stages obtained by conventional semen analysis were, based on their velocity...... classifications, divided into three stages, comparable to the three CRISMAS motility categories: rapidly progressive (A), slowly progressive (B) and non-progressive (C+D). Differences between the two methods were large for all investigated parameters (P sperm concentration...

  15. Software development infrastructure for the HYBRID modeling and simulation project

    International Nuclear Information System (INIS)

    Epiney, Aaron S.; Kinoshita, Robert A.; Kim, Jong Suk; Rabiti, Cristian; Greenwood, M. Scott

    2016-01-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure tries to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety “research and development” software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as the application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python script called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list, a list to which everybody can send emails that will be received by the collective of the developers and managers.
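
    A minimal sketch of how such a BuildingsPy-driven regression run is commonly invoked follows; the method names follow the BuildingsPy Tester API as commonly documented, and the library path is an assumption, so the exact calls may differ between BuildingsPy versions (Dymola must be installed for the simulations to run).

        # Minimal sketch of an automated Modelica regression run with BuildingsPy.
        # Method names and the library path are assumptions and may differ between
        # BuildingsPy versions; Dymola must be installed for the simulations to run.
        from buildingspy.development import regressiontest

        ut = regressiontest.Tester()   # drives Dymola and compares results with stored references
        ut.batchMode(True)             # run non-interactively, suitable for continuous integration
        ut.setLibraryRoot("HYBRID")    # assumed checkout location of the Modelica library
        ut.run()                       # generate unit tests from developer scripts and execute them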

  16. Software development infrastructure for the HYBRID modeling and simulation project

    Energy Technology Data Exchange (ETDEWEB)

    Epiney, Aaron S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kim, Jong Suk [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Greenwood, M. Scott [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    One of the goals of the HYBRID modeling and simulation project is to assess the economic viability of hybrid systems in a market that contains renewable energy sources like wind. The idea is that it is possible for the nuclear plant to sell non-electric energy cushions, which absorb (at least partially) the volatility introduced by the renewable energy sources. This system is currently modeled in the Modelica programming language. To assess the economics of the system, an optimization procedure tries to find the minimal cost of electricity production. The RAVEN code is used as a driver for the whole problem. It is assumed that at this stage, the HYBRID modeling and simulation framework can be classified as non-safety “research and development” software. The associated quality level is Quality Level 3 software. This imposes low requirements on quality control, testing and documentation. The quality level could change as the application development continues. Despite the low quality requirement level, a workflow for the HYBRID developers has been defined that includes a coding standard and some documentation and testing requirements. The repository performs automated unit testing of contributed models. The automated testing is achieved via an open-source Python script called BuildingsPy from Lawrence Berkeley National Lab. BuildingsPy runs Modelica simulation tests using Dymola in an automated manner and generates and runs unit tests from Modelica scripts written by developers. In order to assure effective communication between the different national laboratories, a biweekly videoconference has been set up, where developers can report their progress and issues. In addition, periodic face-to-face meetings are organized to discuss high-level strategy decisions with management. A second means of communication is the developer email list, a list to which everybody can send emails that will be received by the collective of the developers and managers.

  17. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    Science.gov (United States)

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal remains unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  18. Lessons learned from development and quality assurance of software systems at the Halden Project

    International Nuclear Information System (INIS)

    Bjorlo, T.J.; Berg, O.; Pehrsen, M.; Dahll, G.; Sivertsen, T.

    1996-01-01

    The OECD Halden Reactor Project has developed a number of software systems within its research programmes. These programmes have comprised a wide range of topics, such as studies of software for safety-critical applications, development of different operator support systems, and software systems for building and implementing graphical user interfaces. The systems have ranged from simple prototypes to installations in process plants. In the development of these software systems, Halden has gained much experience in quality assurance of different types of software. This paper summarises the accumulated experience at the Halden Project in quality assurance of software systems. The different software systems being developed at the Halden Project may be grouped into three categories: plant-specific software systems (one-of-a-kind deliveries), generic software products, and safety-critical software systems. This classification has been found convenient as the categories have different requirements for the quality assurance process. In addition, the experience from the use of software development tools and proprietary software systems at Halden is addressed. The paper also focuses on the experience gained from the complete software life cycle, starting with the software planning phase and ending with software operation and maintenance

  19. OPTiM: Optical projection tomography integrated microscope using open-source hardware and software.

    Science.gov (United States)

    Watson, Thomas; Andrews, Natalie; Davis, Samuel; Bugeon, Laurence; Dallman, Margaret D; McGinty, James

    2017-01-01

    We describe the implementation of an OPT plate to perform optical projection tomography (OPT) on a commercial wide-field inverted microscope, using our open-source hardware and software. The OPT plate includes a tilt adjustment for alignment and a stepper motor for sample rotation as required by standard projection tomography. Depending on magnification requirements, three methods of performing OPT are detailed using this adaptor plate: a conventional direct OPT method requiring only the addition of a limiting aperture behind the objective lens; an external optical-relay method allowing conventional OPT to be performed at magnifications >4x; a remote focal scanning and region-of-interest method for improved spatial resolution OPT (up to ~1.6 μm). All three methods use the microscope's existing incoherent light source (i.e. arc-lamp) and all of its inherent functionality is maintained for day-to-day use. OPT acquisitions are performed on in vivo zebrafish embryos to demonstrate the implementations' viability.
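
    The reconstruction step behind OPT is, in essence, filtered back-projection of the projections acquired over a rotation; a minimal sketch using scikit-image and a synthetic phantom follows. The 180-degree sampling, the phantom and the filter_name argument (recent scikit-image versions) are assumptions of this sketch, not the authors' acquisition or reconstruction pipeline.

        # Sketch of the tomographic reconstruction behind OPT using filtered
        # back-projection. The synthetic phantom, 180-degree sampling and the
        # filter_name argument (recent scikit-image) are assumptions of this example.
        import numpy as np
        from skimage.data import shepp_logan_phantom
        from skimage.transform import radon, iradon, rescale

        image = rescale(shepp_logan_phantom(), 0.25)            # synthetic test object
        angles = np.linspace(0.0, 180.0, 180, endpoint=False)   # stepper-motor rotation steps (deg)

        sinogram = radon(image, theta=angles)                   # forward model: acquired projections
        reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")

        rms = np.sqrt(np.mean((reconstruction - image) ** 2))
        print(f"RMS reconstruction error: {rms:.4f}")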

  20. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    Science.gov (United States)

    2010-08-01

    developed products. The above definition was derived from these references: [IEEE-CS 2008] ISO/IEC 12207, IEEE Std 12207-2008, Systems and Software... Systems [CNSS 2009]. Software quality: capability of a software product to satisfy stated and implied needs when used under specified conditions [ISO...]. Curriculum; ISO, International Organization for Standardization; IT, information technology; KA, knowledge area; KU, knowledge unit; MBA, Master of

  1. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Full Text Available Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects require extensive application and sophisticated techniques of analysis and interpretation. Informative...

  2. Software support for Motorola 68000 microprocessor at CERN. CERN convention for programming the MC68000 family

    International Nuclear Information System (INIS)

    Cailliau, R.; Carpenter, B.

    1984-01-01

    The CERN convention for programming the MC68000 family of microprocessors gives a set of rules describing the layout of the memory and stack frames used by routines as they should appear before and after their calling sequences. It does not deal with the instructions used to achieve these states. The aim of the convention is to allow programming language mixing as well as debugging of programs built from units written in different languages. It is to be followed by programmers and programming-language compilers. (orig.)

  3. Delivering Software Process-Specific Project Courses in Tertiary Education Environment: Challenges and Solution

    Science.gov (United States)

    Rong, Guoping; Shao, Dong

    2012-01-01

    The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…

  4. Software development project success and failure from the supplier's perspective: A systematic literature review

    OpenAIRE

    Savolainen, Paula; Ahonen, Jarmo J.; Richardson, Ita

    2012-01-01

    In this paper, we consider software development project success and failure from the supplier's perspective. First, we clarified concepts in order to be able to exclude review articles on in-house projects, continuous services, the customer's perspective, and software product development, with the aim of providing valid results for supplier firms. We divided success criteria into project success and project management (PM) success, and, in seven articles, identified thre...

  5. SEffEst: Effort estimation in software projects using fuzzy logic and neural networks

    Directory of Open Access Journals (Sweden)

    Israel

    2012-08-01

    Academia and practitioners confirm that software project effort prediction is crucial for accurate software project management. However, software development effort estimation is uncertain by nature. The literature has developed methods to improve estimation correctness, in many cases using artificial intelligence techniques. Following this path, this paper presents SEffEst, a framework based on fuzzy logic and neural networks designed to increase effort estimation accuracy on software development projects. Trained using ISBSG data, SEffEst presents remarkable results in terms of prediction accuracy.
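
    A greatly simplified sketch of the fuzzy-plus-neural-network idea: crisp cost drivers are first mapped to fuzzy membership degrees, and the resulting features feed a small neural-network regressor. The linguistic terms, membership ranges and synthetic training data are invented for illustration; they are not the SEffEst model or the ISBSG data.

        # Simplified sketch of fuzzy preprocessing feeding a neural-network effort
        # estimator. Membership ranges and the synthetic training data are invented;
        # this is not the SEffEst model or the ISBSG dataset.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def triangular(x, a, b, c):
            """Triangular fuzzy membership function (a < b < c)."""
            return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

        def fuzzify(size_kloc, team_experience):
            """Map crisp cost drivers to membership degrees in linguistic terms."""
            return [
                triangular(size_kloc, -1, 0, 50),         # size is "small"
                triangular(size_kloc, 20, 100, 400),      # size is "large"
                triangular(team_experience, -1, 0, 5),    # experience is "low"
                triangular(team_experience, 3, 10, 12),   # experience is "high"
            ]

        rng = np.random.default_rng(0)
        size = rng.uniform(5, 300, 200)
        experience = rng.uniform(0, 10, 200)
        effort = 3.0 * size ** 1.1 / (1.0 + 0.05 * experience)   # synthetic ground truth

        X = np.array([fuzzify(s, e) for s, e in zip(size, experience)])
        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
        model.fit(X, effort)
        print("predicted effort (person-hours):", round(model.predict([fuzzify(120, 4)])[0], 1))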

  6. Software-Enabled Project Management Techniques and Their Relationship to the Triple Constraints

    Science.gov (United States)

    Elleh, Festus U.

    2013-01-01

    This study investigated the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). There is a dearth of academic literature focusing on the relationship between software-enabled project management techniques and the triple constraints (time, cost, and scope). Based on the gap…

  7. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    Science.gov (United States)

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  8. Extensive Evaluation of Using a Game Project in a Software Architecture Course

    Science.gov (United States)

    Wang, Alf Inge

    2011-01-01

    This article describes an extensive evaluation of introducing a game project to a software architecture course. In this project, university students have to construct and design a type of software architecture, evaluate the architecture, implement an application based on the architecture, and test this implementation. In previous years, the domain…

  9. The Software Life-Cycle Based Configuration Management Tasks for the KNICS Project

    International Nuclear Information System (INIS)

    Cheon, Se Woo; Kwon, Kee Choon

    2005-01-01

    Software configuration management (SCM) is an activity that configures the form of a software system (e.g., design documents and programs) and systematically manages and controls the modifications used to compile the plans, development, and operations resulting from software development and maintenance. The SCM tool, NuSCM, has been specifically developed for the software life-cycle configuration management of developing the KNICS plant protection system (PPS). This paper presents the application of NuSCM to the KNICS project

  10. IT & C Projects Duration Assessment Based on Audit and Software Reengineering

    Directory of Open Access Journals (Sweden)

    2009-01-01

    This paper analyses the effect of applying the core elements of software engineering and reengineering, probabilistic simulations and system development auditing to software development projects. Our main focus is reducing software development project duration. Due to the fast-changing economy, the need for efficiency and productivity is greater than ever. Optimal allocation of resources has proved to be the main element contributing to an increase in efficiency.
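
    Probabilistic simulation of project duration of the kind mentioned above is commonly implemented as a Monte Carlo run over activity-level estimates; the sketch below uses triangular (best/likely/worst) estimates for an invented three-activity chain, which is an assumption of this example rather than the paper's model.

        # Monte Carlo sketch of project-duration assessment: each activity gets a
        # (best, likely, worst) triangular estimate and the chain is simulated many
        # times to obtain a duration distribution. The activities are illustrative.
        import numpy as np

        rng = np.random.default_rng(42)

        activities = {                      # (optimistic, most likely, pessimistic) working days
            "requirements audit": (5, 8, 15),
            "reengineering":      (10, 20, 45),
            "regression testing": (4, 6, 12),
        }

        N = 100_000
        total = np.zeros(N)
        for best, likely, worst in activities.values():
            total += rng.triangular(best, likely, worst, size=N)   # activities in sequence

        print(f"mean duration: {total.mean():.1f} days")
        print(f"80th percentile: {np.percentile(total, 80):.1f} days")
        print(f"P(duration > 60 days) = {(total > 60).mean():.2%}")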

  11. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    Directory of Open Access Journals (Sweden)

    Charles M. Schweik

    2013-01-01

    In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more often studied, high-profile successful cases. After building a large database of projects (n=174,333) and implementing a major survey of open source developers (n=1,403), we were able to conduct statistical analyses to investigate over forty theoretically based testable hypotheses. Our data firmly support what we call the conventional theory of open source software, showing that projects start small, and, in successful cases, grow slightly larger in terms of team size. We describe the “virtuous circle” supporting the conventional wisdom of open source collaboration that comes out of this analysis, and we discuss two other interesting findings related to developer motivations and how team members find each other. Each of these findings is related to the sustainability of these projects.

  12. Barriers to Learning in Agile Software Development Projects

    DEFF Research Database (Denmark)

    Babb, Jeffry S.; Hoda, Rashina; Nørbjerg, Jacob

    2013-01-01

    The adoption of agile methods promises many advantages for individual, team, and organizational learning. However, environmental, structural, and organizational/cultural constraints often find teams adapting agile software development methods rather than engaging in full adoption. We present resu...

  13. Description of a project management system software tool (Sugar)

    International Nuclear Information System (INIS)

    Saito, T.

    2000-01-01

    Toshiba has developed a project management tool that can be applied to large-scale and complicated projects such as the outage of a nuclear power station. The project management tool (Sugar) that Toshiba developed offers excellent operational visibility and extensibility, and has been developed from the beginning for use in nuclear periodic-inspection project control. Here, the development background and features of this project management tool (Sugar) are described, and a simple demonstration is provided as an example. (author)

  14. The use of intelligent systems for risk management in software projects

    Directory of Open Access Journals (Sweden)

    Oksana A. Gushchina

    2017-06-01

    Introduction: The article identifies the main risks of a software project, examines the use of different types of intelligent systems in the risk management process for software projects, discusses the basic methods used for process estimation and forecasting in the field of software engineering, and identifies currently used empty expert systems and software systems for the analysis and risk management of software projects. Materials and Methods: The author describes the peculiarities of risk management in the field of software engineering with the involvement of intelligent systems. Intelligent techniques make it possible to solve the control task with expert precision without involving human experts. Results: The results of this work are: – identification of the key risks of a software project (tax, legal, financial and commercial risks, IT risks, personnel risks, and risks related to competitors, suppliers, marketing, demand and the market); – investigation of the artificial intelligence currently applied to risk management of software projects, particularly expert systems, and of software tools for evaluating the process results; – identification of the most popular empty expert systems (Clips, G2 and Leonardo) and software products for the analysis of large databases (Orange, Weka, Rattle GUI, Apache Mahout, SCaViS, RapidMiner, Databionic ESOM Tools, ELKI, KNIME, Pandas and UIMA); – consideration of cluster, correlation, regression, factor and dispersion analysis methods for the estimation and prediction of software engineering processes. Discussion and Conclusions: The results show the feasibility of applying various intelligent systems in the risk management process. The analysis of risk evaluation methods and the tendency of their application in modern systems of intellectual analysis can serve as a starting point for creating a unified risk management system for software projects of medium and high complexity with a

  15. Software Support for the Classical, Contemporary and Future Project Management

    OpenAIRE

    Jakov Crnkovic; Peter Ross; Sanjay Desai

    2006-01-01

    The volume and complexity of Project Management (PM) raises many questions for managers. What exactly are we managing? People? Performance? Efficiency? Effectiveness? Cost? Time? At what levels do projects become challenging and worthy of significant management attention? Can some projects be left on auto-pilot? Must others be managed more aggressively? What metrics are useful in Project Management? How can they be integrated with normal performance metrics in the organization? How can metric...

  16. Results of the EC research project REQUEST on software quality and reliability

    International Nuclear Information System (INIS)

    Kersken, M.; Saglietti, F.

    1990-01-01

    GRS work in software safety was mainly concerned with the qualitative assessment of software reliability and quality. As a supplement to these activities the work within the REQUEST project emphasized the quantitative determination of the respective parameters. The three-level quality model COQUAMO serves for the computation - and partly for the prediction - of quality factors during the software life cycle. PERFIDE controls the application of software reliability models during the test phase and in early operational life. Specific attention was paid to the assessment of fault-tolerant diverse software systems. (orig.) [de

  17. Management of Globally Distributed Component-Based Software Development Projects

    NARCIS (Netherlands)

    J. Kotlarsky (Julia)

    2005-01-01

    textabstractGlobally Distributed Component-Based Development (GD CBD) is expected to become a promising area, as increasing numbers of companies are setting up software development in a globally distributed environment and at the same time are adopting CBD methodologies. Being an emerging area, the

  18. Mars Science Laboratory Flight Software Boot Robustness Testing Project Report

    Science.gov (United States)

    Roth, Brian

    2011-01-01

    On the surface of Mars, the Mars Science Laboratory will boot up its flight computers every morning, having charged the batteries through the night. This boot process is complicated, critical, and affected by numerous hardware states that can be difficult to test. The hardware test beds do not facilitate long runs of back-to-back unattended automated tests, and although the software simulation has provided the necessary functionality and fidelity for this boot testing, there has not been support for the full flexibility necessary for this task. Therefore, to perform this testing, a framework has been built around the software simulation that supports running automated tests that load a variety of starting configurations for software and hardware states. This implementation has been tested against the nominal cases to validate the methodology, and support for configuring off-nominal cases is ongoing. The implication of this testing is that the introduction of input configurations that have so far proved difficult to test may reveal boot scenarios worth higher-fidelity investigation, and in other cases increase confidence in the robustness of the flight software boot process.

  19. Collaborative Learning in Architectural Education: Benefits of Combining Conventional Studio, Virtual Design Studio and Live Projects

    Science.gov (United States)

    Rodriguez, Carolina; Hudson, Roland; Niblock, Chantelle

    2018-01-01

    Combinations of Conventional Studio and Virtual Design Studio (VDS) have created valuable learning environments that take advantage of different instruments of communication and interaction. However, past experiences have reported limitations in regards to student engagement and motivation, especially when the studio projects encourage abstraction…

  20. Developer’s time spent in a software project part using the SGD framework

    OpenAIRE

    Ciesluk, Simon

    2016-01-01

    Resource management is important for software projects to be successful. Time is one of the resources that needs to be managed, and to do this you need to know how time is spent. Currently, almost no published material exists on how time resources are spent in a software project. In this thesis, research was conducted on how time resources are spent by an individual developer in a software project. The Self-Governance Developer framework was the tool used to gather these resou...

  1. Resource Planning and Management: Job One for Software Project Managers

    Science.gov (United States)

    2011-05-01

    Acceptance – negotiation based on mutual understanding of and belief in the project management triangle – project leaders need to communicate... puts the figure at $1.22 T (even more zeros). Constant barrage of project doom and gloom... Reasons for Project

  2. Success Rates by Software Development Methodology in Information Technology Project Management: A Quantitative Analysis

    Science.gov (United States)

    Wright, Gerald P.

    2013-01-01

    Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…

  3. Factors that Impact Software Project Success in Offshore Information Technology (IT) Companies

    Science.gov (United States)

    Edara, Venkatarao

    2011-01-01

    Information technology (IT) projects are unsuccessful at a rate of 65% to 75% per year, in spite of employing the latest technologies and training employees. Although many studies have been conducted on project successes in U.S. companies, there is a lack of research studying the impact of various factors on software project success in offshore IT…

  4. The dynamics of software development project management: An integrative systems dynamic perspective

    Science.gov (United States)

    Vandervelde, W. E.; Abdel-Hamid, T.

    1984-01-01

    Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.

  5. Projected costs of nuclear and conventional base load electricity generation in some IAEA Member States

    International Nuclear Information System (INIS)

    1990-09-01

    The cost of nuclear and conventional electricity is one of the most important parameters for power system planning, and in particular for decisions on base load power projects. This study reviews the projected levelized electricity generation costs of the base load power generation options expected to be available in the medium term, using an agreed common economic methodology. Cost projections were obtained and evaluated for nuclear and fossil fuelled (mainly coal-fired) plants that could be commissioned in the mid- to late 1990s in 10 IAEA Member States. 27 refs, figs and tabs
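
    Levelized cost comparisons of this kind discount lifetime costs and lifetime generation to a common base and take their ratio; a minimal sketch of that calculation follows, with entirely illustrative cost figures that are not taken from the study.

        # Minimal levelized cost of electricity (LCOE) sketch: discounted lifetime
        # costs divided by discounted lifetime generation. All figures are invented
        # and are not taken from the IAEA study.
        def lcoe(capital, annual_om, annual_fuel, annual_mwh, lifetime_years, rate):
            """Levelized cost in $/MWh, with the capital cost spent at year 0."""
            disc = [(1 + rate) ** -t for t in range(1, lifetime_years + 1)]
            cost = capital + sum((annual_om + annual_fuel) * d for d in disc)
            energy = sum(annual_mwh * d for d in disc)
            return cost / energy

        nuclear = lcoe(capital=4.0e9, annual_om=1.2e8, annual_fuel=5.0e7,
                       annual_mwh=8.0e6, lifetime_years=40, rate=0.05)
        coal = lcoe(capital=2.0e9, annual_om=8.0e7, annual_fuel=2.5e8,
                    annual_mwh=7.0e6, lifetime_years=40, rate=0.05)
        print(f"nuclear: {nuclear:.1f} $/MWh, coal: {coal:.1f} $/MWh")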

  6. Software Project Management and Measurement on the World-Wide-Web (WWW)

    Science.gov (United States)

    Callahan, John; Ramakrishnan, Sudhaka

    1996-01-01

    We briefly describe a system for forms-based, work-flow management that helps members of a software development team overcome geographical barriers to collaboration. Our system, called the Web Integrated Software Environment (WISE), is implemented as a World-Wide-Web service that allows for management and measurement of software development projects based on dynamic analysis of change activity in the workflow. WISE tracks issues in a software development process, provides informal communication between the users with different roles, supports to-do lists, and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis by providing implicit delivery of messages between users based on the content of project documents. The use of a database in WISE is hidden from the users who view WISE as maintaining a personal 'to-do list' of tasks related to the many projects on which they may play different roles.

  7. Software Assurance Curriculum Project Volume 3: Master of Software Assurance Course Syllabi

    Science.gov (United States)

    2011-07-01

    thods and process of model-driven development. • Pressman, Roger S., Software Engineering: A Practitioner's Approach, 6th ed., McGraw Hill, 2009... audience • [Bishop 2002] Chapter 18 • [Allen 2008] Chapters 1, 2 • [Pressman 2009] Chapter 1 • [Merkow 2010] Chapter 3 • [Mouratidis 2007... Allen 2008] Chapters 3, 4 • [Pressman 2009] Chapters 3, 4 • [Merkow 2010] Chapter 5 • [DHS 2008-2009a] • [Mellado 2010] • [CERT 2009]

  8. Analysis of Return on Investment in Different Types of Agile Software Development Project Teams

    Directory of Open Access Journals (Sweden)

    Goran MILANOV

    2012-01-01

    This exploratory study of IT project teams in Serbia investigates how the choice of agile methods in different development project teams affects the return on investment (ROI). In this paper, different types of software project teams are analyzed in order to examine and identify the business value of using agile methods. In various software development project teams, the ROI of agile methods is yet to be fully explored, while the ROI of traditional methods is well understood. Since ROI is an important indicator of project success, we examine the factors that influence ROI both from the software solution customer's point of view and across different agile project teams.

  9. Software Engineering and eLearning: The MuSofT Project - www.musoft.org

    Directory of Open Access Journals (Sweden)

    Ernst-Erich Doberkat

    2005-12-01

    eLearning supports education in certain disciplines. Here, we report on novel eLearning concepts, techniques, and tools to support education in Software Engineering, a subdiscipline of computer science. We call this "Software Engineering eLearning". On the other side, software support is a substantial prerequisite for eLearning in any discipline. Thus, Software Engineering techniques have to be applied to develop and maintain those software systems. We call this "eLearning Software Engineering". Both aspects have been investigated in a large joint, BMBF-funded research project, termed MuSofT (Multimedia in Software Engineering). The main results are summarized in this paper.

  10. Project W-211, initial tank retrieval systems, retrieval control system software configuration management plan

    International Nuclear Information System (INIS)

    RIECK, C.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization

  11. WISE: Automated support for software project management and measurement. M.S. Thesis

    Science.gov (United States)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  12. Accelerating Project and Process Improvement using Advanced Software Simulation Technology: From the Office to the Enterprise

    Science.gov (United States)

    2010-04-29

    Larry Smith, Software Technology Support Center, 517 SMXS/MXDEA, 6022 Fir Avenue, Hill AFB, UT 84056.

  13. Enhancing Software Engineering Education through Open Source Projects: Four Years of Students' Perspectives

    NARCIS (Netherlands)

    Papadopoulos, P.M.; Stamelos, I.G.; Meiszner, A.

    2015-01-01

    This paper presents the results after four years of running of an instructional method that utilizes free/libre open source software (FLOSS) projects as tools for teaching software engineering in formal education. In the last four academic years, a total of 408 juniors majoring in Informatics (in a

  14. Z-Plant material information tracking system (ZMITS) software development and integration project management plan

    International Nuclear Information System (INIS)

    IBSEN, T.G.

    1999-01-01

    This document plans for software and interface development governing the implementation of ZMITS and other supporting systems necessary to manage information for material stabilization needs of the Project Hanford Management Contract (PHMC)

  15. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Key aspects of operational activities in large-scale, geographically distributed software development projects are discussed, and the structure of the quality assurance (QA) processes required in such projects is examined. Up-to-date methods for integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed, including sequential, agile and PRINCE2-based approaches, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in large, geographically distributed hybrid software development projects are reviewed, and recommendations are given to tackle those challenges. Conclusions are drawn about choosing the methodology best suited to a particular project.

  16. Cost estimation in software engineering projects with web components development

    Directory of Open Access Journals (Sweden)

    Javier de Andrés

    2015-01-01

    Many models have been proposed for cost prediction in software projects, some of them aimed specifically at Web projects. This work analyses whether Web-specific models are justified by examining how costs behave differently in Web and non-Web software development projects. Two aspects of cost estimation are analysed: diseconomies of scale, and the impact of certain project characteristics used as cost drivers. Two hypotheses are stated: (a) diseconomies of scale are larger in these projects, and (b) the cost increase caused by the cost drivers is smaller for Web projects. These hypotheses were tested against a set of real projects. The results suggest that both hypotheses hold. Therefore, the main contribution of this research to the literature is that the development of specific models for Web projects is justified.
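
    The diseconomies-of-scale question can be made concrete by fitting the classic effort = a * size^b model, where an exponent b greater than 1 indicates diseconomies of scale. The sketch below fits the exponent by least squares in log space on synthetic data; it illustrates the concept only and is not the paper's analysis or dataset.

        # Fit effort = a * size^b in log space; an exponent b > 1 signals
        # diseconomies of scale. The two synthetic samples stand in for Web and
        # non-Web projects and are invented for illustration only.
        import numpy as np

        def fit_power_law(size, effort):
            """Return (a, b) from ordinary least squares on log-transformed data."""
            b, log_a = np.polyfit(np.log(size), np.log(effort), 1)
            return np.exp(log_a), b

        rng = np.random.default_rng(1)
        size = rng.uniform(10, 500, 80)                 # project size (illustrative units)
        noise = rng.lognormal(0.0, 0.15, 80)
        effort_web = 4.0 * size ** 1.15 * noise         # stronger diseconomies of scale
        effort_other = 6.0 * size ** 1.05 * noise       # milder diseconomies of scale

        for label, effort in (("web", effort_web), ("non-web", effort_other)):
            a, b = fit_power_law(size, effort)
            print(f"{label}: effort ~ {a:.2f} * size^{b:.2f}")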

  17. Investigating the Practical Impact of Agile Practices on the Quality of Software Projects in Continuous Delivery

    OpenAIRE

    Olumide Akerele; Muthu Ramachandran; Mark Dixon

    2014-01-01

    Various factors affect the impact of agile practices on the continuous delivery of software projects. This is a major reason why projects perform differently (some failing and some succeeding) when they implement agile practices in various environments. This is not helped by the fact that many projects work within a limited budget while project plans also change, putting them under pressure to meet deadlines when they fall behind in their planned work. This study investiga...

  18. Research on software systems dependability at the OECD Halden Reactor Project

    International Nuclear Information System (INIS)

    Sivertsen, Terje; Owre, Fridtjov

    2011-01-01

    Two central issues related to software systems dependability are those of safety integrity and safety demonstration. A proper understanding of these two issues is important for the selection of the process, methods, techniques and tools to be used in the different life cycle phases of the software. Following a brief discussion on the concept of software safety integrity and its relationship to software systems dependability, this paper gives an introduction to research problems addressed by the OECD Halden Reactor Project within this area. The paper concludes with a discussion on the important role of safety demonstration in this context. (author)

  19. Software project effort estimation foundations and best practice guidelines for success

    CERN Document Server

    Trendowicz, Adam

    2014-01-01

    Software effort estimation is one of the oldest and most important problems in software project management, and thus today there are a large number of models, each with its own unique strengths and weaknesses in general, and even more importantly, in relation to the environment and context in which it is to be applied.Trendowicz and Jeffery present a comprehensive look at the principles of software effort estimation and support software practitioners in systematically selecting and applying the most suitable effort estimation approach. Their book not only presents what approach to take and how

  20. Research on cross - Project software defect prediction based on transfer learning

    Science.gov (United States)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address the two challenges in cross-project software defect prediction, namely the distribution differences between the source project and target project datasets and the class imbalance in the data, we propose a cross-project software defect prediction method based on transfer learning, named NTrA. First, the class imbalance of the source project data is addressed using the Augmented Neighborhood Cleaning Algorithm. Second, the data gravity method is used to assign different weights on the basis of the attribute similarity of the source project and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using data from NASA and SOFTLAB, taken from the published PROMISE dataset. The results show that the method achieves good recall and F-measure values and good prediction results.
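
    The instance-weighting step can be sketched as giving each source-project instance a weight that grows with its similarity to the target-project data before training a classifier. The similarity measure, the synthetic metrics and the weighted random forest standing in for the boosting step are assumptions of this sketch, not the NTrA implementation.

        # Sketch of similarity-based instance weighting for cross-project defect
        # prediction: source-project instances closer to the target data get larger
        # training weights. The similarity measure, synthetic metrics and the
        # weighted random forest (standing in for the boosting step) are assumptions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(7)
        X_src = rng.normal(0.0, 1.0, (300, 3))                     # e.g. LOC, complexity, churn
        y_src = (X_src[:, 0] + X_src[:, 2] > 0.8).astype(int)      # defect labels (synthetic)
        X_tgt = rng.normal(0.5, 1.2, (100, 3))                     # unlabeled target project

        def gravitation_weights(X_source, X_target):
            """Weight ~ 1 / (1 + distance to the nearest target instance)."""
            d = np.linalg.norm(X_source[:, None, :] - X_target[None, :, :], axis=2)
            return 1.0 / (1.0 + d.min(axis=1))

        w = gravitation_weights(X_src, X_tgt)
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_src, y_src, sample_weight=w)                     # weighted training on source data
        print("predicted defect-prone modules in target:", int(clf.predict(X_tgt).sum()))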

  1. RISK MANAGEMENT AUTOMATION OF SOFTWARE PROJECTS BASED ОN FUZZY INFERENCE

    Directory of Open Access Journals (Sweden)

    T. M. Zubkova

    2015-09-01

    The suitability of an intelligent method for risk management of software projects is shown based on a review of existing fuzzy inference algorithms in the field of applied problems. Information sources in the management of software projects are analyzed, and major and minor risks are highlighted. The most critical parameters have been singled out, giving the possibility to estimate the occurrence of adverse situations (project duration, the frequency of changes to customer requirements, work deadlines, the developers' experience of participation in such projects, and others). A method of qualitative fuzzy description based on fuzzy logic has been developed for the analysis of these parameters. Evaluation of possible situations and formation of the knowledge base rely on a survey of experts. The main limitations of existing automated systems have been identified in relation to their applicability to risk management in software design. Theoretical research set the stage for a software system that makes it possible to automate the risk management process for software projects. The developed software system automates the fuzzy inference process in the following stages: formation of the rule base of the fuzzy inference system, fuzzification of input variables, aggregation of sub-conditions, activation and accumulation of conclusions for fuzzy production rules, and defuzzification of variables. The result of automating the risk management process in software design is a quantitative and qualitative risk assessment and expert advice for risk minimization. The practical significance of the work lies in the fact that implementation of the developed automated system makes it possible to improve the performance of software projects.
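
    The fuzzy inference stages named above (fuzzification, aggregation of sub-conditions, activation, accumulation, defuzzification) can be illustrated with a two-rule Mamdani-style sketch in plain NumPy; the linguistic terms, rules and parameter ranges are invented for illustration and are not the system described in the article.

        # Two-rule Mamdani-style sketch of the fuzzy inference stages named in the
        # abstract: fuzzification, aggregation, activation, accumulation and
        # defuzzification. Rules and membership functions are illustrative only.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        risk = np.linspace(0.0, 10.0, 201)          # output universe: project risk score
        risk_low = tri(risk, 0.0, 0.01, 5.0)
        risk_high = tri(risk, 5.0, 9.99, 10.0)

        def assess(duration_months, requirement_changes_per_month):
            # 1. Fuzzification of the crisp inputs.
            dur_long = tri(duration_months, 6.0, 18.0, 36.0)
            chg_freq = tri(requirement_changes_per_month, 1.0, 5.0, 15.0)
            # 2./3. Aggregation of sub-conditions (min = AND) and rule activation.
            act_high = min(dur_long, chg_freq)        # IF duration long AND changes frequent THEN risk high
            act_low = 1.0 - max(dur_long, chg_freq)   # otherwise risk low
            # 4. Accumulation of the clipped consequents (max).
            aggregated = np.maximum(np.minimum(risk_high, act_high),
                                    np.minimum(risk_low, act_low))
            # 5. Defuzzification by the centroid method.
            return float(np.sum(risk * aggregated) / (np.sum(aggregated) + 1e-12))

        print(f"risk score: {assess(duration_months=24, requirement_changes_per_month=8):.2f}")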

  2. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    Science.gov (United States)

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from 263 Apache project repositories (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157
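
    The power-law claim can be examined on per-contributor commit counts with a simple maximum-likelihood exponent estimate and a concentration measure; the synthetic commit counts in the sketch below stand in for the Apache repository data.

        # Sketch of how concentrated contributions are: synthetic per-contributor
        # commit counts stand in for the Apache data; the exponent is estimated with
        # the continuous maximum-likelihood (Hill) estimator.
        import numpy as np

        rng = np.random.default_rng(3)
        alpha_true, x_min = 2.0, 1.0
        commits = np.floor(x_min * (1 - rng.random(5000)) ** (-1 / (alpha_true - 1))).astype(int)

        def hill_alpha(x, x_min=1.0):
            """Continuous MLE of the power-law exponent for x >= x_min."""
            x = x[x >= x_min]
            return 1.0 + len(x) / np.sum(np.log(x / x_min))

        top1 = np.sort(commits)[::-1][: len(commits) // 100]
        print(f"estimated alpha: {hill_alpha(commits):.2f}")
        print(f"share of commits by the top 1% of contributors: {top1.sum() / commits.sum():.1%}")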

  3. Business Value Is not only Dollars : Results from Case Study Research on Agile Software Projects

    NARCIS (Netherlands)

    Racheva, Z.; Daneva, Maia; Sikkel, Nicolaas; Buglione, Luigi; Ali Babar, M.; Vierimaa, Matias; Oivo, Markku

    Business value is a key concept in agile software development. This paper presents results of a case study on how business value and its creation is perceived in the context of agile projects. Our overall conclusion is that the project participants almost never use an explicit and structured

  4. Conceptions of Software Development by Project Managers: A Study of Managing the Outsourced Development of Software Applications for United States Federal Government Agencies

    Science.gov (United States)

    Eisen, Daniel

    2013-01-01

    This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…

  5. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    Science.gov (United States)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  6. Biomass energy projects for joint implementation of the UN FCCC [Framework Convention on Climate Change]

    International Nuclear Information System (INIS)

    Swisher, Joel N.; Renner, Frederick P.

    1998-01-01

    The UN Framework Convention on Climate Change (FCCC) allows for the joint implementation (JI) of measures to mitigate the emissions of greenhouse gases. The concept of JI refers to the implementation of such measures in one country with partial or full financial and/or technical support from another country, potentially fulfilling some of the supporting country's emission-reduction commitment under the FCCC. This paper addresses some key issues related to JI under the FCCC as they relate to the development of biomass energy projects for carbon offsets in developing countries. Issues include the reference case or baseline, carbon accounting and net carbon storage, potential project implementation barriers and risks, monitoring and verification, local agreements and host-country approval. All of these issues are important in project design and evaluation. We briefly discuss several case studies, which consist of biomass-fueled co-generation projects under development at large sugar mills in the Philippines, India and Brazil, as potential JI projects. The case studies illustrate the benefits of bioenergy for reducing carbon emissions and some of the important barriers and difficulties in developing and crediting such projects. Results to date illustrate both the achievements and the difficulties of this type of project. (author)

  7. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  8. Learning from open source software projects to improve scientific review.

    Science.gov (United States)

    Ghosh, Satrajit S; Klein, Arno; Avants, Brian; Millman, K Jarrod

    2012-01-01

    Peer-reviewed publications are the primary mechanism for sharing scientific results. The current peer-review process is, however, fraught with many problems that undermine the pace, validity, and credibility of science. We highlight five salient problems: (1) reviewers are expected to have comprehensive expertise; (2) reviewers do not have sufficient access to methods and materials to evaluate a study; (3) reviewers are neither identified nor acknowledged; (4) there is no measure of the quality of a review; and (5) reviews take a lot of time, and once submitted cannot evolve. We propose that these problems can be resolved by making the following changes to the review process. Distributing reviews to many reviewers would allow each reviewer to focus on portions of the article that reflect the reviewer's specialty or area of interest and place less of a burden on any one reviewer. Providing reviewers materials and methods to perform comprehensive evaluation would facilitate transparency, greater scrutiny, and replication of results. Acknowledging reviewers makes it possible to quantitatively assess reviewer contributions, which could be used to establish the impact of the reviewer in the scientific community. Quantifying review quality could help establish the importance of individual reviews and reviewers as well as the submitted article. Finally, we recommend expediting post-publication reviews and allowing for the dialog to continue and flourish in a dynamic and interactive manner. We argue that these solutions can be implemented by adapting existing features from open-source software management and social networking technologies. We propose a model of an open, interactive review system that quantifies the significance of articles, the quality of reviews, and the reputation of reviewers.

  9. Process for planning and control of software projects using XedroGESPRO

    OpenAIRE

    Jacqueline Marín-Sánchez; José Alejandro Lugo-García; Pedro Yobanis Piñero-Pérez; Alena María Santiesteban-García; Félix Noel Abelardo-Santana; Javier Menéndez-Rizo

    2014-01-01

    Software project management in Cuba has become a key area for improving production processes and decision-making in organizations. Several models and standards for process improvement related to project management propose best practices on planning and control of projects. However, they are generic guidelines that describe only the activities to execute, leaving the responsibility for implementation to organizations, which sometimes use expensive proprietary infor...

  10. Optimizing strategy software for repetitive construction projects within multi-mode resources

    Directory of Open Access Journals (Sweden)

    Remon Fayek Aziz

    2013-09-01

    Estimating tender data for a specific project is one of the most essential tasks in construction from the contractor's point of view, covering the proposed project duration with the corresponding gross value and cash flows. This paper focuses on how to calculate tender data using Optimizing Strategy Software (OSS) for repetitive construction projects with identical activity durations in the case of a single crew: project duration, project/bid price, project maximum working capital, and project net present value. A simplified multi-objective optimization software (OSS) is presented that creates the best tender data for the contractor compared with more feasible options generated from multi-mode resources in a given project. OSS is intended to provide more scenarios and practical support for typical construction contractors who need to optimize resource utilization in order to minimize project duration, project/bid price, and project maximum working capital while simultaneously maximizing net present value. OSS is implemented in Java and provides a number of new and unique capabilities, including: (1) ranking the obtained optimal plans according to a set of planner-specified weights representing the relative importance of duration, price, maximum working capital and net present value in the analyzed project; (2) visualizing and viewing the generated optimal trade-offs; and (3) providing seamless integration with available project management calculations. In order to provide the aforementioned capabilities, the system is implemented in four main modules: (1) a user interface module; (2) a database module; (3) a running module; and (4) a connecting module. At the end of the paper, an illustrative example is presented to demonstrate and verify the application of the proposed software (OSS) to the optimization of an expressway repetitive construction project.
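
    The kind of tender data the software produces can be illustrated with a small sketch that, for each resource mode, derives project duration, bid price, peak working capital and net present value for a repetitive project. The unit durations, costs, payment terms and discount rate are invented for the example and are not OSS itself.

        # Sketch of tender-data calculation for a repetitive project: for each
        # resource mode, compute duration, bid price, peak working capital and NPV.
        # Durations, costs, payment terms and the discount rate are illustrative.
        UNITS = 10                   # identical repetitive units
        MONTHLY_RATE = 0.01          # discount rate per month
        PAYMENT_PER_UNIT = 120_000   # client payment on completion of each unit

        modes = {                    # unit duration (months) and unit direct cost
            "single crew": {"unit_months": 2.0, "unit_cost": 80_000},
            "double crew": {"unit_months": 1.2, "unit_cost": 95_000},
        }

        for name, m in modes.items():
            duration = UNITS * m["unit_months"]
            balance, peak_deficit, npv = 0.0, 0.0, 0.0
            for unit in range(1, UNITS + 1):
                t = unit * m["unit_months"]                       # completion time of this unit
                balance += PAYMENT_PER_UNIT - m["unit_cost"]      # payment received, cost paid
                peak_deficit = min(peak_deficit, balance - PAYMENT_PER_UNIT)  # before the payment arrives
                npv += (PAYMENT_PER_UNIT - m["unit_cost"]) / (1 + MONTHLY_RATE) ** t
            price = UNITS * PAYMENT_PER_UNIT
            print(f"{name}: duration={duration:.1f} months, price=${price:,}, "
                  f"max working capital=${-peak_deficit:,.0f}, NPV=${npv:,.0f}")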

  11. Sustainable and non-conventional monitoring systems to mitigate natural hazards in low income economies: the 4onse project approach.

    Science.gov (United States)

    Cannata, Massimiliano; Ratnayake, Rangajeewa; Antonovic, Milan; Strigaro, Daniele

    2017-04-01

    Environmental monitoring systems in low-income countries are often in decline, outdated or missing, with the consequence that there is very scarce availability of and access to information that is vital for coping with and mitigating natural hazards. Non-conventional monitoring systems based on open technologies may constitute a viable solution for creating low-cost and sustainable monitoring systems that can be fully developed, deployed and maintained at the local level without lock-in dependences on copyrights or patents or high replacement costs. The 4onse research project, funded under the Research for Development programme of the Swiss National Science Foundation and the Swiss Office for Development and Cooperation, proposes a complete monitoring system that integrates Free & Open Source Software, Open Hardware, Open Data, and Open Standards. After its engineering, it will be tested in the Deduru Oya catchment (Sri Lanka) to evaluate the system and to develop a water management information system that optimizes the regulation of artificial basin levels and mitigates flash floods. One objective is to better understand, scientifically, the strengths, criticalities and applicability of the system in terms of data quality, system durability, management costs, performance and sustainability. Results, challenges and experiences from the first six months of the project will be presented, with particular focus on the activities of building synergies and on advances in the data collection and dissemination system.

  12. Accurately Diagnosing Uric Acid Stones from Conventional Computerized Tomography Imaging: Development and Preliminary Assessment of a Pixel Mapping Software.

    Science.gov (United States)

    Ganesan, Vishnu; De, Shubha; Shkumat, Nicholas; Marchini, Giovanni; Monga, Manoj

    2018-02-01

    Preoperative determination of uric acid stones from computerized tomography imaging would be of tremendous clinical use. We sought to design a software algorithm that could apply data from noncontrast computerized tomography to predict the presence of uric acid stones. Patients with pure uric acid and calcium oxalate stones were identified from our stone registry. Only stones greater than 4 mm which were clearly traceable from initial computerized tomography to final composition were included in analysis. A semiautomated computer algorithm was used to process image data. Average and maximum HU, eccentricity (deviation from a circle) and kurtosis (peakedness vs flatness) were automatically generated. These parameters were examined in several mathematical models to predict the presence of uric acid stones. A total of 100 patients, of whom 52 had calcium oxalate and 48 had uric acid stones, were included in the final analysis. Uric acid stones were significantly larger (12.2 vs 9.0 mm, p = 0.03) but calcium oxalate stones had higher mean attenuation (457 vs 315 HU, p = 0.001) and maximum attenuation (918 vs 553 HU, p uric acid stones. A combination of stone size, attenuation intensity and attenuation pattern from conventional computerized tomography can distinguish uric acid stones from calcium oxalate stones with high sensitivity and specificity. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
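
    As a rough illustration of how the reported CT features (size, mean and maximum attenuation, eccentricity, kurtosis) could be combined into a stone-type predictor, the sketch below fits a toy logistic model in Python. The feature values, labels and model are invented for illustration and are not the study's actual algorithm or coefficients.

```python
# Illustrative sketch: combining stone size, attenuation and attenuation-pattern
# features from noncontrast CT to separate uric acid from calcium oxalate stones.
# All feature values and the fitted model below are hypothetical examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: size_mm, mean_HU, max_HU, eccentricity, kurtosis
X = np.array([
    [12.0, 310, 540, 0.55, 2.1],   # uric acid-like profile (assumed)
    [13.5, 330, 560, 0.60, 2.3],
    [ 9.0, 460, 920, 0.70, 3.5],   # calcium oxalate-like profile (assumed)
    [ 8.5, 450, 900, 0.68, 3.2],
])
y = np.array([1, 1, 0, 0])         # 1 = uric acid, 0 = calcium oxalate

model = LogisticRegression().fit(X, y)
new_stone = np.array([[11.0, 320, 550, 0.58, 2.2]])
print("P(uric acid) =", model.predict_proba(new_stone)[0, 1])
```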

  13. Cross-border software development of health information system: A case study on project between India and Pakistan based on open source software

    OpenAIRE

    Sabir, Uzma

    2017-01-01

    Global software development is a phenomenon that has received considerable interest from researchers during the past two decades. Several challenges have been identified and approaches to deal with these challenges have been developed. Typically, Western companies outsource their projects to countries where costs are lower and skilled professionals are easily available. The majority of these projects are developed for commercial purposes. However, software development projects between India and Pakis...

  14. A CMMI-based approach for medical software project life cycle study.

    Science.gov (United States)

    Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi

    2013-01-01

    In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries in other nations. In addition, systematic development processes are indispensable elements of software development: they help developers increase their productivity and efficiency and avoid unnecessary risks arising during the development process. Thus, this paper presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. The application integrates the user requirements, system design and testing activities of the software development process into a three-layer (Domain, Concept and Instance) model. The model is then expressed in structured Systems Modeling Language (SysML) diagrams, converting part of the manual effort needed for project management maintenance into computational effort, for example (semi-)automatic traceability management. The application supports establishing the "requirement specification document", "project execution plan document", "system design document" and "system test document" artifacts, and delivers a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions developing medical information systems and supporting project management to achieve the aim of patient safety.
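
    To make the traceability idea concrete, the toy sketch below links requirements to design elements and test cases and reports coverage gaps. The artifact identifiers and structure are invented for illustration; the paper itself works with SysML models and LW-CMMI work products rather than this simplified mapping.

```python
# Simplified sketch of requirements traceability: link requirements to design
# elements and test cases so coverage can be checked automatically.
# All identifiers below are invented examples.
requirements = {"REQ-1": "acquire nuclear medicine image", "REQ-2": "export DICOM study"}
design = {"DES-1": ["REQ-1"], "DES-2": ["REQ-1", "REQ-2"]}   # design element -> covered requirements
tests  = {"TC-1": ["REQ-1"], "TC-2": ["REQ-2"]}              # test case -> covered requirements

def covered(req_id, artifacts):
    # A requirement is covered if any artifact links back to it.
    return any(req_id in linked for linked in artifacts.values())

for req in requirements:
    print(req,
          "design:", "ok" if covered(req, design) else "MISSING",
          "test:",   "ok" if covered(req, tests)  else "MISSING")
```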

  15. Radiation safety study for conventional facility and siting pre project phase of International Linear Collider

    International Nuclear Information System (INIS)

    Sanami, Toshiya; Ban, Syuichi; Sasaki, Shin-ichi

    2015-01-01

    The International Linear Collider (ILC) is a proposed high-energy collider consisting of two linear accelerators, two damping rings, electron and positron sources, and a single collision hall with two detectors. The total length and centre-of-mass energy of the ILC will be 31 km and 500 GeV, respectively (and 50 km and 1 TeV after a future upgrade). The design of the ILC has entered the pre-project phase, which includes site-dependent design. Radiation safety design for the ILC is ongoing as part of the conventional facility and siting activities of the pre-project phase. The thickness of the central wall of normal concrete is designed to be 3.5 m under a pessimistic assumption of beam loss. The beam loss scenario is under discussion. Experience and knowledge relating to shielding design and operational radiation control work at other laboratories are required. (authors)

  16. Software for improving the quality of project management, a case study: international manufacture of electrical equipment

    Science.gov (United States)

    Preradović, D. M.; Mićić, Lj S.; Barz, C.

    2017-05-01

    Production conditions in today’s world require software support at every stage of the production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual metrics of quality, companies today focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in PM (project management), there is still a significant number of projects, at the global level, that are failures: they do not achieve their goals within budget or on schedule. This paper examines the role of software tools through the rate of success of projects implemented in the case of an international manufacturer of electrical equipment. The results of this research show how much the project management software used to manage and develop new products contributes to improving PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and the share of successfully completed projects.

  17. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the work load. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request / project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  18. A View on a Successful International Educational Project in Software Engineering

    Directory of Open Access Journals (Sweden)

    Zoran Budimac

    2012-01-01

    Full Text Available In this paper, a successful and fruitful joint project is presented. The project joins participants from 9 countries and 15 universities. Since it started in 2001, this project, entitled “Software Engineering: Computer Science Education and Research Cooperation”, has helped participants to gain excellent, up-to-date educational material, apply modern teaching methods, exchange experiences with other participants, and work jointly on the further development of lectures, case studies, assignments, examination questions, and other necessary elements of a course. The project works under the auspices of the Stability Pact for South-Eastern Europe and is supported by DAAD. It started with the creation of a common introductory course in “Software Engineering”, but over time it grew and a number of other courses were developed. The courses in “Object-oriented Programming”, “Software Project Management”, “Advanced Compiler Construction”, and “Data Structures and Algorithms” are almost complete, and some other courses are under development. Aside from the educational collaboration, project members have also developed good scientific cooperation and published several research papers.

  19. Bill project authorizing the approval of the amendment to the convention on physical protection of nuclear material - Nr 11

    International Nuclear Information System (INIS)

    Ayrault, Jean-Marc; Fabius, Laurent

    2012-01-01

    This document contains the brief text of the bill project and the text of the amendment to the Convention on the physical protection of nuclear material, which was adopted in Vienna in August 2005 to amend the Convention adopted in October 1979. This amendment introduces the following measures: extension of the Convention's scope of application to nuclear materials used for peaceful purposes, definition of the objectives of the Convention, articulation of the Convention with other international instruments, definition of the main principles of physical protection, strengthened international cooperation, and legal issues concerning extradition and legal cooperation.

  20. Engineering a large application software project: the controls of the CERN PS accelerator complex

    International Nuclear Information System (INIS)

    Benincasa, G.P.; Daneels, A.; Heymans, P.; Serre, Ch.

    1985-01-01

    The CERN PS accelerator complex has been progressively converted to full computer controls without interrupting its full-time operation (more than 6000 hours per year with, on average, not more than 1% of the total downtime due to controls). The application software amounts to 120 man-years and 450'000 instructions; it compares with other large software projects, also outside the accelerator world, e.g. Skylab's ground support software. This paper outlines the application software structure, which takes into account technical requirements and constraints (resulting from the complexity of the process and its operation) as well as economic and managerial ones. It presents the engineering and management techniques used to promote implementation, testing and commissioning within budget, manpower and time constraints, and concludes with the experience gained

  1. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), the Likert scale, the GQM (Goal-Question-Metric) method as an indicator of software quality, and Boehm's project risk analysis model were used to assess service quality and support decision-making, according to the demand and requests for software development. With the aim of improving the quality of service provision, the application is used to integrate the team, to follow the life cycle of a project from its initial phase, and to assist in the comparison with the proposed schedule during requirements elicitation.

  2. Modelling the critical success factors of agile software development projects in South Africa

    Directory of Open Access Journals (Sweden)

    Tawanda B. Chiyangwa

    2017-10-01

    Full Text Available Background: The continued failure of agile and traditional software development projects has drawn consideration, attention and debate to the critical success factors, the aspects most vital to making a software engineering methodology fruitful. Although there is an increasing variety of critical success factors and methodologies, conceptual frameworks that capture their causal relationships are limited. Objective: The objective of this study was to identify and provide insights into the critical success factors that influence the success of software development projects using agile methodologies in South Africa. Method: A quantitative data collection method was used. Data were collected in South Africa through a Web-based survey using structured questionnaires. Results: The results show that organisational factors have a great influence on performance expectancy characteristics. Conclusion: The results of this study yielded a comprehensive model that could provide guidelines to the agile community and to agile professionals.

  3. A Constrained and Guided Approach for Managing Software Engineering Course Projects

    Science.gov (United States)

    Cheng, Y.-P.; Lin, J. M.-C.

    2010-01-01

    This paper documents several years of experimentation with a new approach to organizing and managing projects in a software engineering course. The initial failure and subsequent refinements that the new approach has been through since 2004 are described herein. The "constrained and guided" approach, as it is called, has helped to reduce…

  4. Software for relativistic atomic structure theory: The grasp project at oxford

    International Nuclear Information System (INIS)

    Parpia, F.A.; Grant, I.P.

    1991-01-01

    GRASP is an acronym for General-purpose Relativistic Atomic Structure Program. The objective of the GRASP project at Oxford is to produce user-friendly, state-of-the-art multiconfiguration Dirac-Fock (MCDF) software packages for relativistic atomic structure theory

  5. Enhance Learning on Software Project Management through a Role-Play Game in a Virtual World

    Science.gov (United States)

    Maratou, Vicky; Chatzidaki, Eleni; Xenos, Michalis

    2016-01-01

    This article presents a role-play game for software project management (SPM) in a three-dimensional online multiuser virtual world. The Opensimulator platform is used for the creation of an immersive virtual environment that facilitates students' collaboration and realistic interaction, in order to manage unexpected events occurring during the…

  6. Automated water analyser computer supported system (AWACSS) Part I: Project objectives, basic technology, immunoassay development, software design and networking.

    Science.gov (United States)

    Tschmelak, Jens; Proll, Guenther; Riedt, Johannes; Kaiser, Joachim; Kraemmer, Peter; Bárzaga, Luis; Wilkinson, James S; Hua, Ping; Hole, J Patrick; Nudd, Richard; Jackson, Michael; Abuknesha, Ram; Barceló, Damià; Rodriguez-Mozaz, Sara; de Alda, Maria J López; Sacher, Frank; Stien, Jan; Slobodník, Jaroslav; Oswald, Peter; Kozmenko, Helena; Korenková, Eva; Tóthová, Lívia; Krascsenits, Zoltan; Gauglitz, Guenter

    2005-02-15

    A novel analytical system, AWACSS (automated water analyser computer-supported system), based on immunochemical technology has been developed that can measure several organic pollutants at the low nanogram-per-litre level in a single few-minute analysis without any prior sample pre-concentration or pre-treatment steps. With the actual needs of water-sector managers in mind, related to the implementation of the Drinking Water Directive (DWD) (98/83/EC, 1998) and the Water Framework Directive (WFD) (2000/60/EC, 2000), drinking, ground, surface, and waste waters were the major media used for the evaluation of system performance. The instrument was equipped with remote control and surveillance facilities. The system's software allows for internet-based networking between the measurement and control stations, global management, trend analysis, and early-warning applications. The experience of water laboratories was utilised in the design of the instrument's hardware and software in order to make the system rugged and user-friendly. Several market surveys were conducted during the project to assess the applicability of the final system. A web-based AWACSS database was created for automated evaluation and storage of the obtained data in a format compatible with major databases of environmental organic pollutants in Europe. This first-part article gives the reader an overview of the aims and scope of the AWACSS project as well as details about the basic technology, immunoassays, software, and networking developed and utilised within the research project. The second-part article reports on system performance, first real-sample measurements, and an international collaborative trial (inter-laboratory tests) comparing the biosensor with conventional analytical methods.

  7. AN ENHANCED MODEL TO ESTIMATE EFFORT, PERFORMANCE AND COST OF THE SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    M. Pauline

    2013-04-01

    Full Text Available The authors have proposed a model that first captures the fundamentals of software metrics in phase 1, consisting of three primitive primary software engineering metrics: person-months (PM), function points (FP), and lines of code (LOC). Phase 2 consists of the proposed function point, which is obtained by grouping the adjustment factors to simplify the adjustment process and to ensure more consistency in the adjustments. In the proposed method, fuzzy logic is used for quantifying the quality of requirements, which is added as one of the adjustment factors; thus a fuzzy-based approach to the Enhanced General System Characteristics for estimating the effort of software projects using productivity has been obtained. Phase 3 takes the calculated function point and gives it as input to the static single-variable models (i.e. Intermediate COCOMO and COCOMO II) for cost estimation. The authors have tailored the cost factors in Intermediate COCOMO, and both cost and scale factors in COCOMO II, to suit the individual development environment, which is very important for the accuracy of the cost estimates. The software performance indicators (project duration, schedule predictability, requirements completion ratio and post-release defect density) are also measured for the software projects in this work. A comparative study of effort, performance measurement and cost estimation of the software projects is made between the existing model and the authors' proposed work. Thus our work analyzes the interactional process through which the estimation tasks were collectively accomplished.
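
    The chain described above (adjusted function points, conversion to lines of code, then a COCOMO-style effort equation) can be illustrated with a short worked sketch. All numeric inputs below (unadjusted function points, the adjustment value including a fuzzy requirements-quality factor, the LOC-per-FP expansion factor and the cost-driver settings) are assumed example values, not the authors' calibrated figures.

```python
# Worked sketch of the estimation chain: adjusted function points -> lines of
# code -> Intermediate COCOMO effort. All numbers are illustrative assumptions.
ufp = 250                 # unadjusted function points (assumed)
vaf = 1.07                # value adjustment factor, incl. a fuzzy requirements-quality factor (assumed)
afp = ufp * vaf           # adjusted function points

loc_per_fp = 53           # rough expansion factor for a 3GL such as Java (assumed)
kloc = afp * loc_per_fp / 1000.0

# Intermediate COCOMO, organic mode: effort = a * KLOC^b * EAF (person-months)
a, b = 3.2, 1.05
eaf = 1.15 * 0.91         # product of cost-driver multipliers, e.g. high complexity, good tools (assumed)
effort_pm = a * kloc ** b * eaf

print(f"Adjusted FP: {afp:.1f}, KLOC: {kloc:.1f}, Effort: {effort_pm:.1f} person-months")
```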

  8. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 1: Project summary

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (1 of 4) gives a summary of the original AMPS software system configuration, points out some of the problem areas in the original software design that this project is to address, and in the appendix collects all the bimonthly status reports. The purpose of AMPS is to provide a self reliant system to control the generation and distribution of power in the space station. The software in the AMPS breadboard can be divided into three levels: the operating environment software, the protocol software, and the station specific software. This project deals only with the operating environment software and the protocol software. The present station specific software will not change except as necessary to conform to new data formats.

  9. A data model of the Climate and Forecast metadata conventions (CF-1.6) with a software implementation (cf-python v2.1)

    Directory of Open Access Journals (Sweden)

    D. Hassell

    2017-12-01

    Full Text Available The CF (Climate and Forecast) metadata conventions are designed to promote the creation, processing, and sharing of climate and forecasting data using Network Common Data Form (netCDF) files and libraries. The CF conventions provide a description of the physical meaning of data and of their spatial and temporal properties, but they depend on the netCDF file encoding which can currently only be fully understood and interpreted by someone familiar with the rules and relationships specified in the conventions documentation. To aid in development of CF-compliant software and to capture with a minimal set of elements all of the information contained in the CF conventions, we propose a formal data model for CF which is independent of netCDF and describes all possible CF-compliant data. Because such data will often be analysed and visualised using software based on other data models, we compare our CF data model with the ISO 19123 coverage model, the Open Geospatial Consortium CF netCDF standard, and the Unidata Common Data Model. To demonstrate that this CF data model can in fact be implemented, we present cf-python, a Python software library that conforms to the model and can manipulate any CF-compliant dataset.
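
    As a small illustration of the kind of metadata the CF conventions attach to netCDF variables, the sketch below inspects a file with the generic netCDF4 Python library rather than cf-python itself; the file name and its variables are hypothetical.

```python
# Minimal sketch of inspecting the CF metadata that the data model formalises,
# using the generic netCDF4 library (not cf-python). The file name is hypothetical.
from netCDF4 import Dataset

with Dataset("example_cf_file.nc") as ds:
    for name, var in ds.variables.items():
        attrs = {a: var.getncattr(a) for a in var.ncattrs()}
        # CF attaches physical meaning through attributes such as
        # standard_name, units and cell_methods.
        print(name,
              attrs.get("standard_name", "<no standard_name>"),
              attrs.get("units", "<no units>"))
```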

  10. Project Cerberus: tobacco industry strategy to create an alternative to the Framework Convention on Tobacco Control.

    Science.gov (United States)

    Mamudu, Hadii M; Hammond, Ross; Glantz, Stanton A

    2008-09-01

    Between 1999 and 2001, British American Tobacco, Philip Morris, and Japan Tobacco International executed Project Cerberus to develop a global voluntary regulatory regime as an alternative to the Framework Convention on Tobacco Control (FCTC). They aimed to develop a global voluntary regulatory code to be overseen by an independent audit body and to focus attention on youth smoking prevention. The International Tobacco Products Marketing Standards announced in September 2001, however, did not have the independent audit body. Although the companies did not stop the FCTC, they continue to promote the International Tobacco Products Marketing Standards youth smoking prevention as an alternative to the FCTC. Public health civil society groups should help policymakers and governments understand the importance of not working with the tobacco industry.

  11. Effects of the Meetings-Flow Approach on Quality Teamwork in the Training of Software Capstone Projects

    Science.gov (United States)

    Chen, Chung-Yang; Hong, Ya-Chun; Chen, Pei-Chi

    2014-01-01

    Software development relies heavily on teamwork; determining how to streamline this collaborative development is an essential training subject in computer and software engineering education. A team process known as the meetings-flow (MF) approach has recently been introduced in software capstone projects in engineering programs at various…

  12. A Systematic Mapping on Supporting Approaches for Requirements Traceability in the Context of Software Projects

    Directory of Open Access Journals (Sweden)

    MALCHER, P R.C.

    2015-12-01

    Full Text Available Requirements traceability is seen as a quality factor in software development, and it is present in standards and quality models. In this context, several techniques, models, frameworks and tools have been used to support it. Thus, the purpose of this paper is to present a systematic mapping carried out to find, in the literature, approaches that support requirements traceability in the context of software projects, and to categorize the data found in order to demonstrate, by means of a reliable, accurate and auditable method, how this area has developed and what the main approaches used to implement it are.

  13. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered as part of the business strategy. The associated project parameters should be proactively managed, and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible; the focus should instead be on proactively managing and shifting left in the software life cycle engineering model: identify the problem up front in the project cycle rather than waiting for lessons to be learnt and taking reactive steps. This paper presents the practical applicability of predictive models and illustrates their use in a project to predict system testing defects, thus helping to reduce residual defects.

  14. A global conversation about energy from biomass: the continental conventions of the global sustainable bioenergy project

    Science.gov (United States)

    Lynd, Lee Rybeck; Aziz, Ramlan Abdul; de Brito Cruz, Carlos Henrique; Chimphango, Annie Fabian Abel; Cortez, Luis Augusto Barbosa; Faaij, Andre; Greene, Nathanael; Keller, Martin; Osseweijer, Patricia; Richard, Tom L.; Sheehan, John; Chugh, Archana; van der Wielen, Luuk; Woods, Jeremy; van Zyl, Willem Heber

    2011-01-01

    The global sustainable bioenergy (GSB) project was formed in 2009 with the goal of providing guidance with respect to the feasibility and desirability of sustainable, bioenergy-intensive futures. Stage 1 of this project held conventions with a largely common format on each of the world's continents, was completed in 2010, and is described in this paper. Attended by over 400 persons, the five continental conventions featured presentations, breakout sessions, and drafting of resolutions that were unanimously passed by attendees. The resolutions highlight the potential of bioenergy to make a large energy supply contribution while honouring other priorities, acknowledge the breadth and complexity of bioenergy applications as well as the need to take a systemic approach, and attest to substantial intra- and inter-continental diversity with respect to needs, opportunities, constraints and current practice relevant to bioenergy. The following interim recommendations based on stage 1 GSB activities are offered: (1) realize that it may be more productive, and also more correct, to view the seemingly divergent assessments of bioenergy as answers to two different questions rather than the same question; viewed in this light, there is considerably more scope for reconciliation than might first be apparent, and it is possible to be informed rather than paralysed by divergent assessments. (2) Develop established and advanced bioenergy technologies such that each contributes to the other's success; that is, support and deploy in the near term meritorious, established technologies in ways that enhance rather than impede deployment of advanced technologies, and support and deploy advanced technologies in ways that expand rather than contract opportunities for early adopters and investors. (3) Be clear in formulating policies what mix of objectives is being targeted, measure the results of these policies against these objectives, and beware of unintended consequences.

  15. Historical Post Office Directory Parser (POD Parser Software From the AddressingHistory Project

    Directory of Open Access Journals (Sweden)

    Nicola Osborne

    2014-07-01

    Full Text Available The POD Parser is Python software for parsing the OCR'd (optical character recognised) text of digitised historical Scottish Post Office Directories (PODs) to produce a consistent structured format for the data and to geocode each address. The software was developed as part of the AddressingHistory project, which sought to combine digitised historic directories with digitised and georeferenced historic maps. The software has potential for reuse in multiple research contexts where historical post office directory data are relevant, and is therefore particularly useful in historical research into social, economic or demographic trends. The POD Parser is currently designed for use with Scottish directories but is extensible, perhaps with some adaptation, for use with other similarly formatted materials such as the English trade directories.
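
    The sketch below gives a flavour of the parsing step: turning one OCR'd directory line into a structured record with Python's standard re module. The line format and regular expression are simplified assumptions for illustration, not the POD Parser's actual rules or field set.

```python
# Illustrative sketch of turning an OCR'd directory line into a structured
# record, in the spirit of the POD Parser. The entry format is assumed.
import re

ENTRY = re.compile(
    r"^(?P<surname>[^,]+),\s*(?P<forename>[^,]+),\s*(?P<occupation>[^,]+),\s*(?P<address>.+)$"
)

def parse_entry(line):
    # Return a dict of fields, or None if the line does not match the pattern.
    match = ENTRY.match(line.strip())
    return match.groupdict() if match else None

print(parse_entry("Smith, John, baker, 12 High Street, Edinburgh"))
# {'surname': 'Smith', 'forename': 'John', 'occupation': 'baker',
#  'address': '12 High Street, Edinburgh'}
```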

  16. The D2G2 project: a new software tool for nuclear engineering design in Canada

    International Nuclear Information System (INIS)

    Rheaume, P.; Lefebvre, J.F.; Roy, R.; Koclas, J.

    2004-01-01

    Nowadays, high-quality neutronic simulation codes are readily available. The open source software suite DRAGON/DONJON is a good example: it is free, it has proven quality and correctness over the years, and it is still developed and maintained at Ecole Polytechnique de Montreal. However, most simulation codes have the following weaknesses: limited usability, poor maintainability, no internal data standardization and poor portability. The D2G2 project is a software development initiative which aims to create an upper-layer software tool that eliminates the weaknesses of classic simulation codes. This paper presents D2G2Client's and D2G2Server's principal capabilities, how they interact and the libraries they use. (author)

  17. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    Science.gov (United States)

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  18. BMR: Benchmarking Metrics Recommender for Personnel issues in Software Development Projects

    Directory of Open Access Journals (Sweden)

    Angel Garcia-Crespo

    2009-12-01

    Full Text Available This paper presents an architecture which applies document similarity measures to the documentation produced during the phases of software development in order to generate recommendations of process and people metrics for similar projects. The application judges the similarity of the Service Provision Offer (SPO) document of a newly proposed project against a collection of Project History Documents (PHDs), stored in a repository of unstructured texts. The process is carried out in three stages: firstly, clustering of the offer document with the set of PHDs most similar to it; this provides the initial indication of whether similar previous projects exist, and signifies similarity. Secondly, determination of which PHD in the set is most comparable with the offer document, based on various parameters: project effort, project duration (time), project resources (members/size of team), costs, and sector(s) involved, indicating comparability of projects. The comparable parameters are extracted using the GATE Natural Language Processing architecture. Lastly, a recommendation of metrics for the new project is made, based on the transferability of the metrics of the most similar and comparable PHD extracted; this is referred to here as the recommendation.
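
    The first stage, judging similarity between the offer document and stored history documents, can be sketched with a standard TF-IDF and cosine-similarity comparison, as below. The toy documents are invented, and scikit-learn stands in for the richer GATE-based pipeline described in the paper.

```python
# Sketch of the document-similarity step: compare a new offer document against
# stored project history documents and pick the closest one. Toy data only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

history_docs = [
    "billing system, 10 developers, 14 months, banking sector",
    "mobile client for logistics tracking, 4 developers, 6 months",
    "data warehouse migration, 8 developers, 12 months, insurance sector",
]
offer_doc = "new billing platform for a retail bank, team of about 9, one year"

vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform(history_docs + [offer_doc])
offer_vec, history_mat = matrix[len(history_docs)], matrix[:len(history_docs)]

similarities = cosine_similarity(offer_vec, history_mat).ravel()
best = similarities.argmax()
print(f"Most similar history document: #{best} (score {similarities[best]:.2f})")
```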

  19. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in the implementation of this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features

  20. A microprocessor card software server to support the Quebec health microprocessor card project.

    Science.gov (United States)

    Durant, P; Bérubé, J; Lavoie, G; Gamache, A; Ardouin, P; Papillon, M J; Fortin, J P

    1995-01-01

    The Quebec Health Smart Card Project is advocating the use of a memory card software server[1] (SCAM) to implement a portable medical record (PMR) on a smart card. The PMR is viewed as an object that can be manipulated by SCAM's services; in fact, we can talk about a pseudo-object-oriented approach. This software architecture provides a flexible and evolutive way to manage and optimize the PMR. SCAM is a generic software server; it can manage smart cards as well as optical (laser) cards or other types of memory cards. In the specific case of the Quebec Health Card Project, however, SCAM is used to provide services between physicians' or pharmacists' software and IBM smart card technology. We propose to expose the concepts and techniques used to provide a generic environment for dealing with smart cards (and more generally with memory cards), to obtain a dynamic and evolutive PMR, to raise the system's global security level and data integrity, to optimize significantly the management of the PMR, and to provide statistical information about the use of the PMR.

  1. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world had contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.

  2. A Data Specification for Software Project Performance Measures: Results of a Collaboration on Performance Measurement

    Science.gov (United States)

    2008-07-01

    Life cycle: evolution of a system, product, service, project or other human-made entity from conception through retirement [ISO/IEC 12207:1995, Information technology - Software life cycle processes]. Logical line of... For definitions, authors were asked to use or align with already existing standards, such as those available through ISO and IEEE, when possible.

  3. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...

  4. Software pi/4 DQPSK Modem: A Student Project Using the TMS320-C6201 EVM Board

    OpenAIRE

    Weiss, S; Braithwaite, SJ; Stewart, RD

    2000-01-01

    This paper reports on a student project performed at the University of Southampton jointly by 4th year MEng students within the course "Advanced Radio Communications". The aim was to design a software modem capable of transmitting 16kb/s of data, whereby random number generation, advanced modulation, pulse shaping, synchronisation, and error counting techniques had to be applied. The ultimate aim was the implementation on a Texas Instruments TMS320-C6201 EVM board, which dictated some of the ...

  5. Waste receiving and processing facility module 1 data management system software project management plan

    International Nuclear Information System (INIS)

    Clark, R.E.

    1994-01-01

    This document provides the software development plan for the Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store, and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal

  6. Cone-Beam CT Angiography for Determination of Tumor-Feeding Vessels During Chemoembolization of Liver Tumors: Comparison of Conventional and Dedicated-Software Analysis.

    Science.gov (United States)

    Ronot, Maxime; Abdel-Rehim, Mohamed; Hakimé, Antoine; Kuoch, Viseth; Roux, Marion; Chiaradia, Mélanie; Vilgrain, Valérie; de Baere, Thierry; Deschamps, Frédéric

    2016-01-01

    To compare the ability of dedicated software and conventional cone-beam computed tomography (CT) analysis to identify tumor-feeding vessels in hypervascular liver tumors treated with chemoembolization. Between January 2012 and January 2013, 45 patients (32 men, mean age of 61 y; range, 27-85 y) were enrolled, and 66 tumors were treated (mean, 32 mm ± 18; range, 10-81 mm) with conventional chemoembolization with arterial cone-beam CT. Data were independently analyzed by six interventional radiologists with standard postprocessing software, a computer-aided analysis with FlightPlan for liver (FPFL; ie, "raw FPFL"), and a review of this computer-aided FPFL analysis ("reviewed FPFL"). Analyses were compared with a reference reading established by two study supervisors in consensus who had access to all imaging data. Sensitivities, positive predictive values (PPVs), and false-positive (FP) ratios were compared by McNemar, χ(2), and Fisher exact tests. Analysis durations were compared by Mann-Whitney test, and interreader agreement was assessed. Reference reading identified 179 feeder vessels. The sensitivity of raw FPFL was significantly higher than those of reviewed FPFL and conventional analyses (90.9% vs 83.2% and 82.1%; P software enabled a fast, accurate, and sensitive detection of tumor feeder vessels. Copyright © 2016 SIR. Published by Elsevier Inc. All rights reserved.

  7. The state and profile of open source software projects in health and medical informatics.

    Science.gov (United States)

    Janamanchi, Balaji; Katsamakas, Evangelos; Raghupathi, Wullianallur; Gao, Wei

    2009-07-01

    Little has been published about the application profiles and development patterns of open source software (OSS) in health and medical informatics. This study explores these issues with an analysis of health and medical informatics related OSS projects on SourceForge, a large repository of open source projects. A search was conducted on the SourceForge website during the period from May 1 to 15, 2007, to identify health and medical informatics OSS projects. This search resulted in a sample of 174 projects. A Java-based parser was written to extract data for several of the key variables of each project. Several visually descriptive statistics were generated to analyze the profiles of the OSS projects. Many of the projects have sponsors, implying a growing interest in OSS among organizations. Sponsorship, we discovered, has a significant impact on project success metrics. Nearly two-thirds of the projects have a restrictive license type. Restrictive licensing may indicate tighter control over the development process. Our sample includes a wide range of projects that are at various stages of development (status). Projects targeted towards the advanced end user are primarily focused on bio-informatics, data formats, database and medical science applications. We conclude that there exists an active and thriving OSS development community that is focusing on health and medical informatics. A wide range of OSS applications are in development, from bio-informatics to hospital information systems. A profile of OSS in health and medical informatics emerges that is distinct and unique to the health care field. Future research can focus on OSS acceptance and diffusion and impact on cost, efficiency and quality of health care.

  8. High precision flux measurements in conventional neutrino beams: the ENUBET project

    CERN Document Server

    Longhin, Andrea

    2017-01-01

    The challenges of precision neutrino physics require measurements of absolute neutrino cross sections at the GeV scale with exquisite (1%) precision. This precision is presently limited by the uncertainties on the neutrino flux at the source. A reduction of this uncertainty by one order of magnitude can be achieved by monitoring the positron production in the decay tunnel originating from the Ke3 decays of charged kaons in a sign- and momentum-selected narrow band beam. This novel technique enables the measurement of the cross sections most relevant for CP violation (νe and ν̄e) with a precision of 1% and requires a specially instrumented beam line. Such a non-conventional beam line will be developed in the framework of the ENUBET Horizon 2020 Consolidator Grant, recently approved by the European Research Council. We present the project, the first experimental results on ultra-compact calorimeters that can be embedded in the instrumented decay tunnel, and the advances in the simulation of the beam line. A r...

  9. Bill project authorizing the approval of the amendment to the Convention on the physical protection of nuclear materials

    International Nuclear Information System (INIS)

    Juppe, Alain; Fillon, Francois

    2011-01-01

    This document deals with the amendment to the Convention on the Physical Protection of Nuclear Materials, which was adopted in July 2005. This amendment notably extended the Convention's scope, objectives, relation with other international instruments and content (regarding cooperation, sanctions, and so on). After the text of this amendment, the document contains the bill project, which reports an impact study (estimated economic, financial, environmental, and legal consequences of implementing the amendment) and comments on the penal and criminal cooperation defined in the Convention. A table indicates the impact of the amendment's articles on French law

  10. Médicarte software developed for the Quebec microprocessor health card project.

    Science.gov (United States)

    Lavoie, G; Tremblay, L; Durant, P; Papillon, M J; Bérubé, J; Fortin, J P

    1995-01-01

    The Quebec Patient Smart Card Project is a Provincial Government initiative under the responsibility of the Régie de l'assurance-maladie du Québec (Quebec Health Insurance Board). Development, implementation, and assessment duties were assigned to a team from Université Laval, which in turn joined a group from the Direction de la santé publique du Bas-St-Laurent in Rimouski, where the experiment is taking place. The pilot project seeks to evaluate the use and acceptance of a microprocessor card as a way to improve the exchange of clinical information between card users and various health professionals. The card can be best described as a résumé containing information pertinent to an individual's health history. It is not a complete medical file; rather, it is a summary to be used as a starting point for a discussion between health professionals and patients. The target population is composed of persons 60 years and over, pregnant women, infants under 18 months, and the residents of a small town located in the target area, St-Fabien, regardless of age. The health professionals involved are general practitioners, specialists, pharmacists, nurses, and ambulance personnel. Participation in the project is on a voluntary basis. Each health care provider participating in the project has a personal identification number (PIN) and must use both an access card and a user card to access information. This prevents unauthorized access to a patient's card and allows the staff to sign and date information entered onto the patient card. To test the microprocessor card, we developed software based on a problem-oriented approach integrating diagnosis, investigations, treatments, and referrals. This software is not an expert system that constrains the clinician to a particular decisional algorithm. Instead, the software supports the physician in decision making. The software was developed with a graphical interface (Windows 3.1) to maximize its user-friendliness. A version of the

  11. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    Science.gov (United States)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
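
    One generic illustration of the kind of built-in flexibility such techniques aim for is keeping change-prone behaviour in data rather than in code, so a late requirement becomes a table entry instead of a logic rewrite. The sketch below is a hedged, generic example of that idiom; the command names are invented and it does not represent the LITE software's actual design.

```python
# Generic "design for change" idiom: a dispatch table keeps change-prone
# behaviour in data, so adding a command does not alter the control logic.
# The commands below are invented examples only.
def start_scan(params):
    return f"scanning with {params}"

def stop_scan(params):
    return "scan stopped"

COMMANDS = {
    "START": start_scan,
    "STOP": stop_scan,
    # A new requirement usually means adding one entry here,
    # not modifying the handling code below.
}

def handle(command, params=None):
    handler = COMMANDS.get(command)
    return handler(params) if handler else f"unknown command: {command}"

print(handle("START", {"duration_s": 30}))
```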

  12. Calculator: A Hardware Design, Math and Software Programming Project Base Learning

    Directory of Open Access Journals (Sweden)

    F. Criado

    2015-03-01

    Full Text Available This paper presents the implementation by students of a complex calculator in hardware. The project meets hardware design goals and also strongly motivates students to use competences learned in other subjects. The learning process associated with system design is hard enough, because the students have to deal with parallel execution, signal delay, synchronization … Then, to strengthen the knowledge of hardware design, a methodology such as project-based learning (PBL) is proposed. Moreover, it is also used to reinforce cross subjects like math and software programming. This methodology creates a course dynamic that is closer to a professional environment, where students work with software and mathematics to resolve the hardware design problems. The students design the functionality of the calculator from scratch: they make the decisions about the math operations it is able to resolve, the operand format, and how to introduce a complex equation into the calculator. This increases the students' intrinsic motivation. In addition, since the choices may have consequences for the reliability of the calculator, students are encouraged to program in software the decisions about how to implement the selected mathematical algorithm. Although math and hardware design are two tough subjects for students, the perception they get at the end of the course is quite positive.

  13. The Design and Development of a Computerized Tool Support for Conducting Senior Projects in Software Engineering Education

    Science.gov (United States)

    Chen, Chung-Yang; Teng, Kao-Chiuan

    2011-01-01

    This paper presents a computerized tool support, the Meetings-Flow Project Collaboration System (MFS), for designing, directing and sustaining the collaborative teamwork required in senior projects in software engineering (SE) education. Among many schools' SE curricula, senior projects serve as a capstone course that provides comprehensive…

  14. Exploring the role of instant messaging in a global software development project

    DEFF Research Database (Denmark)

    Dittrich, Y.; Giuffrida, Rosalba

    2011-01-01

    Communication plays a vital role in software development projects. Globally distributed teams use a mix of different communication channels to get the work done. In this paper, we report on an empirical study of a team distributed across Denmark and India. This paper explores the integration...... documentation. Our analysis provides an indication that IM can play a special role in such socio-technical communication systems: IM acts as a real-time glue between different channels. The communication through IM also provides a means to build trust and social relationships with co-workers....

  15. Global Software and IT A Guide to Distributed Development, Projects, and Outsourcing

    CERN Document Server

    Ebert, Christof

    2011-01-01

    Global software engineering, implying both internal and outsourced development, is a fast-growing scenario within industry; the growth rates in some sectors are more than 20% per year. However, half of all offshoring activities are cancelled within the first 2 years, at tremendous unanticipated cost to the organization.   This book will provide a more balanced framework for planning global development, covering topics such as managing people in distributed sites, managing a project across locations, mitigating the risk of offshoring, processes for global development, practical outsourcin

  16. Guidelines for the verification and validation of expert system software and conventional software: Volume 4, Evaluation of knowledge base certification methods. Final report

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-05-01

    The objective is the formulation of guidelines for the V ampersand V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of a behavioral experiment, the first known such evaluation of any V ampersand V activity; the value lies in the capability to provide empirical evidence for or against the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG ampersand G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V ampersand V methods selected as being the best and most appropriate. These methods either involved the analysis and tracing of requirements to elements in the knowledge base or direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V ampersand V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases (the experimental group). The four groups of participants were similar in nuclear engineering and software experience characteristics. It is concluded that the use of tools in static knowledge base certification results in significant improvement in detecting all types of defects, avoiding false alarms, and completing the effort in less time. The simulated knowledge-checking tool, based on supplemental engineering information about the systems

  17. In-field inspection support software: A status report on the Common Inspection On-site Software Package (CIOSP) project

    International Nuclear Information System (INIS)

    Novatchev, Dimitre; Titov, Pavel; Siradjov, Bakhtiiar; Vlad, Ioan; Xiao Jing

    2001-01-01

    Full text: IAEA has invested much thought and effort into developing software that can assist inspectors during their inspection work. Experience with such applications has been steadily growing, and IAEA has recently commissioned a next-generation software package. This kind of software must accommodate inspection tasks that can vary substantially in function depending on the type of installation being inspected, while ensuring that the resulting software package has a wide range of usability and precludes excessive development of plant-specific applications. The Common Inspection On-site Software Package is being developed in the Department of Safeguards to address the limitations of the existing software and to expand its coverage of the inspection process. CIOSP is 'common' in that it is aimed at providing support for as many facilities as possible with minimum re-configuration. At the same time it has to cater to the varying needs of individual facilities and the different instrumentation and verification methods used. A component-based approach was taken to successfully tackle the challenges that the development of this software presented. CIOSP consists of the following major components: A framework into which individual plug-ins supporting various inspection activities can integrate at run-time; A central data store containing all facility configuration data and all data collected during inspections; A local data store, which resides on the inspector's computer, where the current inspection's data is stored; A set of services used by all plug-ins (i.e. data transformation, authentication, replication services etc.). This architecture allows for incremental development and extension of the software with plug-ins that support individual inspection activities. The core set of components has been developed, along with the framework and the Inventory Verification, Book Examination, and Records and Reports Comparison plug-ins. The development of the Short Notice Random

  18. JULIA: calculation projection software for primary barriers shielding to X-Rays using barite

    International Nuclear Information System (INIS)

    Silva, Júlia R.A.S. da; Vieira, José W.; Lima, Fernando R. A.

    2017-01-01

    The objective was to develop software to calculate the thicknesses required to attenuate X-rays at tube voltages of 60 kV, 80 kV, 110 kV and 150 kV. The conventional methodological parameters for structural shielding calculations established by the NCRP (National Council on Radiation Protection and Measurements) were presented. Descriptive and exploratory methods guided the construction of JULIA. Based on the results obtained, the tool is useful for professionals who wish to design structural shielding in radiodiagnostic and/or therapy facilities. Implementing the calculations in a computational tool provides accessibility, saves time, and yields estimates close to real values. This exercise improves the calculation of primary barriers with barite
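
    As a rough illustration of the kind of calculation an NCRP-style shielding tool performs, the sketch below derives a primary-barrier thickness from a target transmission factor using a single tenth-value-layer (TVL) model, x = TVL * log10(1/B) with B = P*d^2/(W*U*T). The TVL values, the lumped workload input and the single-TVL simplification are placeholder assumptions for illustration only; they are not the data or the algorithm implemented in JULIA.

      import math

      # Illustrative only: placeholder TVL values (mm of barite mortar) per tube voltage.
      # These numbers are NOT taken from JULIA or NCRP data; substitute validated values.
      TVL_MM = {60: 6.0, 80: 10.0, 110: 15.0, 150: 20.0}


      def barrier_thickness_mm(kvp, design_goal_mgy_per_week, workload_mgy_m2_per_week,
                               use_factor, occupancy, distance_m):
          """Primary barrier thickness from a simple single-TVL attenuation model.

          Required transmission: B = P * d^2 / (W * U * T)
          Thickness:             x = TVL * log10(1 / B)
          """
          transmission = (design_goal_mgy_per_week * distance_m ** 2) / (
              workload_mgy_m2_per_week * use_factor * occupancy)
          if transmission >= 1.0:
              return 0.0  # distance alone already meets the design goal
          return TVL_MM[kvp] * math.log10(1.0 / transmission)


      if __name__ == "__main__":
          # e.g. 150 kV room, design goal 0.1 mGy/week, assumed workload, U and T inputs
          print(round(barrier_thickness_mm(150, 0.1, 100.0, 1.0, 1.0, 3.0), 1), "mm")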

  19. Dynamic Staffing and Rescheduling in Software Project Management: A Hybrid Approach.

    Directory of Open Access Journals (Sweden)

    Yujia Ge

    Full Text Available Resource allocation can be influenced by various dynamic elements, such as the skills of engineers and the growth of those skills, which requires managers to find an effective and efficient tool to support their staffing decision-making processes. Rescheduling happens commonly and frequently during project execution: control decisions have to be made when new resources are added or tasks are changed. In this paper we propose a software project staffing model that considers dynamic elements of staff productivity, with an optimizer based on a Genetic Algorithm (GA) and Hill Climbing (HC). Since a newly generated reschedule that differs dramatically from the initial schedule can cause a significant increase in shifting costs, our rescheduling strategies consider both efficiency and stability. The results of real-world case studies and extensive simulation experiments show that the proposed method is effective and achieves performance comparable to other heuristic algorithms in most cases.
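
    The hybrid optimizer named in this record combines a Genetic Algorithm with Hill Climbing. The sketch below shows that pattern on a toy task-to-engineer assignment problem; the representation, fitness function, productivity figures and GA parameters are invented for illustration and do not reproduce the authors' staffing model.

      import random

      random.seed(42)

      TASK_EFFORT = [8, 5, 13, 3, 7, 9]          # person-days per task (toy data)
      PRODUCTIVITY = [1.0, 1.3, 0.8]             # relative productivity of 3 engineers


      def makespan(assignment):
          """Project duration = busiest engineer's total (effort / productivity)."""
          load = [0.0] * len(PRODUCTIVITY)
          for task, eng in enumerate(assignment):
              load[eng] += TASK_EFFORT[task] / PRODUCTIVITY[eng]
          return max(load)


      def hill_climb(assignment):
          """Greedy local search: move single tasks while it keeps improving."""
          best = list(assignment)
          improved = True
          while improved:
              improved = False
              for task in range(len(best)):
                  for eng in range(len(PRODUCTIVITY)):
                      cand = list(best)
                      cand[task] = eng
                      if makespan(cand) < makespan(best):
                          best, improved = cand, True
          return best


      def genetic_algorithm(pop_size=30, generations=40):
          pop = [[random.randrange(len(PRODUCTIVITY)) for _ in TASK_EFFORT]
                 for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=makespan)
              parents = pop[: pop_size // 2]
              children = []
              while len(children) < pop_size - len(parents):
                  a, b = random.sample(parents, 2)
                  cut = random.randrange(1, len(TASK_EFFORT))
                  child = a[:cut] + b[cut:]                      # one-point crossover
                  if random.random() < 0.2:                      # mutation
                      child[random.randrange(len(child))] = random.randrange(len(PRODUCTIVITY))
                  children.append(child)
              pop = parents + children
          return hill_climb(min(pop, key=makespan))              # GA result refined by HC


      if __name__ == "__main__":
          best = genetic_algorithm()
          print("assignment:", best, "makespan:", round(makespan(best), 2))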

  20. Management of radiodiagnostic equipment: Implementation of self-maintenance project of the conventional x-ray equipment of Hospital Universitario Clementino Fraga Filho - HUCFF-UFRJ

    International Nuclear Information System (INIS)

    Couto, N.F. do; Azevedo, A.C.P.; Koch, H.A.

    2001-01-01

    The project aims at implementing a management program for the maintenance of the conventional X-ray equipment at HUCFF. It has been implemented through the training of the electronic technicians who work at the hospital. Essential courses were organized, such as Basics of Radiation Protection, Radiographic Techniques, and Maintenance of X-ray Equipment. Equipment: a library with the schematics of the equipment is being assembled in collaboration with UNICAMP. In order to manage the process, software was created using total quality tools for maintenance control. Preliminary tests: the equipment and its working conditions were evaluated, as well as the level of the employees' satisfaction with its use. The creation of a new maintenance routine seeks to meet the demands of the new legislation in Brazil and to reduce costs while improving the quality of the images in the Radiodiagnostic Service. (author)

  1. Requirement Volatility, Standardization and Knowledge Integration in Software Projects: An Empirical Analysis on Outsourced IS Development Projects

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-08-01

    Full Text Available Information systems development (ISD) projects are highly complex, with different groups of people having to collaborate and exchange their knowledge. Considering the intensity of knowledge exchange that takes place in outsourced ISD projects, in this study a conceptual model was developed, aiming to examine the influence of four antecedents, i.e. standardization, requirement volatility, internal integration, and external integration, on two dependent variables, i.e. process performance and product performance. Data were collected from 46 software companies in four big cities in Indonesia. The collected data were examined to verify the proposed theoretical model using the partial least square structural equation modeling (PLS-SEM) technique. The results show that process performance is significantly influenced by internal integration and standardization, while product performance is significantly influenced by external integration and requirement volatility. This study contributes to a better understanding of how knowledge integration can be managed in outsourced ISD projects in view of increasing their success.

  2. A Novel Markerless Technique to Evaluate Daily Lung Tumor Motion Based on Conventional Cone-Beam CT Projection Data

    International Nuclear Information System (INIS)

    Yang Yin; Zhong Zichun; Guo Xiaohu; Wang Jing; Anderson, John; Solberg, Timothy; Mao Weihua

    2012-01-01

    Purpose: In this study, we present a novel markerless technique, based on cone beam computed tomography (CBCT) raw projection data, to evaluate lung tumor daily motion. Method and Materials: The markerless technique, which uses raw CBCT projection data and locates tumors directly on every projection, consists of three steps. First, the tumor contour on the planning CT is used to create digitally reconstructed radiographs (DRRs) at every projection angle. Two sets of DRRs are created: one showing only the tumor, and another with the complete anatomy without the tumor. Second, a rigid two-dimensional image registration is performed to register the DRR set without the tumor to the CBCT projections. After the registration, the projections are subtracted from the DRRs, resulting in a projection dataset containing primarily tumor. Finally, a second registration is performed between the subtracted projection and the tumor-only DRR. The methodology was evaluated using a chest phantom containing a moving tumor, and retrospectively in 4 lung cancer patients treated by stereotactic body radiation therapy. Tumors detected on projection images were compared with those from three-dimensional (3D) and four-dimensional (4D) CBCT reconstruction results. Results: Phantom results, both static and moving, demonstrate that the accuracy is within 1 mm. The subsequent application to 22 sets of CBCT scan raw projection data from 4 lung cancer patients includes about 11,000 projections, with the detected tumor locations consistent with 3D and 4D CBCT reconstruction results. This technique reveals detailed lung tumor motion and provides more information than conventional 4D images. Conclusion: This technique is capable of accurately characterizing lung tumor motion on a daily basis based on a conventional CBCT scan. It provides daily verification of the tumor motion to ensure that these motions are within prior estimation and covered by the treatment planning volume.
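
    The three-step procedure (tumor-only and tumor-free DRR generation, registration and subtraction of the tumor-free DRR from each projection, then matching the residual against the tumor-only DRR) can be sketched in a simplified 2D form. The code below uses synthetic images and a brute-force translational search in place of the rigid 2D registration, so it only illustrates the structure of the method, not the authors' implementation.

      import numpy as np


      def make_image(shape, tumor_center=None):
          """Synthetic 'projection': smooth anatomy plus an optional bright tumor blob."""
          y, x = np.mgrid[0:shape[0], 0:shape[1]]
          img = 0.3 * np.exp(-((x - shape[1] / 2) ** 2) / 800.0)   # stand-in for anatomy
          if tumor_center is not None:
              cy, cx = tumor_center
              img += np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 20.0)
          return img


      def best_shift(reference, image, search=10):
          """Brute-force translational registration minimizing sum of squared differences."""
          best, best_score = (0, 0), np.inf
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  shifted = np.roll(np.roll(image, dy, axis=0), dx, axis=1)
                  score = np.sum((shifted - reference) ** 2)
                  if score < best_score:
                      best, best_score = (dy, dx), score
          return best


      # Step 1: DRRs from the planning CT, one without the tumor and one tumor-only.
      drr_no_tumor = make_image((64, 64))
      drr_tumor_only = make_image((64, 64), tumor_center=(32, 32)) - make_image((64, 64))

      # A measured projection: same anatomy, tumor displaced by respiration.
      projection = make_image((64, 64), tumor_center=(36, 30))

      # Step 2: register the tumor-free DRR to the projection, then subtract it,
      # leaving a residual image that contains primarily the tumor.
      dy, dx = best_shift(projection, drr_no_tumor)
      residual = projection - np.roll(np.roll(drr_no_tumor, dy, axis=0), dx, axis=1)

      # Step 3: register the tumor-only DRR to the residual to locate the tumor.
      print("estimated tumor displacement (rows, cols):", best_shift(residual, drr_tumor_only))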

  3. A survey of quality assurance practices in biomedical open source software projects.

    Science.gov (United States)

    Koru, Günes; El Emam, Khaled; Neisa, Angelica; Umarji, Medha

    2007-05-07

    Open source (OS) software is continuously gaining recognition and use in the biomedical domain, for example, in health informatics and bioinformatics. Given the mission critical nature of applications in this domain and their potential impact on patient safety, it is important to understand to what degree and how effectively biomedical OS developers perform standard quality assurance (QA) activities such as peer reviews and testing. This would allow the users of biomedical OS software to better understand the quality risks, if any, and the developers to identify process improvement opportunities to produce higher quality software. A survey of developers working on biomedical OS projects was conducted to examine the QA activities that are performed. We took a descriptive approach to summarize the implementation of QA activities and then examined some of the factors that may be related to the implementation of such practices. Our descriptive results show that 63% (95% CI, 54-72) of projects did not include peer reviews in their development process, while 82% (95% CI, 75-89) did include testing. Approximately 74% (95% CI, 67-81) of developers did not have a background in computing, 80% (95% CI, 74-87) were paid for their contributions to the project, and 52% (95% CI, 43-60) had PhDs. A multivariate logistic regression model to predict the implementation of peer reviews was not significant (likelihood ratio test = 16.86, 9 df, P = .051) and neither was a model to predict the implementation of testing (likelihood ratio test = 3.34, 9 df, P = .95). Less attention is paid to peer review than testing. However, the former is a complementary, and necessary, QA practice rather than an alternative. Therefore, one can argue that there are quality risks, at least at this point in time, in transitioning biomedical OS software into any critical settings that may have operational, financial, or safety implications. Developers of biomedical OS applications should invest more effort
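
    The percentages above are reported with normal-approximation 95% confidence intervals. A minimal sketch of that calculation is shown below, using the peer-review figure (63%) as an example; the sample size of 110 is an assumption chosen only so that the resulting interval is roughly as wide as the reported one (54-72%).

      import math


      def proportion_ci(p_hat, n, z=1.96):
          """Normal-approximation (Wald) 95% confidence interval for a proportion."""
          se = math.sqrt(p_hat * (1.0 - p_hat) / n)
          return p_hat - z * se, p_hat + z * se


      # 63% of projects did no peer reviews; n = 110 is an assumed sample size,
      # used here only to illustrate the calculation, not taken from the survey.
      low, high = proportion_ci(0.63, 110)
      print(f"95% CI: {100 * low:.0f}% - {100 * high:.0f}%")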

  4. Software Configuration Management Plan for the K West Basin Integrated Water Treatment System (IWTS) - Project A.9

    International Nuclear Information System (INIS)

    GREEN, J.W.

    2000-01-01

    This document provides a configuration control plan for the software associated with the operation and control of the Integrated Water Treatment System (IWTS). It establishes requirements for ensuring configuration item identification, configuration control, configuration status accounting, defect reporting and resolution of computer software. It is written to comply with HNF-SD-SNF-CM-001, Spent Nuclear Fuel Configuration Management Plan (Forehand 1998) and HNF-PRO-309 Computer Software Quality Assurance Requirements, and applicable sections of administrative procedure CM-6-037-00, SNF Project Process Automation Software and Equipment

  5. The ALPS project release 2.0: open source software for strongly correlated systems

    International Nuclear Information System (INIS)

    Bauer, B; Gamper, L; Gukelberger, J; Hehn, A; Isakov, S V; Ma, P N; Mates, P; Carr, L D; Evertz, H G; Feiguin, A; Freire, J; Koop, D; Fuchs, S; Gull, E; Guertler, S; Igarashi, R; Matsuo, H; Parcollet, O; Pawłowski, G; Picon, J D

    2011-01-01

    We present release 2.0 of the ALPS (Algorithms and Libraries for Physics Simulations) project, an open source software project to develop libraries and application programs for the simulation of strongly correlated quantum lattice models such as quantum magnets, lattice bosons, and strongly correlated fermion systems. The code development is centered on common XML and HDF5 data formats, libraries to simplify and speed up code development, common evaluation and plotting tools, and simulation programs. The programs enable non-experts to start carrying out serial or parallel numerical simulations by providing basic implementations of the important algorithms for quantum lattice models: classical and quantum Monte Carlo (QMC) using non-local updates, extended ensemble simulations, exact and full diagonalization (ED), the density matrix renormalization group (DMRG) both in a static version and a dynamic time-evolving block decimation (TEBD) code, and quantum Monte Carlo solvers for dynamical mean field theory (DMFT). The ALPS libraries provide a powerful framework for programmers to develop their own applications, which, for instance, greatly simplify the steps of porting a serial code onto a parallel, distributed memory machine. Major changes in release 2.0 include the use of HDF5 for binary data, evaluation tools in Python, support for the Windows operating system, the use of CMake as build system and binary installation packages for Mac OS X and Windows, and integration with the VisTrails workflow provenance tool. The software is available from our web server at http://alps.comp-phys.org/

  6. Guidelines for the verification and validation of expert system software and conventional software: Evaluation of knowledge base certification methods. Volume 4

    International Nuclear Information System (INIS)

    Miller, L.A.; Hayes, J.E.; Mirsky, S.M.

    1995-03-01

    This report presents the results of the Knowledge Base Certification activity of the expert systems verification and validation (V ampersand V) guideline development project which is jointly funded by the US Nuclear Regulatory Commission and the Electric Power Research Institute. The ultimate objective is the formulation of guidelines for the V ampersand V of expert systems for use in nuclear power applications. This activity is concerned with the development and testing of various methods for assuring the quality of knowledge bases. The testing procedure used was that of behavioral experiment, the first known such evaluation of any type of V ampersand V activity. The value of such experimentation is its capability to provide empirical evidence for -- or against -- the effectiveness of plausible methods in helping people find problems in knowledge bases. The three-day experiment included 20 participants from three nuclear utilities, the Nuclear Regulatory Commission's Technical Training Center, the University of Maryland, EG ampersand G Idaho, and SAIC. The study used two real nuclear expert systems: a boiling water reactor emergency operating procedures tracking system and a pressurized water reactor safety assessment system. Ten participants were assigned to each of the expert systems. All participants were trained in and then used a sequence of four different V ampersand V methods selected as being the best and most appropriate for study on the basis of prior evaluation activities. These methods either involved the analysis and tracing of requirements to elements in the knowledge base (requirements grouping and requirements tracing) or else involved direct inspection of the knowledge base for various kinds of errors. Half of the subjects within each system group used the best manual variant of the V ampersand V methods (the control group), while the other half were supported by the results of applying real or simulated automated tools to the knowledge bases

  7. Status report of the SRT radiotelescope control software: the DISCOS project

    Science.gov (United States)

    Orlati, A.; Bartolini, M.; Buttu, M.; Fara, A.; Migoni, C.; Poppi, S.; Righini, S.

    2016-08-01

    The Sardinia Radio Telescope (SRT) is a 64-m fully-steerable radio telescope. It is provided with an active surface to correct for gravitational deformations, allowing observations from 300 MHz to 100 GHz. At present, three receivers are available: a coaxial LP-band receiver (305-410 MHz and 1.5-1.8 GHz), a C-band receiver (5.7-7.7 GHz) and a 7-feed K-band receiver (18-26.5 GHz). Several back-ends are also available in order to perform the different data acquisition and analysis procedures requested by scientific projects. The design and development of the SRT control software started in 2004, and now belongs to a wider project called DISCOS (Development of the Italian Single-dish COntrol System), which provides a common infrastructure to the three Italian radio telescopes (Medicina, Noto and SRT dishes). DISCOS is based on the Alma Common Software (ACS) framework, and currently consists of more than 500k lines of code. It is organized in a common core and three specific product lines, one for each telescope. Recent developments, carried out after the conclusion of the technical commissioning of the instrument (October 2013), consisted in the addition of several new features in many parts of the observing pipeline, spanning from the motion control to the digital back-ends for data acquisition and data formatting; we briefly describe such improvements. More importantly, in the last two years we have supported the astronomical validation of the SRT radio telescope, leading to the opening of the first public call for proposals in late 2015. During this period, while assisting both the engineering and the scientific staff, we massively employed the control software and were able to test all of its features: in this process we received our first feedback from the users and we could verify how the system performed in a real-life scenario, drawing the first conclusions about the overall system stability and performance. We examine how the system behaves in terms of network

  8. Effects of Using Requirements Catalogs on Effectiveness and Productivity of Requirements Specification in a Software Project Management Course

    Science.gov (United States)

    Fernández-Alemán, José Luis; Carrillo-de-Gea, Juan Manuel; Meca, Joaquín Vidal; Ros, Joaquín Nicolás; Toval, Ambrosio; Idri, Ali

    2016-01-01

    This paper presents the results of two educational experiments carried out to determine whether the process of specifying requirements (catalog-based reuse as opposed to conventional specification) has an impact on effectiveness and productivity in co-located and distributed software development environments. The participants in the experiments…

  9. The State of Open Source Electronic Health Record Projects: A Software Anthropology Study.

    Science.gov (United States)

    Alsaffar, Mona; Yellowlees, Peter; Odor, Alberto; Hogarth, Michael

    2017-02-24

    Electronic health records (EHR) are a key tool in managing and storing patients' information. Currently, there are over 50 open source EHR systems available. Functionality and usability are important factors for determining the success of any system. These factors are often a direct reflection of the domain knowledge and developers' motivations. However, few published studies have focused on the characteristics of free and open source software (F/OSS) EHR systems and none to date have discussed the motivation, knowledge background, and demographic characteristics of the developers involved in open source EHR projects. This study analyzed the characteristics of prevailing F/OSS EHR systems and aimed to provide an understanding of the motivation, knowledge background, and characteristics of the developers. This study identified F/OSS EHR projects on SourceForge and other websites from May to July 2014. Projects were classified and characterized by license type, downloads, programming languages, spoken languages, project age, development status, supporting materials, top downloads by country, and whether they were "certified" EHRs. Health care F/OSS developers were also surveyed using an online survey. At the time of the assessment, we uncovered 54 open source EHR projects, but only four of them had been successfully certified under the Office of the National Coordinator for Health Information Technology (ONC Health IT) Certification Program. In the majority of cases, the open source EHR software was downloaded by users in the United States (64.07%, 148,666/232,034), underscoring that there is a significant interest in EHR open source applications in the United States. A survey of EHR open source developers was conducted and a total of 103 developers responded to the online questionnaire. The majority of EHR F/OSS developers (65.3%, 66/101) are participating in F/OSS projects as part of a paid activity and only 25.7% (26/101) of EHR F/OSS developers are, or have been

  10. Management of Globally Distributed Software Development Projects in Multiple-Vendor Constellations

    Science.gov (United States)

    Schott, Katharina; Beck, Roman; Gregory, Robert Wayne

    Global information systems development outsourcing is an apparent trend that is expected to continue in the foreseeable future. IS-related services are increasingly provided not only from different geographical sites simultaneously but also by multiple service providers based in different countries. The purpose of this paper is to understand how the involvement of multiple service providers affects the management of globally distributed information systems development projects. As research on this topic is scarce, we applied an exploratory in-depth single-case study design as our research approach. The case we analyzed comprises a global software development outsourcing project initiated by a German bank together with several globally distributed vendors. For data collection and data analysis we adopted techniques suggested by the grounded theory method. Whereas the extant literature points out the increased management overhead associated with multi-sourcing, the analysis of our case suggests that the effort required to manage global outsourcing projects with multiple vendors depends, among other things, on the maturity of the cooperation within the vendor portfolio. Furthermore, our data indicate that this cooperation maturity is positively influenced by knowledge about the client derived from pre-existing client-vendor relationships. The paper concludes by offering theoretical and practical implications.

  11. Development and Engineering Design in Support of "Rover Ranch": A K-12 Outreach Software Project

    Science.gov (United States)

    Pascali, Raresh

    2003-01-01

    Continuing the initial development started in the summer of 1999, the work performed in support of the 'ROVer Ranch' project during the present fellowship dealt with concrete concept implementation and resolution of the related issues. The original work performed last summer focused on the initial examination and articulation of the concept treatment strategy and on audience and market analysis for the learning technologies software. The present work focused on finalizing the set of parts to be made available for building an AERCam Sprint type robot and on defining, testing and implementing the process necessary to convert the design engineering files to VRML files. Through reverse engineering, an initial set of mission-critical systems was designed for beta testing in schools. The files were created in ProEngineer, exported to VRML 1.0 and converted to VRML 97 (VRML 2.0) for final integration in the software. Attributes for each part were assigned using an in-house developed Java-based program. The final set of attributes for each system, their mutual interaction and the identification of the relevant ones to be tracked still remain to be decided.

  12. R PROJECT: SU APLICACIÓN COMO SOFTWARE LIBRE PARA ANÁLISIS EN COMPONENTES PRINCIPALES

    Directory of Open Access Journals (Sweden)

    Fabricio Bolaños Guerrero

    2011-01-01

    Full Text Available This article is the product of a research project carried out in collaboration with professors from the School of Mathematics of the University of Costa Rica (UCR), presenting a statistical software option called R Project. With this package it is possible to perform Principal Component Analysis (PCA) and to represent the results using the principal plane and the correlation circle, as tools for a better interpretation of the data in the table (individuals and variables). The R software is freely distributed, its implementation is simple, and it does not require major computing resources. Among its many applications is PCA, a tool used to interpret the information presented in a table of quantitative data; researchers from different areas therefore have an economical and simple option for performing data analysis. Two PCA examples are worked through, showing a possible use of the tool and giving step-by-step instructions on how to carry it out.

  13. Project maturity evaluation model for SMEs from the software development sub-sector

    Directory of Open Access Journals (Sweden)

    ÁLVARO JULIO CUADROS LÓPEZ

    Full Text Available The purpose of the paper is to present a project management maturity model for SMEs oriented to software development. The proposal is based on the CMMI capability maturity model and the SCAMPI evaluation method. The proposal includes a quantitative satisfaction scale, redundant evidence assessment, and multiple criteria for selecting experts. The proposal was validated with a case study carried out in a medium-sized company from the Information and Communications Technology sector. The model concluded that the company did not reach maturity level 2; however, it showed that 92% of the processes from maturity level 2 and 77% of the total processes had already been implemented, which allows the company to adopt a specific orientation for its improvement efforts.

  14. ProjectQ: an open source software framework for quantum computing

    Directory of Open Access Journals (Sweden)

    Damian S. Steiger

    2018-01-01

    Full Text Available We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through simulation and enables running them on actual quantum hardware using a back-end connecting to the IBM Quantum Experience cloud service. Through extension mechanisms, users can provide back-ends to further quantum hardware, and scientists working on quantum compilation can provide plug-ins for additional compilation, optimization, gate synthesis, and layout strategies.
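
    Because ProjectQ is a Python-embedded domain-specific language, a complete program is only a few lines. The example below follows the basic pattern from the ProjectQ documentation (allocate a qubit, apply gates with the | syntax, measure, flush); it runs on the default simulator back-end rather than the IBM Quantum Experience back-end mentioned in the record.

      from projectq import MainEngine
      from projectq.ops import H, Measure

      # Compiler engine with the default simulator back-end.
      eng = MainEngine()

      qubit = eng.allocate_qubit()   # allocate one qubit
      H | qubit                      # put it into superposition
      Measure | qubit                # measure in the computational basis

      eng.flush()                    # send the circuit to the back-end
      print("Measured:", int(qubit))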

  15. Sustainability in Open Source Software Commons: Lessons Learned from an Empirical Study of SourceForge Projects

    OpenAIRE

    Charles M. Schweik

    2013-01-01

    In this article, we summarize a five-year US National Science Foundation funded study designed to investigate the factors that lead some open source projects to ongoing collaborative success while many others become abandoned. Our primary interest was to conduct a study that was closely representative of the population of open source software projects in the world, rather than focus on the more-often studied, high-profile successful cases. After building a large database of projects (n=174,33...

  16. Open source projects as incubators of innovation: From niche phenomenon to integral part of the software industry

    OpenAIRE

    Schrape, Jan-Felix

    2017-01-01

    Over the last 20 years, open source development has become an integral part of the software industry and a key component of the innovation strategies of all major IT providers. Against this backdrop, this paper seeks to develop a systematic overview of open source communities and their socio-economic contexts. I begin with a reconstruction of the genesis of open source software projects and their changing relationships to established IT companies. This is followed by the identification of f...

  17. Projeto Seis Sigma para a implementação de software de programação Six Sigma project for scheduling software implementation

    Directory of Open Access Journals (Sweden)

    Rogério Cerávolo Calia

    2005-12-01

    Full Text Available The article analyzes the organizational effectiveness of the Six Sigma methodology in managing a project to reduce manufacturing delays and inventory through the implementation of software based on Theory of Constraints algorithms. The article first presents a literature review on project management from the perspective of managing organizational change in business processes. It then reviews the concepts of the Six Sigma methodology for project management and of the Theory of Constraints algorithms. Two case studies of Theory of Constraints software implementation projects are then described, only one of which used the Six Sigma methodology for project management. In the analysis of the results, the article discusses why the project managed with the Six Sigma methodology reduced inventory three times faster than the project without it.

  18. A Software Engineering Paradigm for Quick-turnaround Earth Science Data Projects

    Science.gov (United States)

    Moore, K.

    2016-12-01

    As is generally the case with applied science professional and educational programs, the participants of such programs can come from a variety of technical backgrounds. In the NASA DEVELOP National Program, the participants constitute an interdisciplinary set of backgrounds, with varying levels of experience with computer programming. DEVELOP makes use of geographically explicit data sets, and it is necessary to use geographic information systems and geospatial image processing environments. As data sets cover longer time spans and include more complex sets of parameters, automation is becoming an increasingly prevalent feature. Though platforms such as ArcGIS, ERDAS Imagine, and ENVI facilitate the batch-processing of geospatial imagery, these environments are naturally constraining in that they limit the user to the tools that are available. Users must then turn to "homemade" scripting in more traditional programming languages such as Python, JavaScript, or R to automate workflows. However, in the context of quick-turnaround projects like those in DEVELOP, the programming learning curve may be prohibitively steep. In this work, we consider how best to design a software development paradigm that addresses two major constraints: an arbitrarily experienced programmer and quick-turnaround project timelines.
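
    One paradigm that fits the quick-turnaround, mixed-experience setting described here is to isolate the per-scene science step in a single small function and let a fixed, generic driver handle file discovery and batch execution. The sketch below uses only the Python standard library; the directory layout and the placeholder processing step are assumptions, not part of the DEVELOP workflow itself.

      from pathlib import Path

      # Assumed layout: a folder of GeoTIFF scenes to process and an output folder.
      INPUT_DIR = Path("scenes")
      OUTPUT_DIR = Path("processed")


      def process_scene(scene_path):
          """Placeholder for the per-scene science step (e.g. computing an index).

          A participant only ever edits this function; the driver below stays fixed.
          """
          return f"processed {scene_path.name}"


      def main():
          OUTPUT_DIR.mkdir(exist_ok=True)
          for scene in sorted(INPUT_DIR.glob("*.tif")):
              result = process_scene(scene)
              (OUTPUT_DIR / f"{scene.stem}.txt").write_text(result)
              print(result)


      if __name__ == "__main__":
          main()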

  19. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen [Department of Radiation Oncology, Medical College of Wisconsin, Milwaukee, Wisconsin 53226 (United States)

    2016-04-15

    Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from MR-Linac, and validating the delivery record consistency with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose–volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
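
    A flavor of the consistency checking described for ArtQA (reading planning parameters from one source and comparing them with what the record-and-verify database reports) is sketched below. The JSON plan format, the DSN, and the table and column names are invented for illustration; the abstract does not describe ArtQA's actual data model, and pyodbc is used here only as a generic ODBC access mechanism.

      # Sketch of a plan-vs-record consistency check in the spirit of ArtQA.
      # DSN, table and column names are invented for illustration.
      import json
      import pyodbc


      def read_plan_beams(plan_file):
          """Assume a plan exported as JSON: {"beams": [{"id": ..., "mu": ...}, ...]}."""
          with open(plan_file) as fh:
              return {b["id"]: b["mu"] for b in json.load(fh)["beams"]}


      def read_rv_beams(dsn="RecordAndVerify"):
          """Retrieve the scheduled monitor units per beam from the R&V database."""
          with pyodbc.connect(f"DSN={dsn}") as conn:
              rows = conn.execute("SELECT beam_id, monitor_units FROM scheduled_beams").fetchall()
          return {row.beam_id: row.monitor_units for row in rows}


      def compare(plan, record, tolerance=0.5):
          """Report beams whose MU differ between plan and R&V by more than `tolerance`."""
          problems = []
          for beam_id, mu in plan.items():
              rv_mu = record.get(beam_id)
              if rv_mu is None or abs(rv_mu - mu) > tolerance:
                  problems.append((beam_id, mu, rv_mu))
          return problems


      if __name__ == "__main__":
          mismatches = compare(read_plan_beams("plan.json"), read_rv_beams())
          for beam_id, plan_mu, rv_mu in mismatches:
              print(f"beam {beam_id}: plan MU {plan_mu} vs R&V MU {rv_mu}")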

  20. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    Science.gov (United States)

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from MR-Linac, and validating the delivery record consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.

  1. Radiation dose optimisation for conventional imaging in infants and newborns using automatic dose management software: an application of the new 2013/59 EURATOM directive.

    Science.gov (United States)

    Alejo, L; Corredoira, E; Sánchez-Muñoz, F; Huerga, C; Aza, Z; Plaza-Núñez, R; Serrada, A; Bret-Zurita, M; Parrón, M; Prieto-Areyano, C; Garzón-Moll, G; Madero, R; Guibelalde, E

    2018-04-09

    Objective: The new 2013/59 EURATOM Directive (ED) demands dosimetric optimisation procedures without undue delay. The aim of this study was to optimise paediatric conventional radiology examinations applying the ED without compromising clinical diagnosis. Automatic dose management software (ADMS) was used to analyse 2678 studies of children from birth to 5 years of age, obtaining local diagnostic reference levels (DRLs) in terms of entrance surface air kerma. Given that the local DRL for infant chest examinations exceeded the European Commission (EC) DRL, an optimisation was performed by decreasing the kVp and applying automatic exposure control. To assess image quality, an analysis of high-contrast spatial resolution (HCSR), signal-to-noise ratio (SNR) and figure of merit (FOM) was performed, as well as a blind test based on the generalised estimating equations method. For newborn chest examinations, the local DRL exceeded the EC DRL by 113%. After the optimisation, a reduction of 54% was obtained. No significant differences were found in the image quality blind test. A decrease in SNR (-37%) and HCSR (-68%), and an increase in FOM (42%), was observed. ADMS allows the fast calculation of local DRLs and the performance of optimisation procedures in babies without delay. However, physical and clinical analyses of image quality are still needed to ensure diagnostic integrity after the optimisation process. Advances in knowledge: ADMS is useful for detecting radiation protection problems and performing optimisation procedures in paediatric conventional imaging without undue delay, as the ED requires.

  2. Software proyecto y presupuesto de una investigación en salud A software project and estimate of a health research

    Directory of Open Access Journals (Sweden)

    Manuel Piloto Morejón

    2011-03-01

    Full Text Available Introduction: Preparing a research project is a difficult task because of the diversity of basic knowledge of research methodology, biostatistics and economics needed to address its different sections. Objective: to guarantee the adequate preparation of research projects at any level and the uniform, simplified calculation of the budget for all research projects in the province. Material and methods: software was developed to help researchers prepare a research project, based on the official guide for CITMA branch projects (as established by Ministerial Resolution 110/2009), and to automate the numerous mathematical calculations of a research project's budget from the primary economic data. Results: the software requires no installation. It is a useful instrument for any researcher, as it is easy to handle. It concentrates the essentials in a simple way through its User Manual. It is applicable and small in size (KB), reliable and efficient, easy to use and maintain, and has social, technological, scientific and economic benefits. This software is used by 100% of the researchers in the province for branch, territorial and institutional projects, as well as for specialty and master's degree final theses.

  3. The MEDA Project: Developing Evaluation Competence in the Training Software Domain.

    Science.gov (United States)

    Machell, Joan; Saunders, Murray

    1992-01-01

    The MEDA (Methodologie d'Evaluation des Didacticiels pour les Adultes) tool is a generic instrument to evaluate training courseware. It was developed for software designers to improve products, for instructors to select appropriate courseware, and for distributors and consultants to match software to client needs. Describes software evaluation…

  4. Application of Open Source Software by the Lunar Mapping and Modeling Project

    Science.gov (United States)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen depended on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata. This decision was in part based on our use of Federal Geographic Data Committee (FGDC) metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on
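
    As an illustration of the thin RESTful service approach mentioned in this record, the snippet below queries a Solr core over its standard HTTP select interface using the requests library. The host, core and field names are placeholders, not the actual LMMP endpoints.

      import requests

      # Placeholder endpoint: a Solr core holding product metadata (not the real LMMP URL).
      SOLR_SELECT = "http://localhost:8983/solr/lunar_products/select"


      def search_products(query, rows=10):
          """Run a keyword query against the Solr select handler and return the JSON docs."""
          params = {"q": query, "rows": rows, "wt": "json"}
          response = requests.get(SOLR_SELECT, params=params, timeout=10)
          response.raise_for_status()
          return response.json()["response"]["docs"]


      if __name__ == "__main__":
          for doc in search_products("crater"):
              print(doc.get("id"), doc.get("title"))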

  5. Non conventional psychiatric rehabilitation in schizophrenia using therapeutic riding: the FISE multicentre Pindar project

    Directory of Open Access Journals (Sweden)

    Stefania Cerino

    2011-12-01

    Full Text Available The FISE (Federazione Italiana Sport Equestri) Pindar project is a multicentre research project aimed at testing the potential effects of therapeutic riding on schizophrenic patients. Twenty-four subjects with a diagnosis of schizophrenia were enrolled for a 1-year treatment involving therapeutic riding sessions. All subjects were tested at the beginning and at the end of treatment with a series of validated test batteries (BPRS and 8-item PANSS). The results discussed in this paper point out an improvement in negative symptoms, stable disease remission in both early-onset and chronic disease subjects, as well as a reduced rate of hospitalization.

  6. Challenges in Mentoring Software Development Projects in the High School: Analysis According to Shulman's Teacher Knowledge Base Model

    Science.gov (United States)

    Meerbaum-Salant, Orni; Hazzan, Orit

    2009-01-01

    This paper focuses on challenges in mentoring software development projects in the high school and analyzes difficulties encountered by Computer Science teachers in the mentoring process according to Shulman's Teacher Knowledge Base Model. The main difficulties that emerged from the data analysis belong to the following knowledge sources of…

  7. An Investigation of the Relationships between Goals and Software Project Escalation: Insights from Goal Setting and Goal Orientation Theories

    Science.gov (United States)

    Lee, Jong Seok

    2013-01-01

    Escalation of commitment is manifested as a behavior in which an individual resists withdrawing from a failing course of action despite negative feedback, and it is an enduring problem that occurs in a variety of situations, including R&D investment decisions and software project overruns. To date, a variety of theoretical explanations have…

  8. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  9. Clinical software development for the Web: lessons learned from the BOADICEA project

    OpenAIRE

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-01-01

    Background: In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use...

  10. Software engineering with application-specific languages

    Science.gov (United States)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
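
    To make the idea of an application-specific language concrete, here is a toy example: a three-command camera-pointing language and a small Python interpreter for it. The language is invented purely for illustration and is not one of the ASLs used on the Space Shuttle projects.

      # A toy application-specific language for a pan/tilt camera:
      # each line is "<command> [argument]", with only three commands defined.
      SCRIPT = """
      pan 30
      tilt -10
      snapshot
      """


      def run(script):
          state = {"pan": 0, "tilt": 0, "snapshots": 0}
          for lineno, line in enumerate(script.strip().splitlines(), start=1):
              parts = line.split()
              if not parts:
                  continue
              cmd, args = parts[0], parts[1:]
              if cmd == "pan":
                  state["pan"] += int(args[0])
              elif cmd == "tilt":
                  state["tilt"] += int(args[0])
              elif cmd == "snapshot":
                  state["snapshots"] += 1
              else:
                  raise SyntaxError(f"line {lineno}: unknown command {cmd!r}")
          return state


      if __name__ == "__main__":
          print(run(SCRIPT))   # {'pan': 30, 'tilt': -10, 'snapshots': 1}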

  11. Types of treatment collaboration between conventional and alternative practitioners – results from a research project at a Danish MS hospital

    Directory of Open Access Journals (Sweden)

    Lasse Skovgaard

    2010-12-01

    Full Text Available Introduction: More than 50% of the People with Multiple Sclerosis (PwMS) in Denmark use alternative treatment. Most of them combine alternative and conventional treatment, but PwMS often find that there is no dialogue, coordination or synergy between the parallel courses of treatment offered. For this reason the Danish Multiple Sclerosis Society conducted a research project to develop and examine different models for collaboration between conventional and alternative treatment providers. Materials and methods: The empirical material consists of individual interviews with practitioners, a group interview with practitioners, a group interview with professional staff at the Danish MS hospital that provided the organisational framework for the project, interviews with patients, as well as written responses from participating treatment providers in connection with the practitioner-researcher seminars held. Results: Collaboration between researchers and the treatment team resulted in the development and examination of several models which describe the strengths and weaknesses of various types of collaboration. The models also show that the various types of collaboration place different requirements on the degree of (1) mutual acknowledgement and understanding among practitioners, (2) flexibility and resources in the organizational framework, and (3) patients' activities and own efforts, respectively. Perspectives: The relationship between integration and pluralism can contribute to a fruitful discussion in regards to the value of treatment collaboration. In addition to the many positive perspectives that characterise the integration of different treatment modalities, the project points to the importance of not overlooking the opportunities, values and potential inherent in a pluralistic ideal in the form of patients' own active efforts and the dynamism that can arise when the patient becomes a co-informant, co-coordinator and/or co-integrator.

  12. Types of treatment collaboration between conventional and alternative practitioners – results from a research project at a Danish MS hospital.

    Science.gov (United States)

    Skovgaard, Lasse; Haahr, Niels; Bjerre, Liv; Launsø, Laila

    2010-12-23

    More than 50% of People with Multiple Sclerosis (PwMS) in Denmark use alternative treatment. Most of them combine alternative and conventional treatment, but PwMS often find that they engage in parallel courses of treatment between which there is no dialogue, coordination or synergy. For this reason the Danish Multiple Sclerosis Society conducted a research project to develop and examine different models for collaboration between conventional and alternative treatment providers. The empirical material consisted of 10 individual interviews with practitioners, a group interview with practitioners, a group interview with professional staff at the Danish Multiple Sclerosis hospital that provided the organisational framework for the project, interviews with 59 patients and written responses from participating treatment providers in connection with 29 practitioner-researcher seminars held during the period 2004-2010. Collaboration between researchers and the treatment team resulted in the development and examination of several models which describe the strengths and weaknesses of various types of collaboration. The models show that the various types of collaboration place different requirements on the degree of 1) mutual acknowledgement and understanding among practitioners and 2) flexibility and resources in the organizational framework. The analyses also point to the fact that the degree of patient activity must be considered in relation to a given type of collaboration. The relationship between integration and pluralism can contribute to a fruitful discussion in regards to the value of treatment collaboration. In addition to the many positive perspectives that characterise integration of different treatment modalities the project points to the importance of not overlooking the opportunities, values and potential inherent in a pluralistic ideal in the form of patients' own active efforts and the dynamism that can arise when the patient becomes a co-informant, co

  13. Enhancements and Extensions of Formal Models for Risk Assessment in Software Projects

    Science.gov (United States)

    2002-09-01

  14. Development and evaluation of a digital dental modeling method based on grating projection and reverse engineering software.

    Science.gov (United States)

    Zhou, Qin; Wang, Zhenzhen; Chen, Jun; Song, Jun; Chen, Lu; Lu, Yi

    2016-01-01

    For reasons of convenience and economy, attempts have been made to transform traditional dental gypsum casts into 3-dimensional (3D) digital casts. Different scanning devices have been developed to generate digital casts; however, each has its own limitations and disadvantages. The purpose of this study was to develop an advanced method for the 3D reproduction of dental casts by using a high-speed grating projection system and noncontact reverse engineering (RE) software and to evaluate the accuracy of the method. The methods consisted of 3 main steps: the scanning and acquisition of 3D dental cast data with a high-resolution grating projection system, the reconstruction and measurement of digital casts with RE software, and the evaluation of the accuracy of this method using 20 dental gypsum casts. The common anatomic landmarks were measured directly on the gypsum casts with a Vernier caliper and on the 3D digital casts with the Geomagic software measurement tool. Data were statistically assessed with the t test. The grating projection system had a rapid scanning speed, and smooth 3D dental casts were obtained. The mean differences between the gypsum and 3D measurements were approximately 0.05 mm, and no statistically significant differences were found between the 2 methods (P>.05), except for the measurements of the incisor tooth width and maxillary arch length. A method for the 3D reconstruction of dental casts was developed by using a grating projection system and RE software. The accuracy of the casts generated using the grating projection system was comparable with that of the gypsum casts. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
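
    The comparison between caliper measurements on the gypsum casts and digital measurements on the 3D casts is a paired comparison, which in Python can be run as shown below with scipy.stats. The numbers are invented example values, not measurements from the study.

      from scipy import stats

      # Invented example values (mm) for one landmark, NOT data from the study:
      # the same distance measured with a caliper on gypsum casts and digitally.
      gypsum = [8.10, 7.95, 8.30, 8.05, 8.20, 7.90]
      digital = [8.15, 7.98, 8.27, 8.11, 8.23, 7.94]

      t_stat, p_value = stats.ttest_rel(gypsum, digital)
      print(f"paired t = {t_stat:.2f}, P = {p_value:.3f}")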

  15. Metadata database and data analysis software for the ground-based upper atmospheric data developed by the IUGONET project

    Science.gov (United States)

    Hayashi, H.; Tanaka, Y.; Hori, T.; Koyama, Y.; Shinbori, A.; Abe, S.; Kagitani, M.; Kouno, T.; Yoshida, D.; Ueno, S.; Kaneda, N.; Yoneda, M.; Tadokoro, H.; Motoba, T.; Umemura, N.; Iugonet Project Team

    2011-12-01

    The Inter-university Upper atmosphere Global Observation NETwork (IUGONET) is a Japanese inter-university project by the National Institute of Polar Research (NIPR), Tohoku University, Nagoya University, Kyoto University, and Kyushu University to build a database of metadata for ground-based observations of the upper atmosphere. The IUGONET institutes/universities have been collecting various types of data by radars, magnetometers, photometers, radio telescopes, helioscopes, etc. at various locations all over the world and at various altitude layers from the Earth's surface to the Sun. The metadata database will be of great help to researchers in efficiently finding and obtaining these observational data spread over the institutes/universities. This should also facilitate synthetic analysis of multi-disciplinary data, which will lead to new types of research in the upper atmosphere. The project has also been developing software to help researchers download, visualize, and analyze the data provided from the IUGONET institutes/universities. The metadata database system is built on the platform of DSpace, which is an open source software for digital repositories. The data analysis software is written in the IDL language with the TDAS (THEMIS Data Analysis Software suite) library. These products have just been released for beta-testing.

  17. Two-step web-mining approach to study geology/geophysics-related open-source software projects

    Science.gov (United States)

    Behrends, Knut; Conze, Ronald

    2013-04-01

    Geology/geophysics is a highly interdisciplinary science, overlapping with, for instance, physics, biology and chemistry. In today's software-intensive work environments, geoscientists often encounter new open-source software from scientific fields that are only remotely related to their own field of expertise. We show how web-mining techniques can help to carry out systematic discovery and evaluation of such software. In a first step, we downloaded ~500 abstracts (each consisting of ~1 kb UTF-8 text) from agu-fm12.abstractcentral.com. This web site hosts the abstracts of all publications presented at AGU Fall Meeting 2012, the world's largest annual geology/geophysics conference. All abstracts belonged to the category "Earth and Space Science Informatics", an interdisciplinary label cross-cutting many disciplines such as "deep biosphere", "atmospheric research", and "mineral physics". Each publication was represented by a highly structured record with ~20 short data attributes, the largest being the unstructured "abstract" field. We processed the abstract texts with the statistics software "R" to build a corpus and a term-document matrix. Using the R package "tm", we applied text-mining techniques to filter data and develop hypotheses about software-development activities happening in various geology/geophysics fields. Analyzing the term-document matrix with basic techniques (e.g., word frequencies, co-occurrences, weighting) as well as more complex methods (clustering, classification), we extracted several key pieces of information. For example, text-mining can be used to identify scientists who are also developers of open-source scientific software, and the names of their programming projects and codes can also be identified. In a second step, based on the intermediate results found by processing the conference-abstracts, any new hypotheses can be tested in another web-mining subproject: by merging the dataset with open data from github
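
    The authors worked in R with the tm package; an analogous corpus-to-term-document-matrix step, with word-frequency counts, can be sketched in Python using scikit-learn. The abstract snippets below are made up and stand in for the downloaded AGU abstracts:

    ```python
    # Analogous sketch of the corpus -> term-document matrix -> word frequency step;
    # the study itself used R and the "tm" package on real AGU abstracts.
    import numpy as np
    from sklearn.feature_extraction.text import CountVectorizer

    abstracts = [
        "We present an open-source Python tool for borehole data visualization.",
        "A new seismic processing code was released on GitHub under the GPL.",
        "Atmospheric model output is analyzed with community-developed software.",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    tdm = vectorizer.fit_transform(abstracts)            # documents x terms matrix

    # Overall term frequencies across the corpus, sorted descending
    freqs = np.asarray(tdm.sum(axis=0)).ravel()
    terms = vectorizer.get_feature_names_out()
    for term, freq in sorted(zip(terms, freqs), key=lambda x: -x[1])[:10]:
        print(term, freq)
    ```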

  18. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    Science.gov (United States)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  19. Cost Control and Performance Review of Software Projects by Using the Earned Value Management

    Directory of Open Access Journals (Sweden)

    Felician ALECU

    2014-08-01

    Full Text Available EVM (Earned Value Management) is a method that can be successfully used to measure the performance of a project from the cost and schedule points of view. Initially developed for US government programs in the 1960s, it later became an important feature of any modern project management practice thanks to its simplicity and efficiency in signaling project anomalies in time. EVM became extremely popular because it can be applied equally to any project in any industry.
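
    The core EVM quantities reduce to simple arithmetic on three inputs: planned value (PV), earned value (EV) and actual cost (AC). A minimal sketch of the usual variances and indices, with hypothetical figures:

    ```python
    # Earned Value Management basics with hypothetical figures (currency units are arbitrary).
    pv = 100_000   # Planned Value: budgeted cost of work scheduled to date
    ev = 80_000    # Earned Value: budgeted cost of work actually performed
    ac = 95_000    # Actual Cost: what the performed work really cost

    cv  = ev - ac            # Cost Variance     (negative -> over budget)
    sv  = ev - pv            # Schedule Variance (negative -> behind schedule)
    cpi = ev / ac            # Cost Performance Index     (< 1 -> over budget)
    spi = ev / pv            # Schedule Performance Index (< 1 -> behind schedule)

    bac = 250_000            # Budget At Completion (hypothetical)
    eac = bac / cpi          # one common Estimate At Completion formula

    print(f"CV={cv}, SV={sv}, CPI={cpi:.2f}, SPI={spi:.2f}, EAC={eac:,.0f}")
    ```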

  20. High-school software development project helps increasing students' awareness of geo-hydrological hazards and their risks

    Science.gov (United States)

    Marchesini, Ivan; Rossi, Mauro; Balducci, Vinicio; Salvati, Paola; Guzzetti, Fausto; Bianchini, Andrea; Grzeleswki, Emanuell; Canonico, Andrea; Coccia, Rita; Fiorucci, Gianni Mario; Gobbi, Francesca; Ciuchetti, Monica

    2015-04-01

    In Italy, inundation and landslides are widespread phenomena that impact the population and cause significant economic damage to private and public properties. The perception of the risk posed by these natural geo-hydrological hazards varies geographically and in time. The variation in the perception of the risks has negative consequences on risk management, and limits the adoption of effective risk reduction strategies. We maintain that targeted education can foster the understanding of geo-hydrological hazards, improving their perception and the awareness of the associated risk. Collaboration of a research center experienced in geo-hydrological hazards and risks (CNR IRPI, Perugia) and a high school (ITIS Alessandro Volta, Perugia) has resulted in the design and execution of a project aimed at improving the perception of geo-hydrological risks in high school students and teachers through software development. In the two-year project, students, high school teachers and research scientists have jointly developed software broadly related to landslide and flood hazards. User requirements and system specifications were decided to facilitate the distribution and use of the software among students and their peers. This allowed a wider distribution of the project results. We discuss two software prototypes developed by the high school students, including an application of augmented reality for improved dissemination of information on landslides and floods with human consequences in Italy, and a crowd science application to allow students (and others, including their families and friends) to collect information on landslide and flood occurrence exploiting modern mobile devices. This information can prove important, e.g., for the validation of landslide forecasting models.

  1. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  2. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    Full Text Available This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager handles challenges and risks that are predominantly encountered in business and research areas that involve state of the art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation orientated IT projects is enunciated.

  3. Technology survey of computer software as applicable to the MIUS project

    Science.gov (United States)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  4. Cross Sectional Study of Agile Software Development Methods and Project Performance

    Science.gov (United States)

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  5. Evaluating Managerial Styles for System Development Life Cycle Stages to Ensure Software Project Success

    Science.gov (United States)

    Kocherla, Showry

    2012-01-01

    Information technology (IT) projects are considered successful if they are completed on time, within budget, and within scope. Even though, the required tools and methodologies are in place, IT projects continue to fail at a higher rate. Current literature lacks explanation for success within the stages of system development life-cycle (SDLC) such…

  6. The Activity-Based Computing Project - A Software Architecture for Pervasive Computing Final Report

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind

    ... Special attention should be drawn to publication [25], which gives an overview of the ABC project to the IEEE Pervasive Computing community; the ACM CHI 2006 [19] paper that documents the implementation of the ABC technology; and the ACM ToCHI paper [12], which is the main publication of the project ..., documenting all of the project's four objectives. All of these publication venues are top-tier journals and conferences within computer science. From a business perspective, the project had the objective of incorporating relevant parts of the ABC technology into the products of Medical Insight, which has been ... done. Moreover, partly based on the research done in the ABC project, the company Cetrea A/S has been founded, which incorporates ABC concepts and technologies in its products. The concepts of activity-based computing have also been researched in cooperation with IBM Research, and the ABC project has ...

  7. SITEGI Project: Applying Geotechnologies to Road Inspection. Sensor Integration and software processing

    Directory of Open Access Journals (Sweden)

    J. Martínez-Sánchez

    2013-10-01

    Full Text Available Infrastructure management represents a critical economic milestone. The current decision-making process in infrastructure rehabilitation is essentially based on qualitative parameters obtained from visual inspections and subject to the ability of technicians. In order to increase both efficiency and productivity in infrastructure management, this work addresses the integration of different instrumentation and sensors in a mobile mapping vehicle. This vehicle allows the continuous recording of quantitative data suitable for roadside inspection. The geometric integration and synchronization of these sensors are achieved through hardware and/or software strategies that permit the georeferencing of the data obtained with each sensor. In addition, visualization software for simpler data management was implemented using the Qt framework, the PCL library and C++. As a result, the developed system supports decision-making in road inspection, providing quantitative information suitable for sophisticated analysis systems.

  8. Configuration management plan. System definition and project development. Repository Based Software Engineering (RBSE) program

    Science.gov (United States)

    Mckay, Charles

    1991-01-01

    This is the configuration management Plan for the AdaNet Repository Based Software Engineering (RBSE) contract. This document establishes the requirements and activities needed to ensure that the products developed for the AdaNet RBSE contract are accurately identified, that proposed changes to the product are systematically evaluated and controlled, that the status of all change activity is known at all times, and that the product achieves its functional performance requirements and is accurately documented.

  9. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    ... or maximizing the system reliability subject to budget constraints. These kinds of optimization problems have been considered in both deterministic and stochastic frameworks in the literature. Recently, the intuitionistic-fuzzy optimization approach was considered as a successful soft computing modelling approach. ... Firstly, a review of existing soft computing approaches to optimization is given. The main section extends the results, considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures, which proved ...
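
    As a much-simplified crisp (non-fuzzy) illustration of the underlying allocation problem, one can maximize the reliability of a series system by deciding how to spend a fixed budget across components. The cost-reliability relation, the numbers and the use of a standard nonlinear solver below are assumptions for illustration; the paper itself treats an intuitionistic-fuzzy formulation solved with self-organizing migrating algorithms:

    ```python
    # Crisp toy version of reliability allocation: maximize series-system reliability
    # subject to a budget constraint. All numbers and the cost-reliability model are invented.
    import numpy as np
    from scipy.optimize import minimize

    r0 = np.array([0.70, 0.80, 0.75])   # baseline component reliabilities (assumed)
    k  = np.array([0.50, 0.30, 0.40])   # assumed improvement rate per unit of spend
    budget = 10.0

    def reliability(c):
        r = 1.0 - (1.0 - r0) * np.exp(-k * c)   # each component improves with spend c_i
        return np.prod(r)                        # series system: product of reliabilities

    res = minimize(
        lambda c: -reliability(c),               # maximize by minimizing the negative
        x0=np.full(3, budget / 3),
        bounds=[(0.0, budget)] * 3,
        constraints=[{"type": "ineq", "fun": lambda c: budget - c.sum()}],
        method="SLSQP",
    )

    print("allocation:", np.round(res.x, 2),
          "system reliability:", round(reliability(res.x), 4))
    ```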

  10. Supervision software for string 2 magnet test facility of large hadron collider project

    International Nuclear Information System (INIS)

    Mayya, Y.S.; Sanadhya, Vivek; Lal, Pradeep; Goel, Vijay; Mukhopadhyay, S.; Saha, Shilpi

    2001-01-01

    The Supervisory Control and Data Acquisition (SCADA) software for the String 2 test facility at CERN, Geneva is developed by BARC under the framework of CERN-DAE collaboration for LHC. The supervision application is developed using PCVue32 SCADA/MMI software. The String 2 test facility prototypes one full cell of LHC and is aimed at studying and validating the individual and collective behaviour of the superconducting magnets, before installing in the tunnel. The software integrates monitoring and supervisory control of all the main subsystems of String 2 such as Cryogenics, Vacuum, Power converters, Magnet protection, Energy extraction and interlock systems. It incorporates animated process synoptics, loop and equipment control panels, configurable trend windows for real-time and historical trending of process parameters, user settability for interlock and alarm thresholds, logging of process events, equipment faults and operator activity. The plant equipment is controlled by a variety of field-located Programmable Logic Controllers and VME crates, which communicate process IO to the central IO server using both vendor-specific and custom protocols. The system leverages OPC (OLE for Process Controls) technology for realising a generic IO server. A large number of geographically distributed client stations are arranged to provide the process specific operator interface and these are connected to the Main IO server over CERN wide intranet and internet. (author)

  11. R PROJECT: SU APLICACIÓN COMO SOFTWARE LIBRE PARA ANÁLISIS EN COMPONENTES PRINCIPALES (R PROJECT: ITS USE AS OPEN SOURCE FOR PRINCIPAL COMPONENTS ANALYSIS)

    Directory of Open Access Journals (Sweden)

    Bolaños Guerrero Fabricio

    2011-06-01

    Full Text Available Resumen: This article is the product of a research project carried out in collaboration with professors of the School of Mathematics of the University of Costa Rica (UCR), to present an option of statistical software called R Project®. With this package it is possible to perform Principal Components Analysis (PCA) and to represent the results using the Principal Plane and the Circle of Correlations, as tools for a better interpretation of the data in the table (individuals and variables). The R software is freely distributed, its implementation is simple and it does not require major computing resources. Among its many applications is PCA, a tool used to interpret the information presented in a table of quantitative data; therefore, researchers from different areas have an economical and simple option for performing Data Analysis. Two PCA examples are carried out, showing a possible use of the tool and giving step-by-step instructions on how to perform it. Abstract: This article is a product of a research project made in collaboration with teachers of the Mathematics School of the University of Costa Rica, in order to show an option of statistical software called “R Project®”. With this software, it is possible to do an Analysis in the Principal Components (PCA) and to represent the results using the “principal plane” and the “circle of correlations”, as tools to have a better interpretation of the data in the chart (individuals and variables). The software R is for free distribution, its implementation is simple and it does not require great computer resources. Among its diverse applications there is the PCA, which is a tool used to interpret the information showed in a chart with quantitative data; therefore, the researchers of different areas have a cheap and simple option to do a Data Analysis.
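
    The article demonstrates PCA in R, where the principal plane comes from the first two components and the circle of correlations from the variable-component correlations. An analogous sketch in Python, with a small made-up data table standing in for the article's examples:

    ```python
    # PCA sketch analogous to the R workflow described above; the small data table is invented.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # rows = individuals, columns = quantitative variables
    X = np.array([[5.1, 3.5, 1.4],
                  [4.9, 3.0, 1.5],
                  [6.3, 3.3, 4.7],
                  [5.8, 2.7, 5.1],
                  [7.1, 3.0, 5.9]])

    Z = StandardScaler().fit_transform(X)   # center and scale the variables
    pca = PCA(n_components=2).fit(Z)
    scores = pca.transform(Z)               # coordinates of individuals on the principal plane

    # Correlations between each original variable and each component (circle of correlations)
    corr = np.array([[np.corrcoef(Z[:, j], scores[:, k])[0, 1] for k in range(2)]
                     for j in range(Z.shape[1])])

    print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
    print("principal plane coordinates:\n", np.round(scores, 3))
    print("variable-component correlations:\n", np.round(corr, 3))
    ```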

  12. SOFTWARE IMPLEMENTATION OF FORMING OF COLOR-BASED CARDS FOR ASSESSMENT OF EARLY STAGES INNOVATION PROJECTS

    Directory of Open Access Journals (Sweden)

    Ekaterina I. Bragina

    2015-01-01

    Full Text Available The article deals with a functional program that allows generating, for the shareholder, a visual representation of an innovation project at an early stage of development, in the form of color-based cards.

  13. Transportation research synthesis : state DOT experiences with Primavera P6 project management software.

    Science.gov (United States)

    2010-03-01

    The eight agencies we interviewed all reported general satisfaction with Primavera P6 as a project management tool within their organizations, although they noted that a significant commitment to training is required. Most states have not implemented...

  14. Specialized software utilities for gamma ray spectrometry. Final report of a co-ordinated research project 1996-2000

    International Nuclear Information System (INIS)

    2002-03-01

    A Co-ordinated Research Project (CRP) on Software Utilities for Gamma Ray Spectrometry was initiated by the International Atomic Energy Agency in 1996 for a three year period. In the CRP several basic applications of nuclear data handling were assayed which also dealt with the development of PC computer codes for various spectrometric purposes. The CRP produced several software packages: for the analysis of low level NaI spectra; user controlled analysis of gamma ray spectra from HPGe detectors; a set of routines for the definition of the detector resolution function and for the unfolding of experimental annihilation spectra; a program for the generation of gamma ray libraries for specific applications; a program to calculate true coincidence corrections; a program to calculate full-energy peak efficiency calibration curve for homogenous cylindrical sample geometries including self-attenuation correction; and a program for the library driven analysis of gamma ray spectra and for the quantification of radionuclide content in samples. In addition, the CRP addressed problems of the analysis of naturally occurring radioactive soil material gamma ray spectra, questions of quality assurance and quality control in gamma ray spectrometry, and verification of the expert system SHAMAN for the analysis of air filter spectra obtained within the framework of the Comprehensive Nuclear Test Ban Treaty. This TECDOC contains 10 presentations delivered at the meeting with the description of the software developed. Each of the papers has been indexed separately
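
    One of the utilities listed above computes a full-energy peak efficiency calibration curve. A common way to represent such a curve is a low-order polynomial in log-log space fitted to measured calibration points; a minimal sketch with invented calibration data (not taken from the CRP software) follows:

    ```python
    # Full-energy peak efficiency calibration sketch: fit ln(efficiency) as a polynomial
    # in ln(energy). The calibration points below are invented for illustration.
    import numpy as np

    energy_keV = np.array([ 59.5, 121.8, 344.3, 661.7, 1173.2, 1332.5])
    efficiency = np.array([0.012, 0.021, 0.013, 0.0082, 0.0051, 0.0046])

    coeffs = np.polyfit(np.log(energy_keV), np.log(efficiency), deg=3)

    def eff(E_keV):
        """Interpolated full-energy peak efficiency at E_keV (within the fitted range)."""
        return np.exp(np.polyval(coeffs, np.log(E_keV)))

    print(f"efficiency at 800 keV ≈ {eff(800.0):.4f}")
    ```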

  15. Results of a Survey Software Development Project Management in the U.S. Aerospace Industry. Volume II. Project Management Techniques, Procedures and Tools.

    Science.gov (United States)

    1979-12-18

    ... Project manager or person appointed by him; SE/TD project manager; Senior ADP Manager; Director of computer programming; software program design ...

  16. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  17. Open-Source Development Experiences in Scientific Software: The HANDE Quantum Monte Carlo Project

    Directory of Open Access Journals (Sweden)

    J. S. Spencer

    2015-11-01

    Full Text Available The HANDE quantum Monte Carlo project offers accessible stochastic algorithms for general use for scientists in the field of quantum chemistry. HANDE is an ambitious and general high-performance code developed by a geographically-dispersed team with a variety of backgrounds in computational science. In the course of preparing a public, open-source release, we have taken this opportunity to step back and look at what we have done and what we hope to do in the future. We pay particular attention to development processes, the approach taken to train students joining the project, and how a flat hierarchical structure aids communication.

  18. Digital image management project for dermatological health care environments: a new dedicated software and review of the literature.

    Science.gov (United States)

    Rubegni, Pietro; Nami, Niccolò; Poggiali, Sara; Tataranno, Domenico; Fimiani, M

    2009-05-01

    Because the skin is the only organ completely accessible to visual examination, digital technology has attracted the attention of dermatologists for documenting, monitoring, measuring and classifying morphological manifestations. To describe a digital image management system dedicated to dermatological health care environments and to compare it with other existing software for digital image storage. We designed a reliable hardware structure that could ensure future scaling, because storage needs tend to grow exponentially. For the software, we chose a client-web server application based on a relational database and with a 'minimalist' user interface. We developed software with a ready-made, adaptable index of skin pathologies. It facilitates classification by pathology, patient and visit, with an advanced search option allowing access to all images according to personalized criteria. The software also offers the possibility of comparing two or more digital images (follow-up). The fact that the archives of years of digital photos acquired and saved on PCs can easily be entered in the program distinguishes it from the others on the market. This option is fundamental for accessing all the photos taken in years of practice in the program without entering them one by one. The program is available to any user connected to the local Intranet and the system may directly be available in the future from the Internet. All clinics and surgeries, especially those that rely on digital images, are obliged to keep up with technological advances. It is therefore hoped that our project will become a model for medical structures intending to rationalise digital and other data according to statutory requirements.
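
    The classification by pathology, patient and visit described above maps naturally onto a small relational schema. A minimal sketch follows; the table and column names are invented for illustration, since the paper does not publish its actual schema:

    ```python
    # Illustrative relational schema for classifying dermatological images by pathology,
    # patient and visit; table/column names are invented, not taken from the paper.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE patient   (id INTEGER PRIMARY KEY, name TEXT, birth_date TEXT);
    CREATE TABLE pathology (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE visit     (id INTEGER PRIMARY KEY, patient_id INTEGER REFERENCES patient(id),
                            visit_date TEXT);
    CREATE TABLE image     (id INTEGER PRIMARY KEY, visit_id INTEGER REFERENCES visit(id),
                            pathology_id INTEGER REFERENCES pathology(id), file_path TEXT);
    """)

    # Follow-up query: all images of one patient for a given pathology, ordered by visit date
    rows = conn.execute("""
      SELECT i.file_path, v.visit_date
      FROM image i JOIN visit v ON i.visit_id = v.id
                   JOIN pathology p ON i.pathology_id = p.id
      WHERE v.patient_id = ? AND p.name = ?
      ORDER BY v.visit_date
    """, (1, "melanocytic nevus")).fetchall()
    print(rows)
    ```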

  19. Proposal and application of an approach based on AHP and ISO/IEC 25000 to support the evaluation of the quality of project management software systems

    Directory of Open Access Journals (Sweden)

    Matheus Henrique Bartolomeu Marques Morais

    2017-06-01

    Full Text Available In the current competitive scenario, organizations are seeking to improve factors that influence the administration and the success of their projects. The adoption of project management software suited to organizational requirements contributes to ensuring satisfactory results. In this context, this study proposes a methodology to support the quality evaluation of project management software, which is based on the AHP method (Analytic Hierarchy Process) as well as on the criteria and sub-criteria of ISO/IEC 25000. The AHP method is used to define the relevance of the criteria and sub-criteria chosen. The proposed methodology was applied to an illustrative case to select the most appropriate software. The alternatives Basecamp, MS Project, Service Desk and Primavera were evaluated. Primavera and MS Project achieved the highest global performance in the evaluation process, since they fulfilled 29 of 30 requirements. However, due to some particularities of the evaluated products, Primavera was selected as being the most suitable.
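
    In AHP, each criterion's weight is typically obtained as the principal eigenvector of a pairwise comparison matrix, and the consistency of the judgments is checked via the consistency ratio. A minimal sketch with an invented 3x3 comparison matrix (not one of the matrices from the study):

    ```python
    # AHP priority vector and consistency ratio for one pairwise comparison matrix.
    # The matrix below is an invented example, not taken from the study.
    import numpy as np

    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])   # reciprocal pairwise comparisons (Saaty scale)

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                       # priority vector (criteria weights)

    n = A.shape[0]
    lambda_max = eigvals.real[k]
    ci = (lambda_max - n) / (n - 1)       # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random index
    cr = ci / ri if ri else 0.0           # consistency ratio; < 0.10 is conventionally acceptable

    print("weights:", np.round(w, 3), "CR:", round(cr, 3))
    ```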

  20. Enhancing Project-Based Learning in Software Engineering Lab Teaching through an E-Portfolio Approach

    Science.gov (United States)

    Macias, J. A.

    2012-01-01

    Project-based learning is one of the main successful student-centered pedagogies broadly used in computing science courses. However, this approach can be insufficient when dealing with practical subjects that implicitly require many deliverables and a great deal of feedback and organizational resources. In this paper, a worked e-portfolio is…

  1. Design and Development of a User Interface for the Dynamic Model of Software Project Management.

    Science.gov (United States)

    1988-03-01

    ... directory of the user's choice for future ... the last choice selected. Let us assume for the sake of this tour that the user has selected all eight choices. ESTIMATED ACTUAL PROJECT SIZE DEFINITION ... manipulation of variables in the Dynamica model ... The user interface for the Dynamica model was designed by an iterative process of prototyping.

  2. Development of Occupational Safety and Health Requirement Management System (OSHREMS Software Using Adobe Dreamweaver CS5 for Building Construction Project

    Directory of Open Access Journals (Sweden)

    Abas Nor Haslinda

    2017-01-01

    Full Text Available The construction industry sector is considered risky, with frequent accidents and a high accident rate. According to the Social Security Organization (SOCSO), construction accidents have occurred repeatedly over time. The Construction Industry Development Board (CIDB) has developed the Safety and Health Assessment System in Construction (SHASSIC) for evaluating the performance of a contractor in a construction project by setting out safety and health management and practices; however, the requirement checklist provided is not comprehensive. Therefore, this study aims to develop a software system for facilitating OSH in building construction projects, namely the OSH requirements management system (OSHREMS), using Adobe Dreamweaver CS5 and Sublime Text as the PHP editor. The results from a preliminary study conducted through interviews showed that the respondents were only implementing the basic requirements that comply with legislation, in the absence of an appropriate and specific guideline for ensuring occupational safety and health (OSH) at the workplace. The tool will benefit contractors and other parties in effectively managing the OSH requirements for their projects based on project details.

  3. Spotting software errors sooner

    International Nuclear Information System (INIS)

    Munro, D.

    1989-01-01

    Static analysis is helping to identify software errors at an earlier stage and more cheaply than conventional methods of testing. RTP Software's MALPAS system also has the ability to check that a code conforms to its original specification. (author)

  4. Automated Transportation Management System (ATMS) Software Project Management Plan (SPMP). Revision 2

    International Nuclear Information System (INIS)

    Weidert, R.S.

    1995-01-01

    As a cabinet level federal agency with a diverse range of missions and an infrastructure spanning the United States, the US Department of Energy (DOE) has extensive freight transportation requirements. Performance and management of this freight activity is a critical function. DOE's Transportation Management Division (TMD) has an agency-wide responsibility for overseeing transportation activities. Actual transportation operations are handled by government or contractor staff at the field locations. These staff have evolved a diverse range of techniques and procedures for performing transportation functions. In addition to minimizing the economic impact of transportation on programs, facility transportation staff must be concerned with the increasingly complex task of complying with complex shipment safety regulations. Maintaining the department's safety record for shipping hazardous and radioactive materials is a primary goal. Use of automation to aid transportation functions is not widespread within DOE, though TMD has a number of software systems designed to gather and analyze data pertaining to field transportation activities. These systems are not integrated. Historically, most field facilities have accomplished transportation-related tasks manually or with minimal computer assistance. At best, information and decision support systems available to transportation staffs within the facilities are fragmented. In deciding where to allocate resources for automation, facility managers have not tended to give the needs of transportation departments a high priority. This diversity causes TMD significant difficulty in collecting data for use in managing department-wide transportation activities

  5. Automated Transportation Management System (ATMS) Software Project Management Plan (SPMP). Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Weidert, R.S.

    1995-05-26

    As a cabinet level federal agency with a diverse range of missions and an infrastructure spanning the United States, the US Department of Energy (DOE) has extensive freight transportation requirements. Performance and management of this freight activity is a critical function. DOE`s Transportation Management Division (TMD) has an agency-wide responsibility for overseeing transportation activities. Actual transportation operations are handled by government or contractor staff at the field locations. These staff have evolved a diverse range of techniques and procedures for performing transportation functions. In addition to minimizing the economic impact of transportation on programs, facility transportation staff must be concerned with the increasingly complex task of complying with complex shipment safety regulations. Maintaining the department`s safety record for shipping hazardous and radioactive materials is a primary goal. Use of automation to aid transportation functions is not widespread within DOE, though TMD has a number of software systems designed to gather and analyze data pertaining to field transportation activities. These systems are not integrated. Historically, most field facilities have accomplished transportation-related tasks manually or with minimal computer assistance. At best, information and decision support systems available to transportation staffs within the facilities are fragmented. In deciding where to allocate resources for automation, facility managers have not tended to give the needs of transportation departments a high priority. This diversity causes TMD significant difficulty in collecting data for use in managing department-wide transportation activities.

  6. Hardware and Software Integration in Project Development of Automated Controller System Using LABVIEW FPGA

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abd Manan; Mohd Sabri Minhat; Izhar Abu Hussin

    2014-01-01

    The Field-Programmable Gate Array (FPGA) is a semiconductor device that can be programmed after manufacturing. Instead of being restricted to any predetermined hardware function, an FPGA allows the user to program product features and functions, adapt to new standards, and reconfigure hardware for specific applications even after the product has been installed in the field, hence the name field-programmable. This project developed a control system using LabVIEW FPGA. LabVIEW FPGA is easier to use because it is programmed with drag-and-drop icons, which are then integrated with the hardware inputs and outputs. (author)

  7. QUANTUM ESPRESSO: a modular and open-source software project for quantum simulations of materials.

    Science.gov (United States)

    Giannozzi, Paolo; Baroni, Stefano; Bonini, Nicola; Calandra, Matteo; Car, Roberto; Cavazzoni, Carlo; Ceresoli, Davide; Chiarotti, Guido L; Cococcioni, Matteo; Dabo, Ismaila; Dal Corso, Andrea; de Gironcoli, Stefano; Fabris, Stefano; Fratesi, Guido; Gebauer, Ralph; Gerstmann, Uwe; Gougoussis, Christos; Kokalj, Anton; Lazzeri, Michele; Martin-Samos, Layla; Marzari, Nicola; Mauri, Francesco; Mazzarello, Riccardo; Paolini, Stefano; Pasquarello, Alfredo; Paulatto, Lorenzo; Sbraccia, Carlo; Scandolo, Sandro; Sclauzero, Gabriele; Seitsonen, Ari P; Smogunov, Alexander; Umari, Paolo; Wentzcovitch, Renata M

    2009-09-30

    QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

  8. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book comprises 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  9. HPC Institutional Computing Project: W15_lesreactiveflow KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Carrington, David Bradley [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Waters, Jiajia [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-05

    KIVA-hpFE is high-performance computer software for solving the physics of multi-species and multiphase turbulent reactive flow in complex geometries having immersed moving parts. The code is written in Fortran 90/95 and can be used on any computer platform with any popular compiler. The code is in two versions, a serial version and a parallel version utilizing MPICH2 type Message Passing Interface (MPI or Intel MPI) for solving distributed domains. The parallel version is at least 30x faster than the serial version and much faster than our previous generation of parallel engine modeling software, by many factors. The 5th generation algorithm construction is a Galerkin type Finite Element Method (FEM) solving conservative momentum, species, and energy transport equations along with a two-equation k-ω Reynolds-Averaged Navier-Stokes (RANS) turbulence model and a Vreman type dynamic Large Eddy Simulation (LES) method. The LES method is capable of modeling transitional flow from laminar to fully turbulent; therefore, this LES method does not require special hybrid or blending treatment at walls. The FEM projection method also uses a Petrov-Galerkin (P-G) stabilization along with pressure stabilization. We employ hierarchical basis sets, constructed on the fly with enrichment in areas associated with relatively larger error as determined by error estimation methods. In addition, when not using the hp-adaptive module, the code employs Lagrangian basis or shape functions. The shape functions are constructed for hexahedral, prismatic and tetrahedral elements. The software is designed to solve many types of reactive flow problems, from burners to internal combustion engines and turbines. In addition, the formulation allows for direct integration of solid bodies (conjugate heat transfer), as in heat transfer through housings, parts, cylinders. It can also easily be extended to stress modeling of solids, as used in fluid-structure interaction problems, solidification, porous media

  10. The Perceived Impact of the Agile Development and Project Management Method Scrum on Information Systems and Software Development Productivity

    DEFF Research Database (Denmark)

    Kautz, Karlheinz; Johansen, Thomas Heide; Uldahl, Andreas

    2013-01-01

    This research contributes to the body of knowledge in information systems development (ISD) with an empirical investigation in the form of a case study that demonstrates the positive impact of the agile development and project management method Scrum on information systems and software development productivity, and it provides a useful operationalization of the concept through seven identified indicators for productivity. Despite the fact that the case unit had challenges with the use of Scrum, the indicators identified the areas where the company had managed to exploit the potential of Scrum and its practices with regard to increasing productivity. The research results are discussed both with regard to the existing Scrum literature as well as to complex adaptive systems (CAS) as a foundation for ISD and agile development.

  11. The Perceived Impact of the Agile Development and Project Management Method Scrum on Information Systems and Software Development Productivity

    Directory of Open Access Journals (Sweden)

    Karlheinz Kautz

    2014-11-01

    Full Text Available This research contributes to the body of knowledge in information systems development (ISD) with an empirical investigation in the form of a case study that demonstrates the positive impact of the agile development and project management method Scrum on information systems and software development productivity and it provides a useful operationalization of the concept through seven identified indicators for productivity. Despite the fact that the case unit had challenges with the use of Scrum, the indicators identified the areas where the company had managed to exploit the potential of Scrum and its practices with regard to increasing productivity. The research results are discussed both with regard to the existing Scrum literature as well as to complex adaptive systems (CAS) as a foundation for ISD and agile development.

  12. Bill project authorizing the convention related to the building and exploitation of a European X-ray free electron laser

    International Nuclear Information System (INIS)

    2011-01-01

    This document briefly recalls the objectives of the convention and the free electron laser installations planned in the Hamburg region. It discusses the estimated consequences of implementing the convention: scientific consequences, international context (two existing installations, in the United States and in Japan), economic consequences, financial consequences (the French contribution), social consequences, legal consequences (none for French law), and administrative consequences (association of the CEA and CNRS with the XFEL company). It recalls the negotiation history since 2003 and indicates the signature and ratification status in the countries involved (Germany, France, Denmark, Russia, Sweden, Poland, Switzerland, Hungary, Greece, and Slovakia)

  13. The Alice Project at the IPN, Orsay R and D and software developments 1996-2003

    Energy Technology Data Exchange (ETDEWEB)

    MacCormick, M

    2007-03-15

    This document reviews the theoretical, experimental and technical achievements of the author since the beginning of his scientific career. In 1996 the author became a member of the Alice (A Large heavy Ion Collider Experiment) which was then at the beginning of its research and development phase. The bulk of this report comprises mainly 'snapshots' of the research and development project that was pursued in Orsay for the Alice dimuon arm collaboration. The idea here is to regroup the full set of prototype models, with the technical specifications and their associated test programs. The main results are given for each set of tests, but the details of how data sets were analysed are not included since those details are already available in other, more formal, write-ups. The result is a kind of 'scrapbook' of the research and development phase associated with the Alice dimuon arm station 1 tracker, one of the 5 tracker stations implemented in the dimuon arm spectrometer. This document presented before an academic board will allow its author to manage research works and particularly to tutor thesis students.

  15. The Alice Project at the IPN, Orsay R and D and software developments 1996-2003

    International Nuclear Information System (INIS)

    MacCormick, M.

    2007-03-01

    This document reviews the theoretical, experimental and technical achievements of the author since the beginning of his scientific career. In 1996 the author became a member of the Alice (A Large heavy Ion Collider Experiment) which was then at the beginning of its research and development phase. The bulk of this report comprises mainly 'snapshots' of the research and development project that was pursued in Orsay for the Alice dimuon arm collaboration. The idea here is to regroup the full set of prototype models, with the technical specifications and their associated test programs. The main results are given for each set of tests, but the details of how data sets were analysed are not included since those details are already available in other, more formal, write-ups. The result is a kind of 'scrapbook' of the research and development phase associated with the Alice dimuon arm station 1 tracker, one of the 5 tracker stations implemented in the dimuon arm spectrometer. This document presented before an academic board will allow its author to manage research works and particularly to tutor thesis students

  16. Cost-effectiveness of intensified versus conventional multifactorial intervention in type 2 diabetes: results and projections from the Steno-2 study

    DEFF Research Database (Denmark)

    Gaede, Peter; Valentine, William J; Palmer, Andrew J

    2008-01-01

    ... and account for Danish-specific costs to project life expectancy, quality-adjusted life expectancy (QALE), and lifetime direct medical costs expressed in year 2005 Euros. Clinical and cost outcomes were projected over patient lifetimes and discounted at 3% annually. Sensitivity analyses were performed. RESULTS: ... gained. This is considered a conservative estimate because accounting for prescription of generic drugs and capturing indirect costs would further favor intensified therapy. CONCLUSIONS: From a health care payer perspective in Denmark, intensive therapy was more cost-effective than conventional treatment ...

  17. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  18. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    Science.gov (United States)

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  19. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  20. United Nations Environment Programme Capacity Building Pilot Project - Training on persistent organic pollutant analysis under the Stockholm Convention

    NARCIS (Netherlands)

    de Boer, J.; Leslie, H.A.; van Leeuwen, S.P.J.; Wegener, J.W.M.; van Bavel, B; Lindstrom, G.; Lahoutifard, N.; Fiedler, H.

    2008-01-01

    Within the framework of a United Nations Environment Programme (UNEP) Capacity Building Project for training of laboratory staff in developing countries on persistent organic pollutant (POP) analysis, an interlaboratory study was organised following an initial evaluation of the performance of

  1. A Phenomenological Inquiry into the Perceptions of Software Professionals on the Asperger's Syndrome/High Functioning Autism Spectrum and the Success of Software Development Projects

    Science.gov (United States)

    Kendall, Leslie R.

    2013-01-01

    Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…

  2. Molecular Cloning Designer Simulator (MCDS: All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects

    Directory of Open Access Journals (Sweden)

    Zhenyu Shi

    2016-12-01

    Full Text Available Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com. Keywords: BioCAD, Genetic engineering software, Molecular cloning software, Synthetic biology, Workflow simulation and management

  3. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  4. ARC Software and Models

    Science.gov (United States)

    Most research conducted at the ARC produces software code and methodologies that are transferred to TARDEC and industry partners.

  5. Accelerating Software Development through Agile Practices--A Case Study of a Small-Scale, Time-Intensive Web Development Project at a College-Level IT Competition

    Science.gov (United States)

    Zhang, Xuesong; Dorn, Bradley

    2012-01-01

    Agile development has received increasing interest both in industry and academia due to its benefits in developing software quickly, meeting customer needs, and keeping pace with the rapidly changing requirements. However, agile practices and scrum in particular have been mainly tested in mid- to large-size projects. In this paper, we present…

  6. Total cost of ownership of electric vehicles compared to conventional vehicles: A probabilistic analysis and projection across market segments

    International Nuclear Information System (INIS)

    Wu, Geng; Inderbitzin, Alessandro; Bening, Catharina

    2015-01-01

    While electric vehicles (EV) can perform better than conventional vehicles from an environmental standpoint, consumers perceive them to be more expensive due to their higher capital cost. Recent studies calculated the total cost of ownership (TCO) to evaluate the complete cost for the consumer, focusing on individual vehicle classes, powertrain technologies, or use cases. To provide a comprehensive overview, we built a probabilistic simulation model broad enough to capture most of a national market. Our findings indicate that the comparative cost efficiency of EV increases with the consumer's driving distance and is higher for small than for large vehicles. However, our sensitivity analysis shows that the exact TCO is subject to the development of vehicle and operating costs and thus uncertain. Although the TCO of electric vehicles may become close to or even lower than that of conventional vehicles by 2025, our findings add evidence to past studies showing that the TCO does not reflect how consumers make their purchase decision today. Based on these findings, we discuss policy measures that educate consumers about the TCO of different vehicle types based on their individual preferences. In addition, measures improving the charging infrastructure and further decreasing battery cost are discussed. - Highlights: • Calculates the total cost of ownership across competing vehicle technologies. • Uses Monte Carlo simulation to analyse distributions and probabilities of outcomes. • Contains a comprehensive assessment across the main vehicle classes and use cases. • Indicates that cost efficiency of technology depends on vehicle class and use case. • Derives specific policy measures to facilitate electric vehicle diffusion
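
    The TCO comparison described above amounts to summing the capital cost and the discounted annual operating costs over the holding period, then sampling the uncertain inputs. A heavily simplified sketch with invented parameter values follows; the study's actual cost model, market segments and input distributions are far more detailed:

    ```python
    # Heavily simplified Monte Carlo TCO sketch for an EV vs. a conventional vehicle.
    # All prices, consumption figures and distributions are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(42)
    n, years, rate = 10_000, 8, 0.05
    km_per_year = rng.normal(15_000, 4_000, n).clip(5_000, 40_000)   # sampled annual mileage
    discount = sum(1 / (1 + rate) ** t for t in range(1, years + 1))  # annuity discount factor

    def tco(capital, energy_cost_per_km, maintenance_per_year):
        annual = energy_cost_per_km * km_per_year + maintenance_per_year
        return capital + annual * discount

    tco_ev   = tco(capital=35_000, energy_cost_per_km=0.04, maintenance_per_year=300)
    tco_conv = tco(capital=25_000, energy_cost_per_km=0.10, maintenance_per_year=500)

    print("P(EV cheaper over holding period):", (tco_ev < tco_conv).mean())
    ```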

  7. Independent Verification and Validation Of SAPHIRE 8 Software Requirements Project Number: N6423 U.S. Nuclear Regulatory Commission

    Energy Technology Data Exchange (ETDEWEB)

    Kent Norris

    2009-09-01

    The purpose of the Independent Verification and Validation (IV&V) role in the evaluation of the SAPHIRE requirements definition is to assess the activities that result in the specification, documentation, and review of the requirements that the software product must satisfy, including functionality, performance, design constraints, attributes and external interfaces. The IV&V team began this endeavor after the software engineering and development of SAPHIRE was already underway. IV&V reviewed the requirements specified in the NRC Form 189s to verify that these requirements were included in SAPHIRE's Software Verification and Validation Plan (SVVP).

  8. Knowledge Base for an Intelligent System in order to Identify Security Requirements for Government Agencies Software Projects

    Directory of Open Access Journals (Sweden)

    Adán Beltrán G.

    2016-01-01

    Full Text Available It has been shown that one of the most common causes of failure in software security is the lack of identification and specification of information security requirements, an activity given insufficient importance in software development and software acquisition. We propose the knowledge base of CIBERREQ. CIBERREQ is an intelligent knowledge-based system used for the identification and specification of security requirements in the software development cycle or in software acquisition. CIBERREQ receives functional software requirements written in natural language and produces non-functional security requirements through a semi-automatic process of risk management. The knowledge base is formed by an ontology developed collaboratively by experts in information security. In this process, six types of assets have been identified: electronic data, physical data, hardware, software, person and service; as well as six types of risk: competitive disadvantage, loss of credibility, economic risks, strategic risks, operational risks and legal sanctions. In addition, 95 vulnerabilities, 24 threats, 230 controls, and 515 associations between concepts are defined. Additionally, automatic expansion with Wikipedia was used for the asset types Software and Hardware, obtaining 7125 and 5894 software and hardware subtypes respectively, thereby achieving a 10% improvement in the identification of candidate information assets, one of the most important phases of the proposed system.

  9. "Augmented reality" in conventional simulation by projection of 3-D structures into 2-D images. A comparison with virtual methods

    International Nuclear Information System (INIS)

    Deutschmann, H.; Nairz, O.; Zehentmayr, F.; Fastner, G.; Sedlmayer, F.; Steininger, P.; Kopp, P.; Merz, F.; Wurstbauer, K.; Kranzinger, M.; Kametriser, G.; Kopp, M.

    2008-01-01

    Background and purpose: in this study, a new method is introduced which allows three-dimensional structures that have been delineated on transverse slices to be overlaid onto the fluoroscopy of conventional simulators in real time. Patients and methods: setup deviations between volumetric imaging and simulation were visualized, measured and corrected for 701 patient isocenters. Results: comparing the accuracy to mere virtual simulation lacking additional X-ray imaging, a clear benefit of the new method could be shown. On average, virtual prostate simulations had to be corrected by 0.48 cm (standard deviation [SD] 0.38), and those of the breast by 0.67 cm (SD 0.66). Conclusion: the presented method provides an easy way to determine entity-specific safety margins related to patient setup errors upon registration of bony anatomy (prostate 0.9 cm for 90% of cases, breast 1.3 cm). The important role of planar X-ray imaging was clearly demonstrated. The innovation can also be applied to adaptive image-guided radiotherapy (IGRT) protocols. (orig.)
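
    The core geometric operation described above, projecting delineated 3-D structure points onto a 2-D simulator image, can be sketched with a simple divergent-beam model; the distances and contour points below are arbitrary illustrative values, not the clinical geometry used in the study.

      # Illustrative divergent-beam projection of 3-D contour points onto a 2-D
      # image plane. Geometry (source-axis and source-detector distances) and
      # points are arbitrary example values, not the simulator geometry above.
      import numpy as np

      SAD = 1000.0  # source-to-isocenter distance [mm], assumed
      SID = 1400.0  # source-to-image distance [mm], assumed

      def project(points_3d):
          """Project points given in isocenter coordinates (x, y, z) onto the
          image plane, with the beam travelling along +z from the source."""
          pts = np.asarray(points_3d, dtype=float)
          depth = SAD + pts[:, 2]               # distance of each point from the source
          scale = SID / depth                   # perspective magnification
          return pts[:, :2] * scale[:, None]    # 2-D coordinates on the imager [mm]

      contour = [(10.0, 5.0, -20.0), (12.0, 6.0, 0.0), (14.0, 7.0, 25.0)]
      print(project(contour))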

  10. Qualitative evaluation of environmental radiological impact in a phosphate associated uranium conventional mine: Santa Quiteria Project, CE, Brazil

    International Nuclear Information System (INIS)

    Reis, Rocio G. dos; Santo, Aline Sa E.

    2013-01-01

    The aim of this study is to identify and qualitatively evaluate the main potential source terms of the Santa Quiteria mine and installation, CE, Brazil, and their possible impacts on the environment. The key source terms in the production of phosphoric acid are usually the mine pits, the tailings dams and the phosphogypsum stack. This work therefore intends to inform the academic community and the general population about this issue, and to act proactively in warning about possible environmental impacts, so that actions to compensate, minimize or avoid these radiological impacts on the environment can be included in the planning of the Santa Quiteria industrial mineral project (author)

  11. Artificial intelligence and expert systems in-flight software testing

    Science.gov (United States)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  12. Softwareland Chronicles: A Software Development Meta-Process Proposal

    Directory of Open Access Journals (Sweden)

    Bolanos Sandro

    2016-05-01

    Full Text Available This paper presents the software development meta-process (SD-MP) as a proposal to set up software projects. Within this proposal we offer conceptual elements that help solve the war of methodologies and processes in favor of an integrating viewpoint, where the main flaws associated with conventional and agile approaches are removed. Our newly developed software platform to support the meta-process is also presented together with three case studies involving projects currently in progress, where the framework proposed in SD-MP has been applied.

  13. Comparison of pure and hybrid iterative reconstruction techniques with conventional filtered back projection: Image quality assessment in the cervicothoracic region

    International Nuclear Information System (INIS)

    Katsura, Masaki; Sato, Jiro; Akahane, Masaaki; Matsuda, Izuru; Ishida, Masanori; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni

    2013-01-01

    Objectives: To evaluate the impact on image quality of three different image reconstruction techniques in the cervicothoracic region: model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP). Methods: Forty-four patients underwent unenhanced standard-of-care clinical computed tomography (CT) examinations which included the cervicothoracic region with a 64-row multidetector CT scanner. Images were reconstructed with FBP, 50% ASIR-FBP blending (ASIR50), and MBIR. Two radiologists assessed the cervicothoracic region in a blinded manner for streak artifacts, pixilated blotchy appearances, critical reproduction of visually sharp anatomical structures (thyroid gland, common carotid artery, and esophagus), and overall diagnostic acceptability. Objective image noise was measured in the internal jugular vein. Data were analyzed using the sign test and pair-wise Student's t-test. Results: MBIR images had significantly lower quantitative image noise (8.88 ± 1.32) compared to ASIR images (18.63 ± 4.19, P < 0.01) and FBP images (26.52 ± 5.8, P < 0.01). Significant improvements in streak artifacts of the cervicothoracic region were observed with the use of MBIR (P < 0.001 each for MBIR vs. the other two image data sets for both readers), while no significant difference was observed between ASIR and FBP (P > 0.9 for ASIR vs. FBP for both readers). MBIR images were all diagnostically acceptable. Unique features of MBIR images included pixilated blotchy appearances, which did not adversely affect diagnostic acceptability. Conclusions: MBIR significantly improves image noise and streak artifacts of the cervicothoracic region over ASIR and FBP. MBIR is expected to enhance the value of CT examinations for areas where image noise and streak artifacts are problematic.

  14. A retrospective critic Re-Debate on Stakeholders’ resistance checklist in software project management within multi-cultural, multi-ethnical and cosmopolitan society context: The Malaysian experience

    Directory of Open Access Journals (Sweden)

    Hamed Taherdoost

    2016-12-01

    Full Text Available Risks stemming from software projects have been extensively studied. However, software project risk management has rarely examined organizational risks within multi-cultural and multi-ethnical atmospheres. The fact of the matter is that problems occur when the stakeholders' cultural and ethnical aspects are not addressed, especially in a multi-cultural, multi-ethnical, and cosmopolitan society such as Malaysia. To avoid analyzing something that has already been studied in detail, this study was conducted as an in-depth literature review based on keyword searches in subject-specific databases. Journal articles published in reputed journals were reviewed. By employing Rumelt's resistance-to-change checklist and the culture gap tool, this paper develops an organizational risk framework considering cross-cultural and cross-ethnical critical factors in order to show how risks can be better comprehended and managed. The significance of bio-cultural dimensions was scrutinized as a vital criterion which should be considered in the international project sphere, so that not only are the odds of project success increased but risks can also be mitigated significantly. A review of the risk management process, Rumelt's checklist, and cultural issues in the international project environment allows a better understanding of the importance of cultural dimensions in project spheres.

  15. Comparison of pure and hybrid iterative reconstruction techniques with conventional filtered back projection: Image quality assessment in the cervicothoracic region

    Energy Technology Data Exchange (ETDEWEB)

    Katsura, Masaki, E-mail: mkatsura-tky@umin.ac.jp [Department of Radiology, Graduate School of Medicine, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655 (Japan); Sato, Jiro; Akahane, Masaaki; Matsuda, Izuru; Ishida, Masanori; Yasaka, Koichiro; Kunimatsu, Akira; Ohtomo, Kuni [Department of Radiology, Graduate School of Medicine, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8655 (Japan)

    2013-02-15

    Objectives: To evaluate the impact on image quality of three different image reconstruction techniques in the cervicothoracic region: model-based iterative reconstruction (MBIR), adaptive statistical iterative reconstruction (ASIR), and filtered back projection (FBP). Methods: Forty-four patients underwent unenhanced standard-of-care clinical computed tomography (CT) examinations which included the cervicothoracic region with a 64-row multidetector CT scanner. Images were reconstructed with FBP, 50% ASIR-FBP blending (ASIR50), and MBIR. Two radiologists assessed the cervicothoracic region in a blinded manner for streak artifacts, pixilated blotchy appearances, critical reproduction of visually sharp anatomical structures (thyroid gland, common carotid artery, and esophagus), and overall diagnostic acceptability. Objective image noise was measured in the internal jugular vein. Data were analyzed using the sign test and pair-wise Student's t-test. Results: MBIR images had significantly lower quantitative image noise (8.88 ± 1.32) compared to ASIR images (18.63 ± 4.19, P < 0.01) and FBP images (26.52 ± 5.8, P < 0.01). Significant improvements in streak artifacts of the cervicothoracic region were observed with the use of MBIR (P < 0.001 each for MBIR vs. the other two image data sets for both readers), while no significant difference was observed between ASIR and FBP (P > 0.9 for ASIR vs. FBP for both readers). MBIR images were all diagnostically acceptable. Unique features of MBIR images included pixilated blotchy appearances, which did not adversely affect diagnostic acceptability. Conclusions: MBIR significantly improves image noise and streak artifacts of the cervicothoracic region over ASIR and FBP. MBIR is expected to enhance the value of CT examinations for areas where image noise and streak artifacts are problematic.
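
    A minimal sketch of the paired statistical comparison of objective image noise described in the methods above; the per-patient values are invented examples, not the study data.

      # Illustrative paired comparison of objective image noise between two
      # reconstruction methods, as in the methods above; values are made up.
      import numpy as np
      from scipy import stats

      noise_mbir = np.array([8.1, 9.0, 8.7, 9.5, 8.4])     # example per-patient noise
      noise_asir = np.array([17.9, 19.2, 18.0, 20.1, 18.5])

      t_stat, p_value = stats.ttest_rel(noise_mbir, noise_asir)  # pair-wise Student's t-test
      print(f"t = {t_stat:.2f}, p = {p_value:.4f}")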

  16. Caltrans WeatherShare Phase II System: An Application of Systems and Software Engineering Process to Project Development

    Science.gov (United States)

    2009-08-25

    In cooperation with the California Department of Transportation, Montana State University's Western Transportation Institute has developed the WeatherShare Phase II system by applying Systems Engineering and Software Engineering processes. The system...

  17. The Generalizability of Private Sector Research on Software Project Management in Two USAF Organizations: An Exploratory Study

    National Research Council Canada - National Science Library

    Garman, Michael

    2003-01-01

    .... But most of this research in peer reviewed journals has focused on the private sector. Researchers have also identified software acquisitions as one of the major differences between the private sector and public sector MIS...

  18. How does a Collaborative Community Affect Diverse Students' Engagement with an Open Source Software Project: A Pedagogical Paradigm

    Science.gov (United States)

    Morgan, Becka S.

    Open Source Software (OSS) communities are homogenous and their lack of diversity is of concern to many within this field. This problem is becoming more pronounced as it is the practice of many technology companies to use OSS participation as a factor in the hiring process, disadvantaging those who are not a part of this community. We should expect that any field would have a population that reflects the general population given no constraints. The constraints within OSS are documented as being a hostile environment for women and minorities to participate in. Additionally, OSS communities rely predominantly on volunteers to create and maintain source code, documentation, and user interfaces, as well as the organizational structure of the project. The volunteer nature of OSS projects creates a need for an ongoing pool of participants. This research addresses the lack of diversity along with the continual need for new members by developing a pedagogical paradigm that uses a collaborative environment to promote participation in an OSS project by diverse students. This collaborative environment used a Communities of Practice (CoP) framework to design the course, the indicators of which were used to operationalize the collaboration. The outcomes of this course not only benefit the students by providing them with skills necessary to continue participation and experience for getting a job, but also provide a diverse pool of volunteers for the OSS community. This diverse pool shows promise of creating a more diverse culture within OSS. In the development of this pedagogical paradigm, this research looked primarily at students' perceptions of the importance of their group members and of the mentors provided to guide their participation in and contribution to an OSS community. These elements were used to facilitate the formation of a CoP. Self-efficacy was also used as a measure; an increase in self-efficacy is associated with the successful formation of a CoP. Finally the intent to

  19. Molecular Cloning Designer Simulator (MCDS): All-in-one molecular cloning and genetic engineering design, simulation and management software for complex synthetic biology and metabolic engineering projects.

    Science.gov (United States)

    Shi, Zhenyu; Vickers, Claudia E

    2016-12-01

    Molecular Cloning Designer Simulator (MCDS) is a powerful new all-in-one cloning and genetic engineering design, simulation and management software platform developed for complex synthetic biology and metabolic engineering projects. In addition to standard functions, it has a number of features that are either unique, or are not found in combination in any one software package: (1) it has a novel interactive flow-chart user interface for complex multi-step processes, allowing an integrated overview of the whole project; (2) it can perform a user-defined workflow of cloning steps in a single execution of the software; (3) it can handle multiple types of genetic recombineering, a technique that is rapidly replacing classical cloning for many applications; (4) it includes experimental information to conveniently guide wet lab work; and (5) it can store results and comments to allow the tracking and management of the whole project in one platform. MCDS is freely available from https://mcds.codeplex.com.

  20. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  1. The Impact of Project Role on Perceptions of Risk and Performance in Information Technology Software Development: A Comparative Analysis

    Science.gov (United States)

    Okongo, James

    2014-01-01

    The failure rate of information technology (IT) development projects is a significant concern for today's organizations. Perceptions of IT project risk and project performance have been identified as important factors by scholars studying the topic, and Wallace, Keil, and Rai (2004a) developed a survey instrument to measure how dimensions of…

  2. Evolving a lingua franca and associated software infrastructure for computational systems biology: the Systems Biology Markup Language (SBML) project.

    Science.gov (United States)

    Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H

    2004-06-01

    Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.

  3. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets......, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation......, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  4. First research coordination meeting of the coordinated research project on validation of tracers and software for interwell investigations. Meeting report

    International Nuclear Information System (INIS)

    2004-01-01

    The introduction and promotion of tracer techniques for the oil producing industry have been going on through several national and regional technical cooperation projects. Presently, R and D is ongoing in interwell tracer technology, including the development of new tracers, improvement of analytical and interpretation techniques, and other innovative techniques for multiphase flow pattern characterization. The CRP coordinates knowledge generated in this field to guarantee the continuity of the technology and to transfer the best part of it to developing countries. For effective transfer of the technology to developing countries, the target techniques will be consolidated, developed further and validated through the CRP activities. Technical documents will also be prepared to facilitate upgrading the capability of tracer groups in developing countries. In line with the CRP objectives, the first RCM summarized the status of tracer technology as applied to interwell tests and discussed the ways to meet the proposed goals. The proposed investigations were focused on three main fields: 1) software and model development and interpretation, 2) development of new tracers, methods and technologies, and 3) field applications. All participants were encouraged to participate in one or more of these topics of discussion and establish networking activities

  5. How Does a Collaborative Community Affect Diverse Students' Engagement with an Open Source Software Project: A Pedagogical Paradigm

    Science.gov (United States)

    Morgan, Becka S.

    2012-01-01

    Open Source Software (OSS) communities are homogenous and their lack of diversity is of concern to many within this field. This problem is becoming more pronounced as it is the practice of many technology companies to use OSS participation as a factor in the hiring process, disadvantaging those who are not a part of this community. We should…

  6. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Project Management, Requirements, and Design Document

    Energy Technology Data Exchange (ETDEWEB)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product that will be used to convert the area under an absorbance curve generated by a Fourier transform infrared spectrometer (FTIR) to a relative area. SigmaPlot will be used in conjunction with procedure ZA-565-301, ''Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection.''
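
    A minimal sketch of the kind of calculation SigmaPlot is used for here, integrating an absorbance band and expressing the result as a relative area; the wavenumber grid, synthetic spectrum and reference value are placeholders, not values from the procedure.

      # Illustrative computation of the area under an FTIR absorbance band and its
      # relative area; the spectrum and reference area are placeholder values only.
      import numpy as np

      wavenumber = np.linspace(3400.0, 3700.0, 301)                     # cm^-1, assumed band region
      absorbance = 0.8 * np.exp(-((wavenumber - 3550.0) / 40.0) ** 2)   # synthetic band

      band_area = np.trapz(absorbance, wavenumber)   # area under the absorbance curve
      reference_area = 50.0                          # assumed calibration reference
      relative_area = band_area / reference_area
      print(f"band area = {band_area:.2f}, relative area = {relative_area:.3f}")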

  7. Open Source Software Development

    Science.gov (United States)

    2011-01-01

    appropriate to refer to FOSS or FLOSS (L for Libre, where the alternative term “libre software” has popularity in some parts of the world) in order... Applying Social Network Analysis to Community-Driven Libre Software Projects, Intern. J. Info. Tech. and Web Engineering, 2006, 1(3), 27-28. ... Open Source Software Development, Walt Scacchi, Institute for Software Research, University of California, Irvine, Irvine, CA 92697-3455, USA. Abstract

  8. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  9. Dumping convention

    International Nuclear Information System (INIS)

    Roche, P.

    1992-01-01

    Sea dumping of radioactive waste has, since 1983, been precluded under a moratorium established by the London Dumping Convention. Pressure from the nuclear industry to allow ocean dumping of nuclear waste is reported in this article. (author)

  10. 2015 Plan. Project 7: the environmental issue and the electrical sector. The options of electrical power supply and their socio-environmental implications: sources/conventional and non-conventional technologies of generation

    International Nuclear Information System (INIS)

    1992-10-01

    The socio-environmental impacts caused by the use of conventional sources (hydraulic, mineral coal, nuclear, petroleum by-products and natural gas) and non-conventional sources (biomass, solar, wind, ocean and organic wastes) in electric power generation are presented. The main topics on the environmental agenda in recent years are described, including some considerations about environmental legislation and atmospheric alterations. The reserves for each source are also cited. (C.G.C.)

  11. Results of a Survey Software Development Project Management in the U.S. Aerospace Industry. Volume III. Major Problems.

    Science.gov (United States)

    1979-12-18

    simplifies the staffing of a project and assures the experience is ’ recyclable ’." "Staff or members are considered ’universal experts’. During estimation...impact of changes upon the original system." "Project reviews are typically exercises in trivia ." [Keider, 1974] ____ ,,, , _ 55 "First, [lesson

  12. The Evolution of a Science Project: A Preliminary System Dynamics Model of a Recurring Software-Reliant Acquisition Behavior

    Science.gov (United States)

    2012-07-01

    Complexity?” Proceedings of the Third Annual Conference on Software-Intensive Systems Acquisition, January 2004. [Bandura 1986] Bandura, Albert... Andrew P. Moore, Christopher Alberts, July 2012, Technical Report CMU/SEI-2012-TR-001, ESC-TR-2012-001, Acquisition Support Program, http... an operational environment [Bandura 1986]. Creating interactive experiential learning tools such as “flight simulators” for use in the classroom is

  13. Business Management Software Axolon ERP

    OpenAIRE

    Axolon ERP Solution

    2018-01-01

    Axolon ERP (www.axolonerp.com) by Micromind is a comprehensive business management software solution for businesses. We deliver business management software in Dubai, the UAE and the GCC countries; products also include ERP software, HR & payroll, inventory software, project management, software development, solutions and services in Dubai, UAE for small and medium-sized enterprises (SMEs) in the Middle East, with an easy-to-use, secure and efficient business management...

  14. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Wriggers, Willy

    2012-01-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  15. A Demonstration of the System Assessment Capability (SAC) Rev. 1 Software for the Hanford Remediation Assessment Project

    International Nuclear Information System (INIS)

    Eslinger, Paul W.; Kincaid, Charles T.; Nichols, William E.; Wurstner, Signe K.

    2006-01-01

    The System Assessment Capability (SAC) is a suite of interrelated computer codes that provides the capability to conduct large-scale environmental assessments on the Hanford Site. Developed by Pacific Northwest National Laboratory for the Department of Energy, SAC models the fate and transport of radioactive and chemical contaminants, starting with the inventory of those contaminants in waste sites, simulating transport through the environment, and continuing on through impacts to the environment and humans. Separate modules in the SAC address inventory, release from waste forms, water flow and mass transport in the vadose zone, water flow and mass transport in the groundwater, water flow and mass transport in the Columbia River, air transport, and human and ecological impacts. The SAC supports deterministic analyses as well as stochastic analyses using a Monte Carlo approach, enabling SAC users to examine the effect of uncertainties in a number of key parameters. The initial assessment performed with the SAC software identified a number of areas where both the software and the analysis approach could be improved. Since that time the following six major software upgrades have been made: (1) An air pathway model was added to support all-pathway analyses. (2) Models for releases from glass waste forms, buried graphite reactor cores, and buried naval reactor compartments were added. (3) An air-water dual-phase model was added to more accurately track the movement of volatile contaminants in the vadose zone. (4) The ability to run analyses was extended from 1,000 years to 10,000 years or longer after site closure. (5) The vadose zone flow and transport model was upgraded to support two-dimensional or three-dimensional analyses. (6) The ecological model and human risk models were upgraded so the concentrations of contaminants in food products consumed by humans are produced by the ecological model. This report documents the functions in the SAC software and provides a

  16. A Demonstration of the System Assessment Capability (SAC) Rev. 1 Software for the Hanford Remediation Assessment Project

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Kincaid, Charles T.; Nichols, William E.; Wurstner, Signe K.

    2006-11-06

    The System Assessment Capability (SAC) is a suite of interrelated computer codes that provides the capability to conduct large-scale environmental assessments on the Hanford Site. Developed by Pacific Northwest National Laboratory for the Department of Energy, SAC models the fate and transport of radioactive and chemical contaminants, starting with the inventory of those contaminants in waste sites, simulating transport through the environment, and continuing on through impacts to the environment and humans. Separate modules in the SAC address inventory, release from waste forms, water flow and mass transport in the vadose zone, water flow and mass transport in the groundwater, water flow and mass transport in the Columbia River, air transport, and human and ecological impacts. The SAC supports deterministic analyses as well as stochastic analyses using a Monte Carlo approach, enabling SAC users to examine the effect of uncertainties in a number of key parameters. The initial assessment performed with the SAC software identified a number of areas where both the software and the analysis approach could be improved. Since that time the following six major software upgrades have been made: (1) An air pathway model was added to support all-pathway analyses. (2) Models for releases from glass waste forms, buried graphite reactor cores, and buried naval reactor compartments were added. (3) An air-water dual-phase model was added to more accurately track the movement of volatile contaminants in the vadose zone. (4) The ability to run analyses was extended from 1,000 years to 10,000 years or longer after site closure. (5) The vadose zone flow and transport model was upgraded to support two-dimensional or three-dimensional analyses. (6) The ecological model and human risk models were upgraded so the concentrations of contaminants in food products consumed by humans are produced by the ecological model. This report documents the functions in the SAC software and provides a

  17. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  18. The whiteStar development project: Westinghouse's next generation core design simulator and core monitoring software to power the nuclear renaissance

    International Nuclear Information System (INIS)

    Boyd, W. A.; Mayhue, L. T.; Penkrot, V. S.; Zhang, B.

    2009-01-01

    The WhiteStar project has undertaken the development of the next generation core analysis and monitoring system for Westinghouse Electric Company. This on-going project focuses on the development of the ANC core simulator, BEACON core monitoring system and NEXUS nuclear data generation system. This system contains many functional upgrades to the ANC core simulator and BEACON core monitoring products as well as the release of the NEXUS family of codes. The NEXUS family of codes is an automated once-through cross section generation system designed for use in both PWR and BWR applications. ANC is a multi-dimensional nodal code for all nuclear core design calculations at a given condition. ANC predicts core reactivity, assembly power, rod power, detector thimble flux, and other relevant core characteristics. BEACON is an advanced core monitoring and support system which uses existing instrumentation data in conjunction with an analytical methodology for on-line generation and evaluation of 3D core power distributions. This new system is needed to design and monitor the Westinghouse AP1000 PWR. This paper provides an overview of the software system and the software development methodologies used, as well as some initial results. (authors)

  19. Multi-dimensional project evaluation: Combining cost-benefit analysis and multi-criteria analysis with the COSIMA software system

    DEFF Research Database (Denmark)

    and not the least construction and maintenance costs. The MCA is used to assess noise, land use planning, business potential and tourism impacts for the three alternatives. More technically, the software system offers a set of different features to undertake the MCA. Thus the users have two different methods...... for society is ranked uppermost. To compare the different impacts, it is necessary to have a common monetary unit. Theoretically, all benefits and all costs should be accounted for in socio-economic cost-benefit analysis. However, in practice this is far from the general case due to difficulties...... in valuating all the criteria in monetary terms. Thus CBA does not meet the need for a comprehensive evaluation, for which reason MCA is introduced to overcome this problem. Not only does MCA provide an opportunity to include non-market impacts in the analysis, but MCA also provides a framework for breaking
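
    A minimal sketch of one way to combine a monetised CBA result with weighted MCA scores into a single ranking indicator for competing alternatives; the criteria, weights and numbers are invented for illustration and do not reproduce the COSIMA methodology in detail.

      # Illustrative combination of cost-benefit analysis (CBA) and multi-criteria
      # analysis (MCA) results for three alternatives. Criteria, weights and values
      # are invented; this does not reproduce the COSIMA method in detail.
      alternatives = {
          # name: (CBA benefit-cost ratio, MCA scores on a 0-100 scale)
          "A1": (1.10, {"noise": 40, "land use": 55, "business": 60, "tourism": 50}),
          "A2": (0.95, {"noise": 75, "land use": 70, "business": 65, "tourism": 80}),
          "A3": (1.25, {"noise": 30, "land use": 45, "business": 70, "tourism": 40}),
      }
      mca_weights = {"noise": 0.3, "land use": 0.3, "business": 0.2, "tourism": 0.2}
      alpha = 0.6  # assumed trade-off between the CBA part and the MCA part

      def combined_score(bcr, scores):
          mca = sum(mca_weights[c] * s for c, s in scores.items()) / 100.0  # 0..1
          cba = min(bcr / 2.0, 1.0)                                         # crude 0..1 scaling
          return alpha * cba + (1 - alpha) * mca

      ranking = sorted(alternatives, key=lambda a: combined_score(*alternatives[a]), reverse=True)
      print(ranking)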

  20. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in large current software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case-study, a project with many similarities to those currently under way in HEP.

  1. Comparative study on software development methodologies

    OpenAIRE

    Mihai Liviu DESPA

    2014-01-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation-oriented IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also, the software development project manager han...

  2. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially those that use a file-based system for storing information rather than having it stored in a more efficient and safer environment such as databases or spreadsheet programs like Excel. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  3. FASTBUS software status

    International Nuclear Information System (INIS)

    Gustavson, D.B.

    1980-10-01

    Computer software will be needed in addition to the mechanical, electrical, protocol and timing specifications of the FASTBUS, in order to facilitate the use of this flexible new multiprocessor and multisegment data acquisition and processing system. Software considerations have been important in the FASTBUS design, but standard subroutines and recommended algorithms will be needed as the FASTBUS comes into use. This paper summarizes current FASTBUS software projects, goals and status

  4. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  5. Report on the behalf of the Commission for sustainable development and land planning on the bill project, after initiation of the accelerated procedure, putting an end to the search for as well as to the exploitation of conventional and non-conventional hydrocarbons, and bearing various applications related to energy and to the environment (nr 155) - Nr 174

    International Nuclear Information System (INIS)

    Colas-Roy, Jean-Charles

    2017-01-01

    In view of objectives to limit the temperature increase, and in order to limit the use of fossil energies such as conventional and non-conventional hydrocarbons, the exploitation of which would result in an increase of CO2 emissions, France has to modify its rules related to hydrocarbon extraction. This bill project therefore aims at programming the end of the exploration and exploitation of hydrocarbons. As mentioned, some articles of this bill project concern energy regulation or transpose a European directive. This document contains the transcription of a hearing of the Minister (Nicolas Hulot) held by the Commission, and of the discussion of the bill's content, and more particularly of the amendments. Appendices provide tables listing currently valid exploration permits, hydrocarbon mining concessions, exploration permit applications, and hydrocarbon mining applications. The text of the bill project is then presented, followed by a comparison of the initial and final versions of the text

  6. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  7. Software engineering ethics

    Science.gov (United States)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  8. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  9. Core component integration tests for the back-end software sub-system in the ATLAS data acquisition and event filter prototype -1 project

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.; Niculescu, M.; Radu, A.

    2000-01-01

    The ATLAS data acquisition (DAQ) and Event Filter (EF) prototype -1 project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system on the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the DAQ. The back-end sub-system includes core components and detector integration components. The core components provide the basic functionality and had priority in terms of time-scale for development in order to have a baseline sub-system that could be used for integration with the data-flow sub-system and event filter. The following components are considered to be the core of the back-end sub-system: - Configuration databases, which describe a large number of parameters of the DAQ system architecture, hardware and software components, running modes and status; - Message reporting system (MRS), which allows all software components to report messages to other components in the distributed environment; - Information service (IS), which allows information exchange between software components; - Process manager (PMG), which performs basic job control of software components (start, stop, monitoring of status); - Run control (RC), which controls the data-taking activities by coordinating the operations of the DAQ sub-systems, back-end software and external systems. Performance and scalability tests have been made for individual components. The back-end sub-system integration tests bring together all the core components and several trigger/DAQ/detector integration components to simulate the control and configuration of data-taking sessions. A test plan was provided for the back-end integration tests. The tests have been done using a shell script that goes through different phases as follows: - starting the back-end server processes to initialize communication services and PMG; - launching configuration-specific processes via DAQ supervisor as
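
    A minimal sketch of a test driver stepping through phases like those listed above (start back-end servers, launch configuration-specific processes, exercise run control); the executable names and arguments are placeholders, not the actual ATLAS back-end commands.

      # Illustrative integration-test driver stepping through the phases described
      # above. Executable names and arguments are placeholders, not the real ATLAS
      # back-end software commands.
      import subprocess

      PHASES = [
          ("start back-end servers", [["ipc_server"], ["is_server"], ["mrs_server"]]),
          ("launch configuration-specific processes", [["pmg_agent", "--partition", "test"]]),
          ("run control transitions", [["rc_controller", "boot"], ["rc_controller", "run"],
                                       ["rc_controller", "shutdown"]]),
      ]

      def run_phase(name, commands):
          print(f"== phase: {name}")
          for cmd in commands:
              # In a real test the return code and reported MRS messages would be checked.
              result = subprocess.run(cmd, capture_output=True, text=True)
              if result.returncode != 0:
                  raise RuntimeError(f"{' '.join(cmd)} failed: {result.stderr.strip()}")

      if __name__ == "__main__":
          for name, commands in PHASES:
              run_phase(name, commands)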

  10. Status Report on the Development of Micro-Scheduling Software for the Advanced Outage Control Center Project

    Energy Technology Data Exchange (ETDEWEB)

    Germain, Shawn St. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Thomas, Kenneth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Farris, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The long-term viability of existing nuclear power plants (NPPs) in the United States (U.S.) is dependent upon a number of factors, including maintaining high capacity factors, maintaining nuclear safety, and reducing operating costs, particularly those associated with refueling outages. Refueling outages typically take 20-30 days, and for existing light water NPPs in the U.S., the reactor cannot be in operation during the outage. Furthermore, given that many NPPs generate between $1-1.5 million/day in revenue when in operation, there is considerable interest in shortening the length of refueling outages. Yet, refueling outages are highly complex operations, involving multiple concurrent and dependent activities that are difficult to coordinate. Finding ways to improve refueling outage performance while maintaining nuclear safety has proven to be difficult. The Advanced Outage Control Center project is a research and development (R&D) demonstration activity under the Light Water Reactor Sustainability (LWRS) Program. LWRS is a R&D program which works with industry R&D programs to establish technical foundations for the licensing and managing of long-term, safe, and economical operation of current NPPs. The Advanced Outage Control Center project has the goal of improving the management of commercial NPP refueling outages. To accomplish this goal, this INL R&D project is developing an advanced outage control center (OCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report describes specific recent efforts to develop a capability called outage Micro-Scheduling. Micro-Scheduling is the ability to allocate and schedule outage support task resources on a sub-hour basis. Micro-Scheduling is the real-time fine-tuning of the outage schedule to react to the actual progress of the primary outage activities to ensure that support task resources are
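
    A minimal sketch of the micro-scheduling idea described above, greedily assigning support tasks to 15-minute slots of the required resource; the tasks, durations and resource names are invented examples, not the data model of the INL tool.

      # Illustrative sub-hour ("micro") scheduling: greedily assign support tasks to
      # 15-minute slots of the required resource. Tasks and resources are invented
      # examples, not the data model of the outage control center tool.
      from collections import defaultdict

      SLOT_MIN = 15
      tasks = [  # (task name, required resource, duration in minutes, earliest start slot)
          ("scaffold removal",   "rigging crew", 45, 0),
          ("valve lineup check", "operations",   30, 2),
          ("pipe inspection",    "rigging crew", 30, 1),
      ]

      busy = defaultdict(set)   # resource -> set of occupied slot indices
      schedule = []

      for name, resource, minutes, earliest in tasks:
          need = -(-minutes // SLOT_MIN)           # number of slots, rounded up
          start = earliest
          while any(s in busy[resource] for s in range(start, start + need)):
              start += 1                           # shift until the resource is free
          busy[resource].update(range(start, start + need))
          schedule.append((name, resource, start * SLOT_MIN, (start + need) * SLOT_MIN))

      for name, resource, t0, t1 in schedule:
          print(f"{name:<18} {resource:<13} {t0:>3}-{t1:>3} min")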

  11. The sustainability transition. Beyond conventional development

    Energy Technology Data Exchange (ETDEWEB)

    Raskin, P; Chadwick, M; Jackson, T; Leach, G

    1996-10-01

    This paper synthesizes findings of the first phase in SEI's PoleStar Project - a project aimed at developing long-term strategies and policies for sustainable development. Taking a global and long-range perspective, the paper aims to describe a theoretical framework for addressing sustainability, to identify emerging issues and outline directions for future action. The paper begins by setting today's development and environmental challenges in historical context, and describing the scenario method for envisioning and evaluating alternative futures, and identifying propitious areas for policy and action. It next summarizes a detailed scenario based on conventional development assumptions, and discusses the implications of this scenario for demographic and economic patterns, energy and water resources, land resources and agriculture, and pollution loads and the environment to the year 2050. The conventional scenario relies in part on the sectorally-oriented work discussed in Papers 3 through 6 of the PoleStar Project report series, and makes use of the PoleStar System, software designed for integrated resource, environment and socio-economic accounting and scenario analysis (described in Paper 2). The paper then examines the critical risks to social, resource and environmental systems lying ahead on the conventional development path. Finally, the paper surveys the requirements for sustainability across a number of policy dimensions, and raises key questions for the future. The PoleStar Project is proceeding to examine a range of alternative development scenarios, in the context of the work of the regionally-diverse Global Scenario Group, convened by SEI. The hope remains to offer wise counsel for a transition to an equitable, humane and sustainable future for the global community. 144 refs, 30 figs, 9 tabs

  12. The sustainability transition. Beyond conventional development

    International Nuclear Information System (INIS)

    Raskin, P.; Chadwick, M.; Jackson, T.; Leach, G.

    1996-01-01

    This paper synthesizes findings of the first phase in SEI's PoleStar Project - a project aimed at developing long-term strategies and policies for sustainable development. Taking a global and long-range perspective, the paper aims to describe a theoretical framework for addressing sustainability, to identify emerging issues and outline directions for future action. The paper begins by setting today's development and environmental challenges in historical context, and describing the scenario method for envisioning and evaluating alternative futures, and identifying propitious areas for policy and action. It next summarizes a detailed scenario based on conventional development assumptions, and discusses the implications of this scenario for demographic and economic patterns, energy and water resources, land resources and agriculture, and pollution loads and the environment to the year 2050. The conventional scenario relies in part on the sectorally-oriented work discussed in Papers 3 through 6 of the PoleStar Project report series, and makes use of the PoleStar System, software designed for integrated resource, environment and socio-economic accounting and scenario analysis (described in Paper 2). The paper then examines the critical risks to social, resource and environmental systems lying ahead on the conventional development path. Finally, the paper surveys the requirements for sustainability across a number of policy dimensions, and raises key questions for the future. The PoleStar Project is proceeding to examine a range of alternative development scenarios, in the context of the work of the regionally-diverse Global Scenario Group, convened by SEI. The hope remains to offer wise counsel for a transition to an equitable, humane and sustainable future for the global community. 144 refs, 30 figs, 9 tabs

  13. Collected software engineering papers, volume 2

    Science.gov (United States)

    1983-01-01

    Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.

  14. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  15. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than...... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  16. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  17. Software Project Management Plan for the Integrated Systems Code (ISC) of New Production Reactor -- Modular High Temperature Gas Reactor

    International Nuclear Information System (INIS)

    Taylor, D.

    1990-11-01

    The United States Department of Energy (DOE) has selected the Modular High Temperature Gas-Cooled Reactor (MHTGR) as one of the concepts for the New Production Reactor (NPR). DOE has also established several Technical Working Groups (TWG's) at the national laboratories to provide independent design confirmation of the NPR-MHTGR design. One of those TWG's is concerned with Thermal Fluid Flow (TFF) and analysis methods to provide independent design confirmation of the NPR-MHTGR. Analysis methods are also needed for operational safety evaluations, performance monitoring, sensitivity studies, and operator training. The TFF Program Plan includes, as one of its principal tasks, the development of a computer program (called the Integrated Systems Code, or ISC). This program will provide the needed long-term analysis capabilities for the MHTGR and its subsystems. This document presents the project management plan for development of the ISC. It includes the associated quality assurance tasks, and the schedule and resource requirements to complete these activities. The document conforms to the format of ANSI/IEEE Std. 1058.1-1987. 2 figs

  18. GENII (Generation II): The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    Energy Technology Data Exchange (ETDEWEB)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs.

  19. GENII [Generation II]: The Hanford Environmental Radiation Dosimetry Software System: Volume 3, Code maintenance manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-09-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). This coupled system of computer codes is intended for analysis of environmental contamination resulting from acute or chronic releases to, or initial contamination of, air, water, or soil, on through the calculation of radiation doses to individuals or populations. GENII is described in three volumes of documentation. This volume is a Code Maintenance Manual for the serious user, including code logic diagrams, global dictionary, worksheets to assist with hand calculations, and listings of the code and its associated data libraries. The first volume describes the theoretical considerations of the system. The second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. 7 figs., 5 tabs

  20. GENII: The Hanford Environmental Radiation Dosimetry Software System: Volume 2, Users' manual: Hanford Environmental Dosimetry Upgrade Project

    International Nuclear Information System (INIS)

    Napier, B.A.; Peloquin, R.A.; Strenge, D.L.; Ramsdell, J.V.

    1988-11-01

    The Hanford Environmental Dosimetry Upgrade Project was undertaken to incorporate the internal dosimetry models recommended by the International Commission on Radiological Protection (ICRP) in updated versions of the environmental pathway analysis models used at Hanford. The resulting second generation of Hanford environmental dosimetry computer codes is compiled in the Hanford Environmental Dosimetry System (Generation II, or GENII). The purpose of this coupled system of computer codes is to analyze environmental contamination of air, water, or soil. This is accomplished by calculating radiation doses to individuals or populations. GENII is described in three volumes of documentation. This second volume is a Users' Manual, providing code structure, users' instructions, required system configurations, and QA-related topics. The first volume describes the theoretical considerations of the system. The third volume is a Code Maintenance Manual for the user who requires knowledge of code detail. It includes logic diagrams, global dictionary, worksheets, example hand calculations, and listings of the code and its associated data libraries. 27 refs., 17 figs., 23 tabs

  1. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable, software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  2. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  3. ANÁLISIS MULTIVARIADO DE DATOS COMO SOPORTE A LA DECISIÓN EN LA SELECCIÓN DE ESTUDIANTES EN PROYECTOS DE SOFTWARE / MULTIVARIATE DATA ANALYSIS AS DECISION MAKING SUPPORT IN STUDENT SELECTION IN SOFTWARE PROJECTS

    Directory of Open Access Journals (Sweden)

    Hugo Arnaldo Martínez-Noriegas

    2013-05-01

    Full Text Available Personnel selection is a vital process that has a direct influence on the success of any organization. This paper aims to generate decision-support information for the selection of students to join software projects. Multivariate data analysis techniques are applied to the academic grades of second-year students of the Computer Science Engineering programme. Principal component analysis is used to reduce the number of variables under study and, based on the summarized information, cluster analysis is applied to form three groups. Through common factor analysis, three latent factors acting on different groups of subjects were identified. The resulting information is used to support decision-making when formulating strategies for training students through production work.
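
    The abstract above describes a standard multivariate pipeline (principal component analysis, cluster analysis into three groups, and common factor analysis). Below is a minimal sketch of such a pipeline using scikit-learn; the synthetic grade matrix, the number of retained components and all variable names are illustrative assumptions, not data or code from the study.

        # Illustrative sketch (not the authors' code): PCA to summarize a grade
        # matrix, k-means to form 3 student groups, and a 3-factor model.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA, FactorAnalysis
        from sklearn.cluster import KMeans

        # Hypothetical data: rows = students, columns = subject grades.
        rng = np.random.default_rng(0)
        grades = rng.normal(loc=3.5, scale=0.8, size=(120, 10))

        scaled = StandardScaler().fit_transform(grades)

        # Principal component analysis to reduce the number of variables.
        pca = PCA(n_components=3)
        scores = pca.fit_transform(scaled)

        # Cluster analysis on the summarized information (3 groups, as in the paper).
        groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

        # Common factor analysis: 3 latent factors acting on groups of subjects.
        loadings = FactorAnalysis(n_components=3, random_state=0).fit(scaled).components_

        print(pca.explained_variance_ratio_, np.bincount(groups), loadings.shape)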

  4. Statistical reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1997-01-01

    Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and the quantification seems to be impossible. Due to lack of operational experience and due to the nature of software faults, the conventional reliability estimation methods can not be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed together by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software based systems are developed and evaluated. This volume in the OHA-report series deals with the statistical reliability assessment of software based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later on in OHA-report series will handle the diversity requirements in safety critical software-based systems, generation of test data from operational profiles and handling of programmable automation in plant PSA-studies. (orig.) (25 refs.)
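
    One concrete, textbook-style calculation behind statistical reliability assessment from dynamic test results (given here as a generic illustration, not necessarily the method adopted in the OHA project): if N statistically independent demands drawn from the operational profile all succeed, the upper (1 - alpha) confidence bound on the per-demand failure probability p follows from (1 - p)^N >= alpha.

        # Generic illustration: upper confidence bound on the per-demand failure
        # probability after n failure-free, statistically independent test demands.
        # From (1 - p)**n >= alpha it follows that p <= 1 - alpha**(1/n).
        def failure_probability_upper_bound(n_tests: int, confidence: float = 0.95) -> float:
            alpha = 1.0 - confidence
            return 1.0 - alpha ** (1.0 / n_tests)

        if __name__ == "__main__":
            for n in (100, 1000, 10000):
                print(n, round(failure_probability_upper_bound(n), 6))
            # Roughly 4600 failure-free demands are needed to claim p < 1e-3 at 99% confidence.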

  5. SOFTWARE DESIGN MODELLING WITH FUNCTIONAL PETRI NETS

    African Journals Online (AJOL)

    Dr Obe

    the system, which can be described as a set of conditions. ... FPN Software prototype proposed for the conventional programming construct: if-then-else ... mathematical modeling tool allowing for ... methods and techniques of software design.

  6. Sandia software guidelines, Volume 4: Configuration management

    Energy Technology Data Exchange (ETDEWEB)

    1992-06-01

    This volume is one in a series of Sandia Software Guidelines for use in producing quality software within Sandia National Laboratories. This volume is based on the IEEE standard and guide for software configuration management. The basic concepts and detailed guidance on implementation of these concepts are discussed for several software project types. Example planning documents for both projects and organizations are included.

  7. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  8. Entropy based software processes improvement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Kriek, D.; Siemons, P.

    2009-01-01

    Actual results of software process improvement projects show different levels of success. Although many software development organisations have adopted improvement models such as CMMI, it appears to be difficult to improve software development processes in the right way, e.g. tuned to the actual

  9. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of Indonesia's Intellectual Property Rights Law (HAKI) has opened up a new alternative: the use of open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT), and several organizations and companies have begun to take open source software into consideration. There are many notions about open source software, ranging from software that is free of charge to software that is unlicensed, and not all of them are accurate. It is therefore necessary to introduce the concept of open source software, covering its history, licenses and how to choose a license, as well as the considerations involved in selecting from the open source software that is available. Keywords: license, open source, HAKI (intellectual property rights)

  10. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment... Abbreviations used in the report: LTS, Label Transition System; MUSE, Mining and Understanding Software Enclaves; RTEMS, Real-Time Executive for Multi-processor Systems; SaaS, Software as a Service; SSA, Static Single Assignment; SWE, Software Epistemology; UD/DU, Def-Use/Use-Def Chains (Dataflow Graph)

  11. On-Orbit Software Analysis

    Science.gov (United States)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications

  12. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  13. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  14. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  15. Contributions to the back-end software sub-system of the ATLAS data acquisition of event filter prototype -1 project

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.; Niculescu, M.; Radu, A.

    1998-01-01

    A project has been approved by the ATLAS Collaboration for the design and implementation of a Data Acquisition (DAQ) and Event Filter (EF) prototype, based on the functional architecture described in the ATLAS Technical Proposal. The prototype consists of a full 'vertical' slice of the ATLAS Data Acquisition and Event Filter architecture and can be seen as made of 4 sub-systems: the Detector Interface, the Dataflow, the Back-end DAQ and the Event Filter. The Bucharest group is a member of the DAQ/EF collaboration and during 1997 was involved in the Back-end activities. The back-end software encompasses the software for configuring, controlling and monitoring the DAQ but specifically excludes the management, processing or transportation of physics data. The user requirements gathered for the back-end sub-system have been divided into groups related to activities providing similar functionality. The groups have been further developed into components of the Back-end with a well defined purpose and boundaries. Each component offers some unique functionality and has its own architecture. The actual Back-end component model includes 5 core components (run control, configuration databases, message reporting system, process manager and information service) and 6 detector integration components (partition and resource manager, status display, run bookkeeper, event dump, test manager and diagnostic package). The Bucharest group participated in the high-level design, implementation and testing of three components (information service, message reporting system and status display). The Information Service (IS) provides an information exchange facility for software components of the DAQ. Information (defined by the supplier) from many sources can be categorized and made available to requesting applications asynchronously or on demand. The design of the information service followed an object oriented approach. It is a multiple server configuration in which servers are dedicated to
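
    The Information Service described above follows a familiar pattern: suppliers publish categorized, named values, and consumers either read them on demand or register to be notified. The sketch below is a toy, in-memory illustration of that pattern only; the class and method names are invented and do not reflect the actual ATLAS back-end interfaces.

        # Toy illustration of an information-service pattern: suppliers publish
        # named, categorized values; consumers read on demand or subscribe for
        # asynchronous callbacks. Not the ATLAS DAQ/EF implementation.
        from collections import defaultdict
        from typing import Any, Callable

        class InfoService:
            def __init__(self) -> None:
                self._values: dict[tuple[str, str], Any] = {}
                self._subscribers: dict[tuple[str, str], list[Callable[[Any], None]]] = defaultdict(list)

            def publish(self, category: str, name: str, value: Any) -> None:
                key = (category, name)
                self._values[key] = value
                for callback in self._subscribers[key]:   # asynchronous delivery (simplified to a direct call)
                    callback(value)

            def get(self, category: str, name: str) -> Any:
                return self._values[(category, name)]     # delivery on demand

            def subscribe(self, category: str, name: str, callback: Callable[[Any], None]) -> None:
                self._subscribers[(category, name)].append(callback)

        if __name__ == "__main__":
            info = InfoService()
            info.subscribe("run_control", "state", lambda v: print("state changed to", v))
            info.publish("run_control", "state", "RUNNING")
            print(info.get("run_control", "state"))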

  16. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  17. Software for safety critical applications

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Jurickova, M.; Chudy, R.

    2001-01-01

    The contribution gives an overview of a project on software development for safety-critical applications, which has been carried out since 1997. The principal goal of the project was to establish a research laboratory for the development of software with the highest requirements for quality and reliability. This laboratory was established at the department and equipped with suitable hardware and software to support software development, and a research team of predominantly young researchers was created. The activities of the research team started with studying and proposing a software development methodology, which was then applied to real software development. The verification and validation process followed the software development. A validation system for integrated hardware and software tests was brought into being and its control software was developed. The quality of the software tools was also monitored, and the SOSAT tool was used during these activities. National and international contacts were established and maintained throughout the project. (author)

  18. Integrating interface slicing into software engineering processes

    Science.gov (United States)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  19. Comparison between the application of the conventional mine planning and of the direct block scheduling on an open pit mine Project

    Directory of Open Access Journals (Sweden)

    Pedro Henrique Alves Campos

    Full Text Available Abstract: Historically, since the 1960s, traditional mine planning has consisted of several distinct stages: (1) definition of the ultimate pit, the portion of the blocks that results in the greatest total value; (2) pushback selection, based on the generation of nested pits obtained by varying the ore price; and (3) long-term production scheduling. Although considered quite satisfactory, this methodology has some flaws: the stages, even if individually optimal, may not be optimal when put together; the opportunity cost is not considered; and the cut-off grade is fixed. Thanks to recent computational advances, a new and more reliable technique has been gaining ground: direct block sequencing. In this methodology the steps are consolidated into a single process, improving the economic results, reducing the total execution time and obtaining, in fact, an optimal plan. The aim of this work is to compare the results of the two planning methods applied to a database of a Brazilian iron ore mine and to show the real advantages and disadvantages of each one. Direct block sequencing was solved with Doppler, a tool developed by the Delphos Mine Planning Laboratory at the University of Chile. The traditional methodology was executed with the Whittle software. Lastly, a medium-term schedule was produced using the Deswik software.

  20. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software which is logically error-free, which, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  1. Software Formal Inspections Guidebook

    Science.gov (United States)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  2. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  3. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
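
    As a generic illustration of the idea (not the specific technique proposed in the article), a test written against an interface contract can be reused unchanged for every implementation of that interface, which supports early testing and keeps testability independent of any graphical user interface.

        # Generic illustration of interface-based testing: the test exercises the
        # interface contract, so any implementation can be checked with the same code.
        from abc import ABC, abstractmethod

        class Repository(ABC):
            @abstractmethod
            def save(self, key: str, value: str) -> None: ...
            @abstractmethod
            def load(self, key: str) -> str: ...

        class InMemoryRepository(Repository):
            def __init__(self) -> None:
                self._data: dict[str, str] = {}
            def save(self, key: str, value: str) -> None:
                self._data[key] = value
            def load(self, key: str) -> str:
                return self._data[key]

        def check_repository_contract(repo: Repository) -> None:
            repo.save("answer", "42")
            assert repo.load("answer") == "42", "round-trip through the interface failed"

        if __name__ == "__main__":
            check_repository_contract(InMemoryRepository())  # reusable for any other implementation
            print("contract satisfied")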

  4. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable process for high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of its rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements and software design, and verifying code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also gives a high degree of confidence in the 'correctness' of the software produced, and provides a relatively simple and straightforward code implementation effort. 1 fig., 10 refs. (Author)
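
    To make the phrase 'highly reviewable tabular format' concrete, the sketch below shows a generic table-driven requirements specification in which each row pairs a condition with the required output, so reviewers can check completeness and disjointness row by row. The thresholds and outputs are invented for illustration and are not the notation or requirements used for the Wolsung shutdown system.

        # Generic illustration of a tabular requirements specification: each row is a
        # (condition, required output) pair that reviewers can inspect line by line.
        # Not the tabular notation actually used for the Wolsung shutdown system.
        TRIP_TABLE = [
            # (predicate on measured pressure in MPa, required output)
            (lambda p: p >= 12.0, "TRIP"),
            (lambda p: 10.0 <= p < 12.0, "ALARM"),
            (lambda p: p < 10.0, "NORMAL"),
        ]

        def required_output(pressure: float) -> str:
            matches = [out for cond, out in TRIP_TABLE if cond(pressure)]
            assert len(matches) == 1, "table rows must be disjoint and complete"
            return matches[0]

        if __name__ == "__main__":
            for p in (9.5, 11.0, 12.5):
                print(p, required_output(p))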

  5. A study on effectiveness of project management of software development in IT vendors for financial institutions of Japan : Influence investigation in PM action

    OpenAIRE

    WATANABE, Eiichi; WATANABE, Tsunemi

    2012-01-01

    Concerning information-system development for financial institutions, the mass media often focus on the effectiveness and efficiency of project management in large-scale development projects with big social impacts. In practice, however, there are more small-scale development projects and so-called enhance projects. An enhance project refers to a project that adds a function to the original product or system and/or improves its performance. The enhance project needs differe...

  6. Does HDR Pre-Processing Improve the Accuracy of 3D Models Obtained by Means of two Conventional SfM-MVS Software Packages? The Case of the Corral del Veleta Rock Glacier

    Directory of Open Access Journals (Sweden)

    Álvaro Gómez-Gutiérrez

    2015-08-01

    Full Text Available The accuracy of different workflows using Structure-from-Motion and Multi-View-Stereo (SfM-MVS) techniques is tested. Twelve point clouds of the Corral del Veleta rock glacier, in Spain, were produced with two different software packages (123D Catch and Agisoft Photoscan), using Low Dynamic Range (LDR) images and High Dynamic Range (HDR) compositions, for three different years (2011, 2012 and 2014). The accuracy of the resulting point clouds was assessed using benchmark models acquired every year with a Terrestrial Laser Scanner. Three parameters were used to estimate the accuracy of each point cloud: the RMSE, the Cloud-to-Cloud distance (C2C) and the Multiscale-Model-to-Model comparison (M3C2). The M3C2 mean error ranged from 0.084 m (standard deviation of 0.403 m) to 1.451 m (standard deviation of 1.625 m). Agisoft Photoscan outperformed 123D Catch, producing more accurate and denser point clouds in 11 out of 12 cases, this work being the first comparison between the two software packages available in the literature. No significant improvement was observed using HDR pre-processing. To our knowledge, this is the first time that the geometrical accuracy of 3D models obtained using LDR and HDR compositions has been compared. These findings may be of interest to researchers who wish to estimate geomorphic changes using SfM-MVS approaches.
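
    A minimal sketch of two of the accuracy measures named above, the cloud-to-cloud (C2C) nearest-neighbour distance and an RMSE derived from it, computed against a benchmark cloud with SciPy; the M3C2 comparison additionally needs normals and search scales and is omitted here. The arrays below are random placeholders, not the Veleta data.

        # Illustrative accuracy check of an SfM-MVS point cloud against a TLS benchmark:
        # nearest-neighbour cloud-to-cloud (C2C) distances and their RMSE.
        import numpy as np
        from scipy.spatial import cKDTree

        def c2c_distances(test_cloud: np.ndarray, reference_cloud: np.ndarray) -> np.ndarray:
            """Distance from every test point to its nearest benchmark point (both Nx3)."""
            tree = cKDTree(reference_cloud)
            distances, _ = tree.query(test_cloud, k=1)
            return distances

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            tls_benchmark = rng.uniform(0, 50, size=(20000, 3))                 # placeholder TLS cloud
            sfm_cloud = tls_benchmark[:5000] + rng.normal(0, 0.1, (5000, 3))    # placeholder SfM cloud
            d = c2c_distances(sfm_cloud, tls_benchmark)
            print("mean C2C:", d.mean(), "std:", d.std(), "RMSE:", np.sqrt((d ** 2).mean()))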

  7. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  8. Software development an open source approach

    CERN Document Server

    Tucker, Allen; de Silva, Chamindra

    2011-01-01

    Overview and Motivation; Software; Free and Open Source Software (FOSS); Two Case Studies; Working with a Project Team; Key FOSS Activities; Client-Oriented vs. Community-Oriented Projects; Working on a Client-Oriented Project; Joining a Community-Oriented Project; Using Project Tools; Collaboration Tools; Code Management Tools; Run-Time System Constraints; Software Architecture; Architectural Patterns; Layers, Cohesion, and Coupling; Security; Concurrency, Race Conditions, and Deadlocks; Working with Code; Bad Smells and Metrics; Refactoring; Testing; Debugging; Extending the Software for a New Project; Developing the D

  9. Software quality: Process or people

    Science.gov (United States)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  10. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  11. Software Engineering for Human Spaceflight

    Science.gov (United States)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  12. Software quality concepts and practice

    CERN Document Server

    Galin, Daniel

    2018-01-01

    The book presents a comprehensive discussion of software quality issues and software quality assurance (SQA) principles and practices, and lays special emphasis on implementing and managing SQA. Primarily designed to serve three audiences (university and college students, vocational training participants, and software engineers and software development managers), the book is applicable to all personnel engaged in software projects. Features: * A broad view of SQA. The book delves into SQA issues, going beyond the classic boundaries of custom-made software development to also cover in-house software development, subcontractors, and readymade software. * Up-to-date, wide-ranging coverage of SQA and SQA-related topics, including subjects hardly explored in other SQA texts. * A systematic presentation of the SQA function and its tasks: establishing the SQA processes, planning, coordinating, follow-up, review and evaluation of SQA processes. * Fo...

  13. Projectables

    DEFF Research Database (Denmark)

    Rasmussen, Troels A.; Merritt, Timothy R.

    2017-01-01

    CNC cutting machines have become essential tools for designers and architects, enabling rapid prototyping, model-building and production of high-quality components. Designers often cut from new materials, discarding the irregularly shaped remains. We introduce ProjecTables, a visual augmented reality system for interactive packing of model parts onto sheet materials. ProjecTables enables designers to (re)use scrap materials for CNC cutting that would previously have been thrown away, at the same time supporting aesthetic choices related to wood grain, avoiding surface blemishes, and other relevant material properties. We conducted evaluations of ProjecTables with design students from the Aarhus School of Architecture, demonstrating that participants could quickly and easily place and orient model parts, reducing material waste. Contextual interviews and ideation sessions led to a deeper...

  14. Drift tube with an electro-quadrupole magnet made with a conventional enamel wire for the proton engineering frontier project drift tube linac

    Science.gov (United States)

    Kim, Y. H.; Kwon, H. J.; Cho, Y. S.

    2006-12-01

    The proton engineering frontier project (PEFP) drift tube linac (DTL) chose the new type of electro-quadrupole magnet (EQM) using an enameled wire for a drift tube. By using this kind of EQM, we could simplify the drift tube structure. We verified the structural stability and thermal stability of this drift tube structure through a computational analysis and a simple experiment. We also verified the stability of the enameled wire regarding corrosion through a long period test of about 1 year. It was concluded that the design and fabrication of the drift tube and the EQM were successful.

  15. Drift tube with an electro-quadrupole magnet made with a conventional enamel wire for the proton engineering frontier project drift tube linac

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Y.H. [PEFP, KAERI, DaeJeon (Korea, Republic of)]. E-mail: yhkim72@kaeri.re.kr; Kwon, H.J. [PEFP, KAERI, DaeJeon (Korea, Republic of); Cho, Y.S. [PEFP, KAERI, DaeJeon (Korea, Republic of)

    2006-12-21

    The proton engineering frontier project (PEFP) drift tube linac (DTL) chose the new type of electro-quadrupole magnet (EQM) using an enameled wire for a drift tube. By using this kind of EQM, we could simplify the drift tube structure. We verified the structural stability and thermal stability of this drift tube structure through a computational analysis and a simple experiment. We also verified the stability of the enameled wire regarding corrosion through a long period test of about 1 year. It was concluded that the design and fabrication of the drift tube and the EQM were successful.

  16. Drift tube with an electro-quadrupole magnet made with a conventional enamel wire for the proton engineering frontier project drift tube linac

    International Nuclear Information System (INIS)

    Kim, Y.H.; Kwon, H.J.; Cho, Y.S.

    2006-01-01

    The proton engineering frontier project (PEFP) drift tube linac (DTL) chose the new type of electro-quadrupole magnet (EQM) using an enameled wire for a drift tube. By using this kind of EQM, we could simplify the drift tube structure. We verified the structural stability and thermal stability of this drift tube structure through a computational analysis and a simple experiment. We also verified the stability of the enameled wire regarding corrosion through a long period test of about 1 year. It was concluded that the design and fabrication of the drift tube and the EQM were successful

  17. 'Augmented reality' in conventional simulation by projection of 3-D structures into 2-D images. A comparison with virtual methods

    Energy Technology Data Exchange (ETDEWEB)

    Deutschmann, H.; Nairz, O.; Zehentmayr, F.; Fastner, G.; Sedlmayer, F. [Univ. Clinic for Radiotherapy and Radio-Oncology, Salzburg (Austria); radART - Inst. for research and development on Advanced Radiation Technologies at the Paracelsus Medical Univ., Salzburg (Austria); Steininger, P. [radART - Inst. for research and development on Advanced Radiation Technologies at the Paracelsus Medical Univ., Salzburg (Austria); Dept. of Medical Computer Science and Technology, Univ. for Health Sciences, Hall i. T. (Austria); Kopp, P.; Merz, F.; Wurstbauer, K.; Kranzinger, M.; Kametriser, G.; Kopp, M. [Univ. Clinic for Radiotherapy and Radio-Oncology, Salzburg (Austria)

    2008-02-15

    Background and purpose: In this study, a new method is introduced that allows the overlay of three-dimensional structures delineated on transverse slices onto the fluoroscopy from conventional simulators in real time. Patients and methods: Setup deviations between volumetric imaging and simulation were visualized, measured and corrected for 701 patient isocenters. Results: Comparing the accuracy to mere virtual simulation lacking additional X-ray imaging, a clear benefit of the new method could be shown. On average, virtual prostate simulations had to be corrected by 0.48 cm (standard deviation [SD] 0.38), and those of the breast by 0.67 cm (SD 0.66). Conclusion: The presented method provides an easy way to determine entity-specific safety margins related to patient setup errors upon registration of bony anatomy (prostate 0.9 cm for 90% of cases, breast 1.3 cm). The important role of planar X-ray imaging was clearly demonstrated. The innovation can also be applied to adaptive image-guided radiotherapy (IGRT) protocols. (orig.)
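
    The geometric core of such an overlay is the projection of 3-D points (the delineated structures) into the 2-D image of the simulator. The sketch below shows a plain pinhole-camera projection as a generic illustration; the intrinsic and extrinsic parameters are invented and are not those of the simulator or method described in the study.

        # Illustrative pinhole projection of delineated 3-D structure points into a
        # 2-D simulator image (homogeneous coordinates); all geometry here is made up.
        import numpy as np

        def project_points(points_3d: np.ndarray, K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
            """Project Nx3 world points to Nx2 pixel coordinates: x ~ K [R|t] X."""
            cam = (R @ points_3d.T + t.reshape(3, 1))      # world -> camera frame
            pix = K @ cam                                  # camera -> image plane
            return (pix[:2] / pix[2]).T                    # perspective division

        if __name__ == "__main__":
            K = np.array([[1500.0, 0.0, 512.0],            # focal lengths and principal point (pixels)
                          [0.0, 1500.0, 512.0],
                          [0.0, 0.0, 1.0]])
            R, t = np.eye(3), np.array([0.0, 0.0, 1000.0]) # structure placed ~1 m in front of the source
            contour = np.array([[10.0, -5.0, 0.0], [12.0, -5.0, 2.0], [8.0, -3.0, -1.0]])
            print(project_points(contour, K, R, t))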

  18. Fuel savings with conventional hot water space heating systems by incorporating a natural gas powered heat pump. Preliminary project: Development of heat pump technology

    Science.gov (United States)

    Vanheyden, L.; Evertz, E.

    1980-12-01

    Compression type air/water heat pumps were developed for domestic heating systems rated at 20 to 150 kW. The heat pump is driven either by a reciprocating piston or rotary piston engine modified to operate on natural gas. Particular features of natural gas engines as prime movers, such as waste heat recovery and variable speed, are stressed. Two systems suitable for heat pump operation were selected from among five different mass produced car engines and were modified to incorporate reciprocating piston compressor pairs. The refrigerants used are R 12 and R 22. Test rig data transferred to field conditions show that the fuel consumption of conventional boilers can be reduced by 50% and more by the installation of engine driven heat pumps. Pilot heat pumps based on a 1,600 cc reciprocating piston engine were built for heating four two-family houses. Pilot pump operation confirms test rig findings. The service life of rotary piston and reciprocating piston engines was investigated. The tests reveal characteristic curves for reciprocating piston engines and include exhaust composition measurements.

  19. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel; Tekinerdogan, B.; van den Broek, P.M.; Saeki, M.; Hruby, P.; Sunye, G.

    2001-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software

  20. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Frohner, A´ kos; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel

    2002-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software

  1. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    Rone, K.Y.; Olson, K.M.

    1994-01-01

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed
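
    A toy numerical sketch of the kind of relationship described above, in which criticality drives both the expected delivered-defect rate and the share of labor devoted to independent verification and validation. The functional forms and every coefficient are invented for illustration and are not the cost or quality models from the paper.

        # Toy cost/quality trade-off sketch (made-up coefficients, not the paper's models):
        # higher criticality -> more IV&V labor and a lower expected delivered-defect rate.
        def expected_defects(ksloc: float, criticality: float) -> float:
            """Hypothetical expected defects remaining at delivery (criticality in [1, 4])."""
            base_defects_per_ksloc = 6.0
            return base_defects_per_ksloc * ksloc / criticality

        def ivv_labor_fraction(criticality: float) -> float:
            """Hypothetical share of project labor spent on independent V&V."""
            return min(0.05 * criticality ** 2, 0.5)

        if __name__ == "__main__":
            for crit in (1, 2, 3, 4):
                print(crit, round(expected_defects(100.0, crit), 1),
                      f"{ivv_labor_fraction(crit):.0%}")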

  2. Software quality assurance plan for GCS

    Science.gov (United States)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  3. Patterns in Software Development

    DEFF Research Database (Denmark)

    Corry, Aino Vonge

    the university and I entered a project to industry within Center for Object Technology (COT). I focused on promoting the pattern concept to the Danish software industry in order to help them take advantage of the benefits of applying patterns in system development. In the obligatory stay abroad, I chose to visit...

  4. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements (and an effective system for managing them), the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  5. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  6. Projections of Ocean Acidification Under the U.N. Framework Convention on Climate Change Using a Reduced-Form Climate Carbon-Cycle Model

    Science.gov (United States)

    Hartin, C.

    2016-02-01

    Ocean chemistry is quickly changing in response to continued anthropogenic emissions of carbon to the atmosphere. Mean surface ocean pH has already decreased by 0.1 units relative to the preindustrial era. We use an open-source, simple climate and carbon cycle model ("Hector") to investigate future changes in ocean acidification (pH and calcium carbonate saturations) under the climate agreement from the United Nations Framework Convention on Climate Change (UNFCCC) Conference of Parties in Paris 2015 (COP 21). Hector is a reduced-form, very fast-executing model that can emulate the global mean climate of the CMIP5 models, as well as the inorganic carbon cycle in the upper ocean, allowing us to investigate future changes in ocean acidification. We ran Hector under three different emissions trajectories, using a sensitivity analysis approach to quantify model uncertainty and capture a range of possible ocean acidification changes. The first trajectory is a business-as-usual scenario comparable to a Representative Concentration Pathway (RCP) 8.5, the second a scenario with the COP 21 commitments enacted, and the third an idealized scenario keeping global temperature change to 2°C, comparable to an RCP 2.6. Preliminary results suggest that under the COP 21 agreements ocean pH at 2100 will decrease by 0.2 units and surface saturations of aragonite (calcite) will decrease by 0.9 (1.4) units relative to 1850. Under the COP 21 agreement the world's oceans will be committed to a degree of ocean acidification; however, these changes may be within the range of natural variability evident in some paleo records.

  7. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    African Journals Online (AJOL)

    The paper investigates the effect of software cost drivers on project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect software project time schedules in our environment. Such studies are lacking in the recent ...

  8. Software Reviews.

    Science.gov (United States)

    Dwyer, Donna; And Others

    1989-01-01

    Reviewed are seven software packages for Apple and IBM computers. Included are: "Toxicology"; "Science Corner: Space Probe"; "Alcohol and Pregnancy"; "Science Tool Kit Plus"; Computer Investigations: Plant Growth"; "Climatrolls"; and "Animal Watch: Whales." (CW)

  9. Software Reviews.

    Science.gov (United States)

    Davis, Shelly J., Ed.; Knaupp, Jon, Ed.

    1984-01-01

    Reviewed is computer software on: (1) classification of living things, a tutorial program for grades 5-10; and (2) polynomial practice using tiles, a drill-and-practice program for algebra students. (MNS)

  10. Software Reviews.

    Science.gov (United States)

    Miller, Anne, Ed.; Radziemski, Cathy, Ed.

    1988-01-01

    Three pieces of computer software are described and reviewed: HyperCard, to build and use varied applications; Iggy's Gnees, for problem solving with shapes in grades kindergarten-two; and Algebra Shop, for practicing skills and problem solving. (MNS)

  11. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  12. Factors that motivate software developers in Nigerian's software ...

    African Journals Online (AJOL)

    It was also observed that courtesy, good reward systems, regular training, recognition, tolerance of mistakes and good leadership were high motivators of software developers. Keywords: Software developers, information technology, project managers, Nigeria. International Journal of Natural and Applied Sciences, 6(4): ...

  13. An engineering context for software engineering

    OpenAIRE

    Riehle, Richard D.

    2008-01-01

    New engineering disciplines are emerging in the late Twentieth and early Twenty-first Century. One such emerging discipline is software engineering. The engineering community at large has long harbored a sense of skepticism about the validity of the term software engineering. During most of the fifty-plus years of software practice, that skepticism was probably justified. Professional education of software developers often fell short of the standard expected for conventional engineers; so...

  14. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and Category complexity, the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate together a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes Category Domains which are still under active development. Therefore they require different treatment also in terms of improvement of the development cycle, system testing and user support. This paper describes some of the software processes as they are applied in GEANT4 for development, testing and maintenance of the software

  15. R&D software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Hood, F.C.

    1991-10-01

    Research software quality assurance (QA) requirements must be adequate to strengthen development or modification objectives, but flexible enough not to restrict creativity. Application guidelines are needed for the different kinds of research and development (R&D) software activities to assure project objectives are achieved.

  16. THE ADAPTIVE NATURE OF MANAGING SOFTWARE INNOVATION

    OpenAIRE

    Mihai Liviu Despa

    2013-01-01

    This article focuses on adaptive management in the context of innovative software projects. Software development is presented through the filter of innovation. The aspects that differentiate software innovation from any other kind of innovation are highlighted. Adaptive management is addressed from a general point of view. The circumstances that require adaptive management are emphasized. Methods of implementing adaptive management in innovation oriented software projects are ...

  17. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of wha...

  18. Toward Measures for Software Architectures

    National Research Council Canada - National Science Library

    Chastek, Gary; Ferguson, Robert

    2006-01-01

    .... Defining these architectural measures is very difficult. The software architecture deeply affects subsequent development and project management decisions, such as the breakdown of the coding tasks and the definition of the development increments...

  19. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  20. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure quality of safety critical software, software should be developed in accordance with software development procedures and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2,3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2,3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2,3 and 4 project have demonstrated that the use of this methodology results in a high quality, cost-effective product. 15 refs., 6 figs. (author)

  1. Agile software development

    CERN Document Server

    Stober, Thomas

    2009-01-01

    Software Development is moving towards a more agile and more flexible approach. It turns out that the traditional 'waterfall' model is not supportive in an environment where technical, financial and strategic constraints are changing almost every day. But what is agility? What are today's major approaches? And especially: What is the impact of agile development principles on the development teams, on project management and on software architects? How can large enterprises become more agile and improve their business processes, which have existed for many, many years? What are the limit

  2. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  3. Behavior Protocols for Software Components

    Czech Academy of Sciences Publication Activity Database

    Plášil, František; Višňovský, Stanislav

    2002-01-01

    Vol. 28, No. 11 (2002), pp. 1056-1076, ISSN 0098-5589. R&D Projects: GA AV ČR IAA2030902; GA ČR GA201/99/0244. Grant - others: Eureka(XE) Pepita project no. 2033. Institutional research plan: AV0Z1030915. Keywords: behavior protocols * component-based programming * software architecture. Subject RIV: JC - Computer Hardware; Software. Impact factor: 1.170, year: 2002
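
    The core idea behind behavior protocols, namely a specification of the admissible sequences of component method calls against which an observed trace can be checked, can be illustrated with a toy example. The sketch below is not the paper's formalism or tooling; it simply encodes an assumed protocol as a regular expression over invented event names.

      # Toy illustration of a behavior-protocol-style check: an observed call
      # trace must be a word of the protocol language. Event names are invented.
      import re

      # Assumed protocol: open, then one or more read/write, then close.
      PROTOCOL = re.compile(r"open;(read;|write;)+close;")

      def complies(trace):
          """Return True if the call trace satisfies the protocol."""
          return PROTOCOL.fullmatch(";".join(trace) + ";") is not None

      print(complies(["open", "read", "write", "close"]))  # True
      print(complies(["read", "close"]))                   # False: no 'open'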

  4. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model is described which was developed at JPL. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
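
    To give a flavor of what "feedback principles of system dynamics" means in a project context, the following is a toy, self-contained sketch and not the SEPS model itself: work is completed at a rate set by staff and productivity, communication overhead erodes per-person productivity as the team grows, and a management feedback loop gradually adjusts staffing toward what the remaining work seems to require. All parameters are invented.

      # Toy system-dynamics-style project model (illustrative only, not SEPS).
      TOTAL_TASKS = 400.0    # assumed project size (tasks)
      DEADLINE = 52          # assumed schedule (weeks)
      NOMINAL_RATE = 1.0     # tasks per person-week, before overhead
      HIRE_RATE = 0.2        # fraction of the perceived staffing gap closed per week

      staff, done = 5.0, 0.0
      for week in range(1, DEADLINE + 1):
          # Communication overhead: productivity drops as the team grows.
          productivity = NOMINAL_RATE / (1.0 + 0.02 * staff)
          done = min(TOTAL_TASKS, done + staff * productivity)

          # Management feedback: estimate the staff needed to finish the
          # remaining work on time and close part of the gap (hiring is gradual).
          weeks_left = max(DEADLINE - week, 1)
          needed = (TOTAL_TASKS - done) / (productivity * weeks_left)
          staff = max(staff + HIRE_RATE * (needed - staff), 0.0)

          if week % 13 == 0:
              print(f"week {week:2d}: {done:5.1f}/{TOTAL_TASKS:.0f} tasks done, staff {staff:4.1f}")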

  5. How the NWC handles software as product

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    1997-11-01

    This tutorial provides a hands-on view of how the Nuclear Weapons Complex project should be handling (or planning to handle) software as a product in response to Engineering Procedure 401099. The SQAS has published the document SQAS96-002, Guidelines for NWC Processes for Handling Software Product, that will be the basis for the tutorial. The primary scope of the tutorial is on software products that result from weapons and weapons-related projects, although the information presented is applicable to many software projects. Processes that involve the exchange, review, or evaluation of software product between or among NWC sites, DOE, and external customers will be described.

  6. Ethics and Practice of Free Software

    CERN Document Server

    CERN. Geneva

    2007-01-01

    About the speaker Richard Matthew Stallman is a software freedom activist, hacker, and software developer. In September 1983, he launched the GNU Project to create a free Unix-like operating system, and has been the project's lead architect and organizer. With the launch of the GNU project he started the free software movement, and in October 1985 set up the Free Software Foundation. He co-founded the League for Programming Freedom. Stallman pioneered the concept of copyleft and is the main author of several copyleft licenses including the GNU General Public License, the most widely used free software license. (from Wikipedia)

  7. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  8. Strategies for successful software development risk management

    Directory of Open Access Journals (Sweden)

    Marija Boban

    2003-01-01

    Full Text Available Nowadays, software is becoming a major part of enterprise business. Software development is an activity connected with advanced technology and a high level of knowledge. Risks on software development projects must be successfully mitigated to produce successful software systems. Lack of a defined approach to risk management is one of the common causes for project failures. To improve project chances for success, this work investigates common risk impact areas to perceive a foundation that can be used to define a common approach to software risk management. Based on typical risk impact areas on software development projects, we propose three risk management strategies suitable for a broad area of enterprises and software development projects with different amounts of connected risks. Proposed strategies define activities that should be performed for successful risk management, the one that will enable software development projects to perceive risks as soon as possible and to solve problems connected with risk materialization. We also propose a risk-based approach to software development planning and risk management as attempts to address and retire the highest impact risks as early as possible in the development process. Proposed strategies should improve risk management on software development projects and help create a successful software solution.
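
    The "address and retire the highest impact risks as early as possible" idea reduces to a simple ordering rule in practice: rank risks by exposure, the product of probability and impact, and schedule mitigation in that order. A minimal sketch with an invented risk register (not taken from the article) is shown below.

      # Rank risks by exposure (probability x impact); invented example register.
      risks = [
          {"risk": "unclear requirements",    "probability": 0.6, "impact": 8},
          {"risk": "key developer turnover",  "probability": 0.2, "impact": 9},
          {"risk": "third-party API changes", "probability": 0.4, "impact": 5},
          {"risk": "optimistic estimates",    "probability": 0.7, "impact": 6},
      ]

      for r in risks:
          r["exposure"] = r["probability"] * r["impact"]

      ordered = sorted(risks, key=lambda r: r["exposure"], reverse=True)
      for rank, r in enumerate(ordered, start=1):
          print(f'{rank}. {r["risk"]:24s} exposure {r["exposure"]:.1f}')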

  9. Software Authentication

    International Nuclear Information System (INIS)

    Wolford, J.K.; Geelhood, B.D.; Hamilton, V.A.; Ingraham, J.; MacArthur, D.W.; Mitchell, D.J.; Mullens, J.A.; Vanier, P. E.; White, G.K.; Whiteson, R.

    2001-01-01

    The effort to define guidance for authentication of software for arms control and nuclear material transparency measurements draws on a variety of disciplines and has involved synthesizing established criteria and practices with newer methods. Challenges include the need to protect classified information that the software manipulates as well as deal with the rapid pace of innovation in the technology of nuclear material monitoring. The resulting guidance will shape the design of future systems and inform the process of authentication of instruments now being developed. This paper explores the technical issues underlying the guidance and presents its major tenets

  10. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  11. Testing Object-Oriented Software

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Madsen, Ole Lehrmann; Skov, Stefan H.

    The report is a result of an activity within the project Centre for Object Technology (COT), case 2. In case 2 a number of pilot projects have been carried out to test the feasibility of using object technology within embedded software. Some of the pilot projects have resulted in prototypes that are currently being developed into production versions. To assure a high quality in the product it was decided to carry out an activity regarding issues in testing OO software. The purpose of this report is to discuss the issues of testing object-oriented software. It is often claimed that testing of OO software is radically different from testing traditional software developed using imperative/procedural programming. Other authors claim that there is no difference. In this report we will attempt to give an answer to these questions (or at least initiate a discussion).

  12. Web accessibility and open source software.

    Science.gov (United States)

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  13. Open Source Software and Design-Based Research Symbiosis in Developing 3D Virtual Learning Environments: Examples from the iSocial Project

    Science.gov (United States)

    Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla

    2014-01-01

    Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…

  14. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)
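
    Of the methods listed, the Software FMEA lends itself to a compact illustration: each failure mode is rated for severity, occurrence, and detection, and the risk priority number (their product) orders the follow-up work. The entries below are invented examples, not items from the actual IDiPS-RPS analysis.

      # Minimal Software FMEA sketch: rank invented failure modes by RPN.
      # Ratings are on an assumed 1-10 scale (severity, occurrence, detection).
      failure_modes = [
          ("bistable processor misses a setpoint comparison", 9, 2, 4),
          ("coincidence logic emits a spurious trip",         6, 3, 3),
          ("stale process variable used in a comparison",     8, 2, 6),
      ]

      ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
      for name, severity, occurrence, detection in ranked:
          rpn = severity * occurrence * detection
          print(f"RPN {rpn:3d}  {name}")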

  15. Resource utilization during software development

    Science.gov (United States)

    Zelkowitz, Marvin V.

    1988-01-01

    This paper discusses resource utilization over the life cycle of software development and discusses the role that the current 'waterfall' model plays in the actual software life cycle. Software production in the NASA environment was analyzed to measure these differences. The data from 13 different projects were collected by the Software Engineering Laboratory at NASA Goddard Space Flight Center and analyzed for similarities and differences. The results indicate that the waterfall model is not very realistic in practice, and that as technology introduces further perturbations to this model with concepts like executable specifications, rapid prototyping, and wide-spectrum languages, we need to modify our model of this process.

  16. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  17. Reviews, Software.

    Science.gov (United States)

    Science Teacher, 1988

    1988-01-01

    Reviews two computer software packages for use in physical science, physics, and chemistry classes. Includes "Physics of Model Rocketry" for Apple II, and "Black Box" for Apple II and IBM compatible computers. "Black Box" is designed to help students understand the concept of indirect evidence. (CW)

  18. Software Reviews.

    Science.gov (United States)

    Kinnaman, Daniel E.; And Others

    1988-01-01

    Reviews four educational software packages for Apple, IBM, and Tandy computers. Includes "How the West was One + Three x Four,""Mavis Beacon Teaches Typing,""Math and Me," and "Write On." Reviews list hardware requirements, emphasis, levels, publisher, purchase agreements, and price. Discusses the strengths…

  19. Software Review.

    Science.gov (United States)

    McGrath, Diane, Ed.

    1989-01-01

    Reviewed is a computer software package entitled "Audubon Wildlife Adventures: Grizzly Bears" for Apple II and IBM microcomputers. Included are availability, hardware requirements, cost, and a description of the program. The murder-mystery flavor of the program is stressed in this program that focuses on illegal hunting and game…

  20. Software Reviews.

    Science.gov (United States)

    Teles, Elizabeth, Ed.; And Others

    1990-01-01

    Reviewed are two computer software packages for Macintosh microcomputers including "Phase Portraits," an exploratory graphics tool for studying first-order planar systems; and "MacMath," a set of programs for exploring differential equations, linear algebra, and other mathematical topics. Features, ease of use, cost, availability, and hardware…

  1. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  2. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    Science.gov (United States)

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  3. STEM - software test and evaluation methods. A study of failure dependency in diverse software

    International Nuclear Information System (INIS)

    Bishop, P.G.; Pullen, F.D.

    1989-02-01

    STEM is a collaborative software reliability project undertaken in partnership with Halden Reactor Project, UKAEA, and the Finnish Technical Research Centre. The objective of STEM is to evaluate a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report presents a study of the observed failure dependencies between faults in diversely produced software. (author)
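
    A minimal way to see what "failure dependency in diverse software" means is to compare the observed rate of coincident failures of two versions with the rate independence would predict. The sketch below uses invented per-test outcomes, not STEM data; the shared "hard input" region is an assumption made purely to produce correlated failures.

      # Compare observed coincident-failure rate with the independence prediction.
      # Outcomes are invented (True = the version fails on that test case).
      import random

      random.seed(1)
      N = 10_000  # number of test cases

      hard = [random.random() < 0.01 for _ in range(N)]          # assumed hard inputs
      fail_a = [h or random.random() < 0.002 for h in hard]      # version A failures
      fail_b = [h or random.random() < 0.003 for h in hard]      # version B failures

      p_a = sum(fail_a) / N
      p_b = sum(fail_b) / N
      p_both = sum(a and b for a, b in zip(fail_a, fail_b)) / N

      print(f"P(A fails) = {p_a:.4f}, P(B fails) = {p_b:.4f}")
      print(f"P(both fail) observed    = {p_both:.4f}")
      print(f"P(both fail) independent = {p_a * p_b:.6f}")
      print(f"dependency factor        = {p_both / (p_a * p_b):.1f}x")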

  4. Human rights and conventionality control in Mexico

    Directory of Open Access Journals (Sweden)

    Azul América Aguiar-Aguilar

    2014-12-01

    Full Text Available The protection of human rights in Mexico has, de jure, undergone an important change in recent years, given a new judicial interpretation delivered by the National Supreme Court of Justice that allows the use of conventionality control, that is, it allows federal and state judges to verify the conformity of domestic laws with those established in the Inter-American Convention of Human Rights. To what extent are domestic actors protecting human rights using this new legal tool called conventionality control? In this article I explore who is using conventionality control in Mexico, and how. Using N-Vivo software I reviewed concluded decisions delivered by intermediate-level courts (Collegiate Circuit Courts) in three Mexican states. The evidence indicates that conventionality control is a very useful tool, especially for defenders, who appear in sentences claiming compliance with the commitments Mexico acquired when it ratified the Convention.

  5. A Field Study of Scale Economies in Software Maintenance

    OpenAIRE

    Rajiv D. Banker; Sandra A. Slaughter

    1997-01-01

    Software maintenance is a major concern for organizations. Productivity gains in software maintenance can enable redeployment of Information Systems resources to other activities. Thus, it is important to understand how software maintenance productivity can be improved. In this study, we investigate the relationship between project size and software maintenance productivity. We explore scale economies in software maintenance by examining a number of software enhancement projects at a large fi...
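
    A common way to examine scale economies of this kind is to fit effort = c * size^b on a log-log scale: an exponent b below 1 suggests economies of scale, above 1 diseconomies. The sketch below uses invented enhancement-project data, not the study's field data.

      # Estimate a scale-economies exponent from invented maintenance data.
      import numpy as np

      size_fp = np.array([20, 45, 80, 120, 200, 310, 450, 600])        # function points
      effort_hours = np.array([110, 210, 330, 470, 700, 1000, 1350, 1700])

      # Fit log(effort) = b * log(size) + log(c).
      b, log_c = np.polyfit(np.log(size_fp), np.log(effort_hours), 1)
      print(f"estimated exponent b = {b:.2f}  (b < 1 suggests economies of scale)")
      print(f"estimated constant c = {np.exp(log_c):.1f}")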

  6. Optimal Conventional and Semi-Natural Treatments for the Upper Yakima Spring Chinook Salmon Supplementation Project, Treatment Definitions and Descriptions, and Biological Specifications for Facility Design, Final Report 1999

    International Nuclear Information System (INIS)

    Hager, Robert C.; Costello, Ronald J.

    1999-01-01

    This report describes the Yakima Fisheries Project facilities (Cle Elum Hatchery and acclimation satellites) which provide the mechanism to conduct state-of-the-art research for addressing questions about spring chinook supplementation strategies. The definition, descriptions, and specifications for the Yakima spring chinook supplementation program permit evaluation of alternative fish culture techniques that should yield improved methods and procedures to produce wild-like fish with higher survival that can be used to rebuild depleted spring chinook stocks of the Columbia River Basin. The definition and description of three experimental treatments, Optimal Conventional (OCT), Semi-Natural (SNT), Limited Semi-Natural (LSNT), and the biological specifications for facilities have been completed for the upper Yakima spring chinook salmon stock of the Yakima Fisheries Project. The task was performed by the Biological Specifications Work Group (BSWG) represented by Yakama Indian Nation, Washington Department of Fish and Wildlife, National Marine Fisheries Service, and Bonneville Power Administration. The control and experimental variables of the experimental treatments (OCT, SNT, and LSNT) are described in sufficient detail to assure that the fish culture facilities will be designed and operated as a production scale laboratory to produce and test supplemented upper Yakima spring chinook salmon. Product specifications of the treatment groups are proposed to serve as the generic templates for developing greater specificity for measurements of product attributes. These product specifications will be used to monitor and evaluate treatment effects, with respect to the biological response variables (post release survival, long-term fitness, reproductive success and ecological interactions)

  7. Optimal Conventional and Semi-Natural Treatments for the Upper Yakima Spring Chinook Salmon Supplementation Project; Treatment Definitions and Descriptions and Biological Specifications for Facility Design, 1995-1999 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Hager, Robert C. (Hatchery Operations Consulting); Costello, Ronald J. (Mobrand Biometrics, Inc., Vashon Island, WA)

    1999-10-01

    This report describes the Yakima Fisheries Project facilities (Cle Elum Hatchery and acclimation satellites) which provide the mechanism to conduct state-of-the-art research for addressing questions about spring chinook supplementation strategies. The definition, descriptions, and specifications for the Yakima spring chinook supplementation program permit evaluation of alternative fish culture techniques that should yield improved methods and procedures to produce wild-like fish with higher survival that can be used to rebuild depleted spring chinook stocks of the Columbia River Basin. The definition and description of three experimental treatments, Optimal Conventional (OCT), Semi-Natural (SNT), Limited Semi-Natural (LSNT), and the biological specifications for facilities have been completed for the upper Yakima spring chinook salmon stock of the Yakima Fisheries Project. The task was performed by the Biological Specifications Work Group (BSWG) represented by Yakama Indian Nation, Washington Department of Fish and Wildlife, National Marine Fisheries Service, and Bonneville Power Administration. The control and experimental variables of the experimental treatments (OCT, SNT, and LSNT) are described in sufficient detail to assure that the fish culture facilities will be designed and operated as a production scale laboratory to produce and test supplemented upper Yakima spring chinook salmon. Product specifications of the treatment groups are proposed to serve as the generic templates for developing greater specificity for measurements of product attributes. These product specifications will be used to monitor and evaluate treatment effects, with respect to the biological response variables (post release survival, long-term fitness, reproductive success and ecological interactions).

  8. Activities implemented jointly: First report to the Secretariat of the United Nations Framework Convention on Climate Change. Accomplishments and descriptions of projects accepted under the U.S. Initiative on Joint Implementation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-07-01

    More than 150 countries are now Party to the United Nations Framework Convention on Climate Change (FCCC), which seeks, as its ultimate objective, to stabilize atmospheric concentrations of greenhouse gases at a level that would prevent dangerous human interference with the climate system. As a step toward this goal, all Parties are to take measures to mitigate climate change and to promote and cooperate in the development and diffusion of technologies and practices that control or reduce emissions and enhance sinks of greenhouse gases. In the US view, efforts between countries or entities within them to reduce net greenhouse gas emissions undertaken cooperatively--called joint implementation (JI)--holds significant potential both for combating the threat of global warming and for promoting sustainable development. To develop and operationalize the JI concept, the US launched its Initiative on Joint Implementation (USIJI) in October 1993, and designed the program to attract private sector resources and to encourage the diffusion of innovative technologies to mitigate climate change. The USIJI provides a mechanism for investments by US entities in projects to reduce greenhouse gas emissions worldwide and has developed a set of criteria for evaluating proposed projects for their potential to reduce net GHG emissions.

  9. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  10. Software Tools for Software Maintenance

    Science.gov (United States)

    1988-10-01

    [Scanned report front matter, largely illegible in this record. Recoverable fragments: Army Institute for Research in Management Information, Communications, and Computer Sciences (AIRMICS), Software Tools for Software Maintenance, report ASQBG-1-89-001, October 1988; the tool listing appears to include a program analyzer, a Cobol structuring facility, VS Cobol II, F-Scan, and Fortran static code analysis tools.]

  11. Implementation of scrum agile methodology in software product project in a small technology-based company

    Directory of Open Access Journals (Sweden)

    Bernardo Vasconcelos de Carvalho

    2012-01-01

    Full Text Available This paper presents the results of an action research study, carried out in a small technology-based company, in which the Scrum agile method was applied to a software product development project. The company operates in Itajubá/MG and its main products are software systems. Studies indicate that the software industry is inefficient and ineffective, and micro and small technology-based companies face an even greater challenge because of their limited resources. In addition, traditional software product development methods are costly. Given the strategic importance of these companies for regional development, it would be important for Scrum to be compatible with their processes, so that they could become more competitive and enjoy its benefits. The objective of this work was to analyze the implementation of the Scrum agile method in the new software product development projects of a small technology-based company, and to understand and measure the impact of this implementation on the company. The results achieved suggest that the method improved communication and increased team motivation, reduced project cost, time and risk, and increased team productivity. With these results, the organization tends to become more competitive, since well-managed product development is a crucial point for the success of a technology-based company.

  12. Unified Engineering Software System

    Science.gov (United States)

    Purves, L. R.; Gordon, S.; Peltzman, A.; Dube, M.

    1989-01-01

    Collection of computer programs performs diverse functions in prototype engineering. NEXUS, NASA Engineering Extendible Unified Software system, is research set of computer programs designed to support full sequence of activities encountered in NASA engineering projects. Sequence spans preliminary design, design analysis, detailed design, manufacturing, assembly, and testing. Primarily addresses process of prototype engineering, task of getting single or small number of copies of product to work. Written in FORTRAN 77 and PROLOG.

  13. Software Design Modelling with Functional Petri Nets | Bakpo ...

    African Journals Online (AJOL)

    Software Design Modelling with Functional Petri Nets. ... of structured programs and a FPN Software prototype proposed for the conventional programming construct: if-then-else statement. ...

  14. Proceedings of the 14th Annual Software Engineering Workshop

    Science.gov (United States)

    1989-01-01

    Several software related topics are presented. Topics covered include studies and experiment at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis.

  15. EPIQR software

    Energy Technology Data Exchange (ETDEWEB)

    Flourentzos, F. [Federal Institute of Technology, Lausanne (Switzerland); Droutsa, K. [National Observatory of Athens, Athens (Greece); Wittchen, K.B. [Danish Building Research Institute, Hoersholm (Denmark)

    1999-11-01

    The support of the EPIQR method is a multimedia computer program. Several modules help the users of the method to treat the data collected during a diagnosis survey, to set up refurbishment scenarios and calculate their cost or energy performance, and finally to visualize the results in a comprehensive way and to prepare quality reports. This article presents the structure and the main features of the software. (au)

  16. Software preservation

    Directory of Open Access Journals (Sweden)

    Tadej Vodopivec

    2011-01-01

    Full Text Available Comtrade Ltd. covers a wide range of activities related to information and communication technologies; its deliverables include web applications, locally installed programs, system software, drivers, and embedded software (used e.g. in medical devices, auto parts, communication switchboards). Extensive knowledge and practical experience about digital long-term preservation technologies have also been acquired. This wide spectrum of activities puts us in the position to discuss an often overlooked aspect of digital preservation - the preservation of software programs. There are many resources dedicated to the digital preservation of digital data, documents and multimedia records, but not so many about how to preserve the functionalities and features of computer programs. Exactly these functionalities - dynamic response to inputs - render computer programs rich compared to documents or linear multimedia. The article opens the questions on the beginning of the way to permanent digital preservation. The purpose is to find a way in the right direction, where all relevant aspects will be covered in proper balance. The following questions are asked: why preserve computer programs permanently at all, who should do this and for whom, when should we think about permanent program preservation, what should be preserved (such as source code, screenshots, documentation, and the social context of the program - e.g. media response to it ...), where and how? To illustrate the theoretical concepts, the idea of a virtual national museum of electronic banking is also presented.

  17. Validation of the first step of the pilot project for the development of a software for the management of radiation protection of occupationally exposed personnel at Rio de Janeiro Federal University, Brazil

    International Nuclear Information System (INIS)

    Padilha Filho, L.G.; Souza, A.N. de; Gonçalves, A. de O.; Padilha, C.M.L.; Belem, P.H.A.; Souza, S.A.L. de; Sousa, C.H.S

    2017-01-01

    UFRJ counts on the Coordination of Worker Health Policies for the implementation of the Workplace Risks Prevention Program, which evaluates the safety conditions for carrying out work activities, and which includes the Advisory Committee to the Rectorate for activities with Radiation (COTAR X). To assist in the management of radioprotection, software is being developed that is intended to contribute significantly to the control of occupationally exposed individuals (IOEs). Objective: To present the first step of the validation of the pilot project for the development of software for the management of radiation protection of occupationally exposed individuals (IOEs). Methodology: It is based on a proposal for the development and implementation of an online system, as a model for the control of radioprotection and on the issue of competencies and reports to meet UFRJ's demand, involving all occupationally exposed staff who use or have contact with some form of ionizing radiation. Results: Thirty-six professionals who requested assessments directed at radioprotection were evaluated; 80.6% (n = 29) were statutory civil servants and 19.4% (n = 7) were CLT employees (Workers Law Consolidation). 36.1% were higher-education teaching professionals, 19.4% physicians, 19.4% nursing assistants, 8.3% nurses, 5.5% nursing technicians, and four other professions accounted for 2.8% each. More than half (67%) of the staff activities did not present a potential risk of exposure, 22% had minimal risk and 11% had maximum risk. Conclusions: The project will guarantee protection and permanent access to important information for the management of health control and evaluation of IOEs. The implementation of the software is contributing significantly to the optimization of the time in the production of information and systematization of behaviors

  18. Software for validating parameters retrieved from satellite

    Digital Repository Service at National Institute of Oceanography (India)

    Muraleedharan, P.M.; Sathe, P.V.; Pankajakshan, T.

    -channel Scanning Microwave Radiometer (MSMR) onboard the Indian satellite Oceansat-1 during 1999-2001 were validated using this software as a case study. The program has several added advantages over the conventional method of validation that involves strenuous...
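
    Validation exercises of this kind typically collocate each satellite retrieval with in-situ measurements inside a space/time window and then summarize the differences. The sketch below is a generic matchup calculation, not the software described in this record; the window sizes, coordinates, and values are invented.

      # Generic satellite-vs-in-situ matchup sketch with invented data.
      import math

      WINDOW_KM, WINDOW_HOURS = 50.0, 6.0  # assumed collocation criteria

      def distance_km(lat1, lon1, lat2, lon2):
          """Great-circle distance via the haversine formula."""
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 6371.0 * 2 * math.asin(math.sqrt(a))

      # Records are (lat, lon, hours since epoch, value), e.g. SST in degC.
      satellite = [(10.0, 72.0, 100.0, 28.4), (10.2, 72.3, 101.0, 28.9)]
      in_situ = [(10.1, 72.1, 102.0, 28.1), (15.0, 80.0, 100.5, 27.0)]

      pairs = [(sv, bv)
               for slat, slon, st, sv in satellite
               for blat, blon, bt, bv in in_situ
               if distance_km(slat, slon, blat, blon) <= WINDOW_KM
               and abs(st - bt) <= WINDOW_HOURS]

      diffs = [sv - bv for sv, bv in pairs]
      bias = sum(diffs) / len(diffs)
      rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
      print(f"matchups: {len(pairs)}, bias: {bias:+.2f}, RMSE: {rmse:.2f}")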

  19. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  20. Proceedings of the Ninth Annual Software Engineering Workshop

    Science.gov (United States)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.