WorldWideScience

Sample records for experiences initiating software

  1. Software for the LHCb experiment

    CERN Document Server

    Corti, Gloria; Belyaev, Ivan; Cattaneo, Marco; Charpentier, Philippe; Frank, Markus; Koppenburg, Patrick; Mato-Vila, P; Ranjard, Florence; Roiser, Stefan

    2006-01-01

    LHCb is an experiment for precision measurements of CP-violation and rare decays of B mesons at the LHC collider at CERN. The LHCb software development strategy follows an architecture-centric approach as a way of creating a resilient software framework that can withstand changes in requirements and technology over the expected long lifetime of the experiment. The software architecture, called GAUDI, supports event data processing applications that run in different processing environments ranging from the real-time high-level triggers in the online system to the final physics analysis performed by more than one hundred physicists. The major architectural design choices and the arguments that led to these choices will be outlined. Object oriented technologies have been used throughout. Initially developed for the LHCb experiment, GAUDI has been adopted and extended by other experiments. Several iterations of the GAUDI software framework have been released and are now being used routinely by the physicists of...
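
    The framework/algorithm separation described above can be sketched briefly. The following Python fragment illustrates the architecture-centric pattern of a framework-owned event loop driving plug-in algorithms through initialize/execute/finalize phases; it is only an illustration of the idea, since the real GAUDI framework is written in C++ and its actual interfaces differ.

        class Algorithm:
            """Base class: the framework calls these hooks, never the reverse."""
            def initialize(self): pass
            def execute(self, event): pass
            def finalize(self): pass

        class TrackCounter(Algorithm):
            """A toy 'physics' algorithm plugged into the framework."""
            def initialize(self):
                self.n_tracks = 0
            def execute(self, event):
                self.n_tracks += len(event["tracks"])
            def finalize(self):
                print(f"total tracks seen: {self.n_tracks}")

        class ApplicationManager:
            """Framework-owned event loop; algorithms are interchangeable."""
            def __init__(self, algorithms):
                self.algorithms = algorithms
            def run(self, events):
                for alg in self.algorithms:
                    alg.initialize()
                for event in events:
                    for alg in self.algorithms:
                        alg.execute(event)
                for alg in self.algorithms:
                    alg.finalize()

        # The same algorithm code can run online (trigger) or offline
        # (analysis); only the event source changes.
        ApplicationManager([TrackCounter()]).run([{"tracks": [1, 2, 3]}, {"tracks": [4]}])

    Because applications are assembled from such interchangeable algorithms, the framework itself can remain stable while requirements and technology change, which is the resilience property the abstract emphasises.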

  2. Software Development Initiatives to Identify and Mitigate Security Threats - Two Systematic Mapping Studies

    Directory of Open Access Journals (Sweden)

    Paulina Silva

    2016-12-01

    Software security and development experts have addressed the problem of building secure software systems. There are several processes and initiatives to achieve secure software systems; however, most of these lack empirical evidence of their application and impact in building secure software systems. Two systematic mapping studies (SM) have been conducted to cover the existing initiatives for identification and mitigation of security threats. The SMs were executed in two steps: first in July 2015, then complemented through backward snowballing in July 2016. The integrated results of these two SM studies show that a total of 30 relevant sources were identified; 17 different initiatives covering threat identification and 14 covering threat mitigation were found. All the initiatives were associated with at least one activity of the Software Development Lifecycle (SDLC); while 6 showed signs of being applied in industrial settings, only 3 initiatives presented experimental evidence of their results through controlled experiments, and some of the other selected studies presented case studies or proposals.

  3. EXPERIENCES INITIATING SOFTWARE PRODUCT LINE ENGINEERING IN SMALL TEAMS WITH PULSE

    DEFF Research Database (Denmark)

    Mærsk-Møller, Hans Martin; Jørgensen, Bo Nørregaard

    2010-01-01

    Small teams of software engineers are found both in small companies and in semi-independent fractions of medium and large companies. Even though some results and experience papers have been published in the context of Small- and Medium-Sized Enterprises (SMEs), there is a lack of experience papers ... an existing methodology, PuLSE™, drew on the advantages of the NetBeans Rich Client Platform, and based the product line on the existing application...

  4. Innovation Initiatives in Large Software Companies

    DEFF Research Database (Denmark)

    Edison, Henry; Wang, Xiaofeng; Jabangwe, Ronald

    2018-01-01

    Context: To keep the competitive advantage and adapt to changes in the market and technology, companies need to innovate in an organised, purposeful and systematic manner. However, due to their size and complexity, large companies tend to focus on the structure in maintaining their business, which can potentially lower their agility to innovate. Objective: The aims of this study are to provide an overview of the current research on innovation initiatives and to identify the challenges of implementing those initiatives in the context of large software companies. Method: The investigation ... empirical studies on innovation initiatives in the context of large software companies. A total of 7 studies were conducted in the context of large software companies, which reported 5 types of initiatives: intrapreneurship, bootlegging, internal venture, spin-off and crowdsourcing. Our study offers three ...

  5. Software inspections at Fermilab -- Use and experience

    International Nuclear Information System (INIS)

    Berman, E.F.

    1998-01-01

    Because of the critical nature of DA/Online software it is important to commission software which is correct, usable, reliable, and maintainable, i.e., has the highest quality possible. In order to help meet these goals Fermi National Accelerator Laboratory (Fermilab) has begun implementing a formal software inspection process. Formal Inspections are used to reduce the number of defects in software at as early a stage as possible. These Inspections, in use at a wide variety of institutions (e.g., NASA, Motorola), implement a well-defined procedure that can be used to improve the quality of many different types of deliverables. The inspection process, initially designed by Michael Fagan, will be described as it was developed and as it is currently implemented at Fermilab, where it has been used to improve the quality of a variety of different experiment DA/Online software. Benefits of applying inspections at many points in the software life-cycle and benefits to the people involved will be investigated. Experience with many different types of Inspections and the lessons learned about the inspection process itself will be detailed. Finally, the future of Inspections at Fermilab will be outlined.

  6. Net-VISA used as a complement to standard software at the CTBTO: initial operational experience with next-generation software.

    Science.gov (United States)

    Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.

    2017-12-01

    The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and following the inception of research and development efforts in 2009, the NET-VISA software, which follows a Bayesian modelling approach, has been developed to improve on the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate remains essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's performance in finding additional events that qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.

  7. Software Dependability and Safety Evaluations: ESA's Initiative

    Science.gov (United States)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are: more extensive validation of safety and dependability techniques for software, and the provision of valuable results to improve the quality of software, thus promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined PA (Product Assurance) requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and in doing so the critical sub-systems on which dependability and safety techniques are to be applied during development are identified. Proper performance of the software development requires a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements; the non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is used more and more in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA has identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  8. Data acquisition software for DIRAC experiment

    International Nuclear Information System (INIS)

    Ol'shevskij, V.G.; Trusov, S.V.

    2000-01-01

    The structure and basic processes of the data acquisition software of the DIRAC experiment for the measurement of the π+π− atom lifetime are described. The experiment is running at the PS accelerator of CERN. The developed software allows one to accept, record and distribute to consumers up to 3 Mbytes of data in one accelerator supercycle of 14.4 s duration. The described system has been used successfully in the DIRAC experiment since 1998.
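
    For scale, 3 Mbytes accepted, recorded and fanned out per 14.4 s supercycle is an average of roughly 213 kB/s. The Python sketch below illustrates the accept-record-distribute pattern named in the abstract; the recorder and consumer objects are generic stand-ins, not the DIRAC implementation.

        import io, queue

        SUPERCYCLE_SECONDS = 14.4
        MAX_BYTES_PER_SUPERCYCLE = 3 * 1024 * 1024

        def distribute(supercycle_data, recorder, consumers):
            """Accept one supercycle's data, record it, then fan it out."""
            assert len(supercycle_data) <= MAX_BYTES_PER_SUPERCYCLE
            recorder.write(supercycle_data)     # permanent record first
            for consumer in consumers:          # then fan-out to consumers
                consumer.put(supercycle_data)

        print(f"average rate: {MAX_BYTES_PER_SUPERCYCLE / SUPERCYCLE_SECONDS / 1024:.0f} kB/s")
        recorder, monitor, express = io.BytesIO(), queue.Queue(), queue.Queue()
        distribute(b"\x00" * 1024, recorder, [monitor, express])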

  9. Data acquisition software for DIRAC experiment

    Science.gov (United States)

    Olshevsky, V.; Trusov, S.

    2001-08-01

    The structure and basic processes of data acquisition software of the DIRAC experiment for the measurement of π+π− atom lifetime are described. The experiment is running on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system has been successfully in use in the experiment since its startup in 1998.

  10. Data acquisition software for DIRAC experiment

    International Nuclear Information System (INIS)

    Olshevsky, V.; Trusov, S.

    2001-01-01

    The structure and basic processes of data acquisition software of the DIRAC experiment for the measurement of π+π− atom lifetime are described. The experiment is running on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system has been successfully in use in the experiment since its startup in 1998.

  11. Data acquisition software for DIRAC experiment

    CERN Document Server

    Olshevsky, V G

    2001-01-01

    The structure and basic processes of data acquisition software of the DIRAC experiment for the measurement of π+π− atom lifetime are described. The experiment is running on the PS accelerator of CERN. The developed software allows one to accept, record and distribute up to 3 Mbytes of data to consumers in one accelerator supercycle of 14.4 s duration. The described system has been successfully in use in the experiment since its startup in 1998. (13 refs).

  12. Automated support for experience-based software management

    Science.gov (United States)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
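
    The comparison against the 'typical' project that the SME performs can be made concrete with a small sketch. The Python fragment below checks a project's current metrics against a baseline of means and standard deviations derived from past projects; the metric names, values and the two-sigma threshold are hypothetical, not taken from the SME.

        # Baseline from past projects: (mean, standard deviation) per metric.
        BASELINE = {
            "errors_per_kloc":   (4.0, 1.5),
            "staff_hours_ratio": (1.0, 0.2),
        }

        def assess(current):
            """Flag metrics far from the 'typical' project in the environment."""
            findings = []
            for metric, value in current.items():
                mean, sigma = BASELINE[metric]
                deviation = (value - mean) / sigma
                if abs(deviation) > 2.0:
                    findings.append(f"{metric}: {value} is {deviation:+.1f} sigma from typical")
            return findings

        print(assess({"errors_per_kloc": 8.2, "staff_hours_ratio": 1.1}))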

  13. Software quality assurance - seven years experience

    International Nuclear Information System (INIS)

    Malsbury, J.A.

    1987-01-01

    This paper describes seven years' experience with software quality assurance at PPPL. It covers the early attempts of 1980 and 1981 to establish software quality assurance; the first attempt of 1982 to develop a complete software quality assurance plan; the significant modifications of this plan in 1985; and the future. In addition, the paper describes the role of the Quality Assurance organization within each plan. The scope of this paper is limited to discussions of the software development procedures used in the seven-year period. Other software quality topics, such as configuration control or problem identification and resolution, are not discussed.

  14. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment at the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure are described.
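
    A common way to incorporate such checks into a build infrastructure is a quality gate that fails the build when a metric regresses. The sketch below is a generic Python illustration of that idea, not the actual ATLAS build tooling; the warning pattern and the budget of 100 are assumptions.

        import re, sys

        MAX_WARNINGS = 100  # hypothetical budget agreed by the project

        def count_warnings(build_log_path):
            """Count compiler warnings in a captured build log."""
            pattern = re.compile(r"\bwarning:", re.IGNORECASE)
            with open(build_log_path) as log:
                return sum(len(pattern.findall(line)) for line in log)

        if __name__ == "__main__":
            n = count_warnings(sys.argv[1])
            print(f"{n} warnings (budget {MAX_WARNINGS})")
            sys.exit(0 if n <= MAX_WARNINGS else 1)  # non-zero exit fails the build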

  15. Experience with case tools in the design of process-oriented software

    Science.gov (United States)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, the lack of real-time simulation tools and of Object-Oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  16. Software-Based Student Response Systems: An Interdisciplinary Initiative

    Science.gov (United States)

    Fischer, Carol M.; Hoffman, Michael S.; Casey, Nancy C.; Cox, Maureen P.

    2015-01-01

    Colleagues from information technology and three academic departments collaborated on an instructional technology initiative to employ student response systems in classes in mathematics, accounting and education. The instructors assessed the viability of using software-based systems to enable students to use their own devices (cell phones,…

  17. Experiences with Software Quality Metrics in the EMI middleware

    OpenAIRE

    Alandes, M; Kenny, E M; Meneses, D; Pucciani, G

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristi...

  18. Offline software for the DAMPE experiment

    Science.gov (United States)

    Wang, Chi; Liu, Dong; Wei, Yifeng; Zhang, Zhiyong; Zhang, Yunlong; Wang, Xiaolian; Xu, Zizong; Huang, Guangshun; Tykhonov, Andrii; Wu, Xin; Zang, Jingjing; Liu, Yang; Jiang, Wei; Wen, Sicheng; Wu, Jian; Chang, Jin

    2017-10-01

    A software system has been developed for the DArk Matter Particle Explorer (DAMPE) mission, a satellite-based experiment. The DAMPE software is mainly written in C++ and steered using a Python script. This article presents an overview of the DAMPE offline software, including the major architecture design and specific implementation for simulation, calibration and reconstruction. The whole system has been successfully applied to DAMPE data analysis. Some results obtained using the system, from simulation and beam test experiments, are presented. Supported by Chinese 973 Program (2010CB833002), the Strategic Priority Research Program on Space Science of the Chinese Academy of Science (CAS) (XDA04040202-4), the Joint Research Fund in Astronomy under cooperative agreement between the National Natural Science Foundation of China (NSFC) and CAS (U1531126) and 100 Talents Program of the Chinese Academy of Science
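
    The steering arrangement mentioned above (C++ doing the heavy lifting, Python wiring the stages together) is illustrated by the fragment below. The stage names and configuration keys are invented stand-ins, not the real DAMPE interfaces; in the actual system each stage would be a compiled C++ component exposed to the steering script.

        # Hypothetical stand-in for a compiled C++ stage exposed to Python.
        class Stage:
            def __init__(self, name, **config):
                self.name, self.config = name, config
            def process(self, event):
                print(f"{self.name}{self.config} <- event {event}")

        # The steering script only configures and chains the stages.
        chain = [
            Stage("Simulation", particle="electron", energy_gev=100),
            Stage("Calibration", constants="2017-10"),
            Stage("Reconstruction"),
        ]
        for event in range(3):
            for stage in chain:
                stage.process(event)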

  19. View of software for HEP experiments

    Energy Technology Data Exchange (ETDEWEB)

    Johnstad, H.; Lebrun, P.; Lessner, E.S.; Montgomery, H.E.

    1986-05-01

    A view of the software structure typical of a High Energy Physics experiment is given and the availability of general software modules in most of the important regions is discussed. The aim is to provide a framework for discussion of capabilities and inadequacies and thereby define areas where effort should be assigned, and perhaps also to serve as a useful source document for the newcomer to High Energy Physics. 74 refs.

  20. View of software for HEP experiments

    International Nuclear Information System (INIS)

    Johnstad, H.; Lebrun, P.; Lessner, E.S.; Montgomery, H.E.

    1986-05-01

    A view of the software structure typical of a High Energy Physics experiment is given and the availability of general software modules in most of the important regions is discussed. The aim is to provide a framework for discussion of capabilities and inadequacies and thereby define areas where effort should be assigned, and perhaps also to serve as a useful source document for the newcomer to High Energy Physics. 74 refs

  1. Report from an improvement activity: Software development model, Brüel & Kjær CMS

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob; Vinter, Otto

    1999-01-01

    Final report from a software process improvement initiative at Brüel & Kjær CMS. The initiative aimed to develop and obtain practical experience with a software development model based on incremental development and timeboxing.

  2. Experiences with Architectural Software Configuration Management in Ragnarok

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak

    1998-01-01

    This paper describes a model, denoted architectural software configuration management, that minimises the gap between software design and configuration management by allowing developers to do configuration- and version control of the abstractions and hierarchy in a software architecture. The model emphasises traceability and reproducibility by unifying the concepts of version and bound configuration. Experiences with such a model, implemented in a prototype “Ragnarok”, from three real-life, small- to medium-sized, software development projects are reported. The conclusion is that the presented model...

  3. SPADE - software package to aid diffraction experiments

    International Nuclear Information System (INIS)

    Farren, J.; Giltrap, J.W.

    1978-10-01

    A software package is described which enables the DEC PDP-11/03 microcomputer to execute several different X-ray diffraction experiments and other similar experiments where stepper motors are driven and data is gathered and processed in real time. (author)
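
    An experiment of this kind reduces to a step-measure loop: advance a stepper motor by one increment, dwell while counting, store the point, repeat. The Python sketch below illustrates that loop; the hardware functions step_motor and read_counter are invented placeholders (the original package was, of course, PDP-11/03 software).

        import time

        def step_motor(steps):
            """Placeholder for the real stepper-motor driver."""
            pass

        def read_counter(dwell_s):
            """Placeholder: integrate detector counts for dwell_s seconds."""
            time.sleep(dwell_s)
            return 0

        def scan(start_deg, stop_deg, step_deg, dwell_s=1.0, steps_per_deg=100):
            """Step through an angular range, taking one data point per step."""
            points, angle = [], start_deg
            while angle <= stop_deg:
                points.append((angle, read_counter(dwell_s)))
                step_motor(int(step_deg * steps_per_deg))  # assumed gearing
                angle += step_deg
            return points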

  4. Experiences with Software Quality Metrics in the EMI Middleware

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc., a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project t...

  5. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  6. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    Science.gov (United States)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  7. Software for the Integration of Multiomics Experiments in Bioconductor.

    Science.gov (United States)

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research (AACR).
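
    The core idea, keeping several assays, each with its own samples, aligned to one set of patients so that subsetting stays coordinated across assays, can be sketched in a few lines. The Python fragment below only illustrates the concept; it is not the Bioconductor MultiAssayExperiment API, and the assay and sample names are invented.

        # Each assay maps its own sample names to measurements.
        assays = {
            "rnaseq":      {"s1": [5.2, 0.1], "s2": [4.8, 0.3]},
            "methylation": {"m1": [0.8],      "m3": [0.2]},
        }
        # Sample map: assay-specific sample name -> patient identifier.
        sample_map = {"s1": "P1", "s2": "P2", "m1": "P1", "m3": "P3"}

        def subset_by_patients(patients):
            """Subset every assay at once, keeping only the given patients."""
            return {
                assay: {s: v for s, v in samples.items()
                        if sample_map[s] in patients}
                for assay, samples in assays.items()
            }

        print(subset_by_patients({"P1"}))  # coordinated subset across assays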

  8. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S (browser/server) architecture was adopted and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  9. Software for on-line experiments

    International Nuclear Information System (INIS)

    Ivanchenko, I.M.

    1981-01-01

    A review of the current state of development of software for on-line electron experiments is presented. The principles of organization of real-time systems based on second-generation computers are considered. The following methods for projection search are considered: combinatorial methods, global methods, tracking methods, and methods of a supporting band. The following methods for determining parameter estimates based on the Lorentz equation are analysed: analytical simulation of trajectories, determination of parameters by the iterative method using the technique of calculation of recycled integrals, and multidimensional statistical analysis. For the purpose of successful usage and development of the software, the technique of self-documented programs has been created, and the computer is applied for preparing, revising and circulating the external descriptions, which are constructed as program complexes according to the hierarchical principle [ru]
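
    For reference, the 'Lorentz equation' referred to is presumably the Lorentz force law, which determines a charged particle's trajectory through the detector's electromagnetic field; the listed methods estimate the parameters (such as momentum and direction) of its solution from the measured coordinates:

        m \frac{d\vec{v}}{dt} = q\left(\vec{E} + \vec{v} \times \vec{B}\right)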

  10. Rapid Software Development for Experiment Control at OPAL

    International Nuclear Information System (INIS)

    Hathaway, P.V.; Lam, Tony; Franceschini, Ferdi; Hauser, Nick; Rayner, Hugh

    2005-01-01

    Full text: ANSTO is undertaking the parallel development of instrument control and graphical experiment interface software for seven neutron beam instruments at OPAL. Each instrument poses several challenges for a common system solution, including custom detector interfaces, a range of motion and beamline optics schema, and a spectrum of online data reduction requirements. To provide a superior system with the least development effort, the computing team have adopted proven, configurable, server-based control software (SICS) [1], a highly Integrated Scientific Experimental Environment (GumTree) [2] and industry-standard database management systems. The resulting graphical interfaces allow operation in a familiar experiment domain, with monitoring of data and parameters independent of control system specifics. GumTree presents the experimenter with a consistent interface for experiment management, instrument control and data reduction tasks. The facility instrument scientists can easily reconfigure instruments and add ancillaries. The user community can expect a reduced learning curve for performing each experiment. GumTree can be installed anywhere for pre-experiment familiarisation, post-processing of acquired data sets, and integration with third-party analysis tools. Instrument scientists are seeing faster software development iterations and have a solid basis to prepare for the next suite of instruments. [1] SICS from PSI (lns00.psi.ch). [2] GumTree (gumtree.sourceforge.net), new site: http://gumtree.sourceforge.net/wiki/index.php/Main_Page

  11. Software for physics of tau lepton decay in LHC experiments

    CERN Document Server

    Przedzinski, Tomasz

    2010-01-01

    Software development in high energy physics experiments offers unique experience with a rapidly changing environment and a variety of different standards and frameworks that software must be adapted to. As such, regular methods of software development are hard to use, as they do not take into account how greatly some of these changes influence the whole structure. The following thesis summarizes the development of the TAUOLA C++ Interface, introducing tau decays to the new event record standard. Documentation of the program is already published, which is why it is not recalled here again. We focus on the development cycle and methodology used in the project, starting from the definition of the expectations, through planning and designing the abstract model, and concluding with the implementation. In the last part of the paper we present the installation of the software within different experiments surrounding the Large Hadron Collider and the problems that emerged during this process.

  12. Software Development Infrastructure for the FAIR Experiments

    International Nuclear Information System (INIS)

    Uhlig, F; Al-Turany, M; Bertini, D; Karabowicz, R

    2011-01-01

    The proposed project FAIR (Facility for Anti-proton and Ion Research) is an international accelerator facility of the next generation. It builds on top of the experience and technological developments already made at the existing GSI facility, and incorporates new technological concepts. The four scientific pillars of FAIR are NUSTAR (nuclear structure and astrophysics), PANDA (QCD studies with cooled beams of anti-protons), CBM (physics of hadronic matter at highest baryon densities), and APPA (atomic physics, plasma physics, and applications). The FairRoot framework, used by all of the big FAIR experiments as a base for their own specific developments, provides basic functionality like IO, geometry handling, etc. The challenge is to support all the different experiments with their heterogeneous requirements. Due to the limited manpower, one of the first design decisions was to (re)use as much as possible already available and tested software and to focus on the development of the framework. Besides the framework itself, the FairRoot core team also provides some software development tools, and we describe the complete set of tools in this article. The Makefiles for all projects are generated using CMake. For software testing and the corresponding quality assurance, we use CTest to generate the results and CDash as the web front end. The tools are completed by Subversion as the source code repository and Trac as the tool for complete source code management. This set of tools allows us to offer the full functionality we have for FairRoot also to the experiments based on FairRoot.

  13. The Qualification Experiences for Safety-critical Software of POSAFE-Q

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Son, Kwang Seop; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-05-15

    Programmable Logic Controllers (PLC) have been applied to the Reactor Protection System (RPS) and the Engineered Safety Feature (ESF)-Component Control System (CCS) as the major safety system components of nuclear power plants. This paper describes experiences with the qualification of the safety-critical software, including the pCOS kernel and system tasks, related to a safety-grade PLC, i.e., the work done for Software Verification and Validation, Software Safety Analysis, Software Quality Assurance, and Software Configuration Management, etc.

  14. Managing Change in Software Process Improvement

    DEFF Research Database (Denmark)

    Mathiassen, Lars; Ngwenyama, Ojelanki K.; Aaen, Ivan

    2005-01-01

    When software managers initiate SPI, most are ill prepared for the scale and complexity of the organizational change involved. Although they typically know how to deal with large software projects, few managers have sufficient experience with projects that transform organizations. To succeed with...

  15. Experiences with Software Quality Metrics in the EMI middleware

    International Nuclear Information System (INIS)

    Alandes, M; Meneses, D; Pucciani, G; Kenny, E M

    2012-01-01

    The EMI Quality Model has been created to define, and later review, the EMI (European Middleware Initiative) software product and process quality. A quality model is based on a set of software quality metrics and helps to set clear and measurable quality goals for software products and processes. The EMI Quality Model follows the ISO/IEC 9126 Software Engineering – Product Quality to identify a set of characteristics that need to be present in the EMI software. For each software characteristic, such as portability, maintainability, compliance, etc., a set of associated metrics and KPIs (Key Performance Indicators) are identified. This article presents how the EMI Quality Model and the EMI Metrics have been defined in the context of the software quality assurance activities carried out in EMI. It also describes the measurement plan and presents some of the metrics reports that have been produced for the EMI releases and updates. It also covers which tools and techniques can be used by any software project to extract “code metrics” on the status of the software products and “process metrics” related to the quality of the development and support process such as reaction time to critical bugs, requirements tracking and delays in product releases.
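
    A process metric such as 'reaction time to critical bugs' comes down to simple arithmetic over bug-tracker records. The Python sketch below shows one plausible way to compute such a KPI; the record fields and the 3-day target are invented for illustration and are not EMI's actual definitions.

        from datetime import datetime, timedelta

        # Minimal records: when each critical bug was opened and first acted on.
        bugs = [
            {"opened": datetime(2012, 3, 1, 9),  "first_response": datetime(2012, 3, 1, 17)},
            {"opened": datetime(2012, 3, 5, 10), "first_response": datetime(2012, 3, 9, 10)},
        ]

        reaction_times = [b["first_response"] - b["opened"] for b in bugs]
        mean = sum(reaction_times, timedelta()) / len(reaction_times)
        target = timedelta(days=3)  # assumed service-level target

        print(f"mean reaction time: {mean}, KPI met: {mean <= target}")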

  16. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and provide versions of the software in Macintosh and Windows compatible formats. Appendix 1, the Science Requirements Document (SRD) Users Manual, is attached.

  17. Metronome LKM: An open source virtual keyboard driver to measure experiment software latencies.

    Science.gov (United States)

    Garaizar, Pablo; Vadillo, Miguel A

    2017-10-01

    Experiment software is often used to measure reaction times gathered with keyboards or other input devices. In previous studies, the accuracy and precision of time stamps has been assessed through several means: (a) generating accurate square-wave signals from an external device connected to the parallel port of the computer running the experiment software, (b) triggering the typematic repeat feature of some keyboards to get an evenly separated series of keypress events, or (c) using a solenoid handled by a microcontroller to press the input device (keyboard, mouse button, touch screen) that will be used in the experimental setup. Despite the advantages of these approaches in some contexts, none of them can isolate the measurement error caused by the experiment software itself. Metronome LKM provides a virtual keyboard to assess an experiment's software. Using this open source driver, researchers can generate keypress events using high-resolution timers and compare the time stamps collected by the experiment software with those gathered by Metronome LKM (with nanosecond resolution). Our software is highly configurable (in terms of keys pressed, intervals, SysRq activation) and runs on Linux kernels 2.6 through 4.8.
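
    Once the driver has injected keypresses at known times, isolating the experiment software's own error is a matter of pairing each generated event with the time stamp the software recorded and examining the differences. The Python sketch below illustrates that analysis step; the numbers and the input format are hypothetical.

        # Ground-truth times from the driver vs. times logged by the
        # experiment software, in seconds (hypothetical values).
        generated = [0.000,  0.500,  1.000,  1.500]
        recorded  = [0.0021, 0.5018, 1.0024, 1.5019]

        errors_ms = [(r - g) * 1000 for g, r in zip(generated, recorded)]
        mean = sum(errors_ms) / len(errors_ms)
        jitter = (sum((e - mean) ** 2 for e in errors_ms) / len(errors_ms)) ** 0.5

        print(f"mean latency: {mean:.2f} ms, jitter (sd): {jitter:.2f} ms")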

  18. A Knowledge Management Approach to Support Software Process Improvement Implementation Initiatives

    Science.gov (United States)

    Montoni, Mariano Angel; Cerdeiral, Cristina; Zanetti, David; Cavalcanti da Rocha, Ana Regina

    The success of software process improvement (SPI) implementation initiatives depends fundamentally on the strategies adopted to support the execution of such initiatives. Therefore, it is essential to define adequate SPI implementation strategies aiming to facilitate the achievement of organizational business goals and to increase the benefits of process improvements. The objective of this work is to present an approach to support the execution of SPI implementation initiatives. We also describe a methodology applied to capture knowledge related to critical success factors that influence SPI initiatives. This knowledge was used to define effective SPI strategies aiming to increase the success of SPI initiatives coordinated by a specific SPI consultancy organization. This work also presents the functionalities of a set of tools integrated in a process-centered knowledge management environment, named CORE-KM, customized to support the presented approach.

  19. Project W-211, initial tank retrieval systems, retrieval control system software configuration management plan

    International Nuclear Information System (INIS)

    RIECK, C.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization

  20. The Muon Ionization Cooling Experiment User Software

    Science.gov (United States)

    Dobbs, A.; Rajaram, D.; MICE Collaboration

    2017-10-01

    The Muon Ionization Cooling Experiment (MICE) is a proof-of-principle experiment designed to demonstrate muon ionization cooling for the first time. MICE is currently on Step IV of its data-taking programme, where transverse emittance reduction will be demonstrated. The MICE Analysis User Software (MAUS) is the reconstruction, simulation and analysis framework for the MICE experiment. MAUS is used for both offline data analysis and fast online data reconstruction and visualization to serve MICE data taking. This paper provides an introduction to MAUS, describing the central Python and C++ based framework, the data structure, and the code management and testing procedures.

  1. Early experiences building a software quality prediction model

    Science.gov (United States)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.

  2. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards 'work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  3. THE IMPACT OF MARKETING EXPERIMENTS ON THE RELATIONSHIP BETWEEN SOFTWARE PRODUCERS AND THEIR RETAILERS

    Directory of Open Access Journals (Sweden)

    HERȚANU ANDREEA

    2013-07-01

    This paper presents the results of a marketing experiment conducted on the Romanian software market. The main purpose of this research is to determine how the marketing campaigns of software manufacturers can influence the decisions of software retailers. Through this experimental marketing research, an evaluation and analysis of the impact that the marketing policies of software companies have on retailers from all over the country is made. Three different marketing campaigns were proposed to three groups of software vendors from the most important cities of the country. The total number of software retailers included in this experiment is 45, and the marketing campaigns proposed by the authors refer to the Microsoft brand. Promotion strategies such as sales promotion by encouraging the sales force, promotional pricing, and partner relationship management have a great impact on three aspects of software retailers' behaviour: loyalty, purchase and resale intention, and attitude towards a brand. The results of the experiment show a high interest in the strategy of promotional pricing. The representatives of the software vendors have a positive orientation towards sales promotion by encouraging the sales force. Regarding the influences of the manipulations used in the experiment, the strategy of promotional pricing has the greatest impact on the loyalty of the software vendors, while the policy of sales promotion by encouraging the sales force has the biggest impact on the purchase and resale intention of the software retailers. All three manipulations also have an impact on the vendors' attitude towards a brand, but the differences are too small to determine which of the proposed stimuli has a greater impact on this aspect. The results of the experiment may help and could have a great influence on the future marketing decisions of manufacturers regarding the strategies and marketing policies used on the Romanian

  4. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light-emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.
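
    To give a flavour of what such an experiment script does, the Python fragment below implements the bare logic of a stimulus-response compatibility trial loop like the one in Example 1. It is not written in PsyToolkit's scripting language and makes no millisecond-timing claims (console input includes typing and Enter latency); it only illustrates the trial structure.

        import random, time

        TRIALS = [("LEFT", "f"), ("RIGHT", "j")]  # stimulus and its compatible key

        def run_block(n_trials=4):
            results = []
            for _ in range(n_trials):
                stimulus, correct_key = random.choice(TRIALS)
                t0 = time.perf_counter()
                response = input(f"Stimulus: {stimulus}  (press f or j, then Enter) ")
                rt = time.perf_counter() - t0
                results.append((stimulus, response == correct_key, rt))
            return results

        for stimulus, correct, rt in run_block():
            print(f"{stimulus}: correct={correct}, RT={rt * 1000:.0f} ms")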

  5. An Educational Software for Simulating the Sample Size of Molecular Marker Experiments

    Science.gov (United States)

    Helms, T. C.; Doetkott, C.

    2007-01-01

    We developed educational software to show graduate students how to plan molecular marker experiments. These computer simulations give the students feedback on the precision of their experiments. The objective of the software was to show students using a hands-on approach how: (1) environmental variation influences the range of the estimates of the…

  6. Developer Initiation and Social Interactions in OSS: A Case Study of the Apache Software Foundation

    Science.gov (United States)

    2014-08-01


  7. Main real time software for high-energy physics experiments

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1985-01-01

    The general problems of organizing software complexes, as well as the development of typical algorithms and packages of applied programs for real-time systems used in experiments with charged-particle accelerators, are discussed. It is noted that numerous qualitatively different real-time tasks are solved by parallel programming of the processes of data acquisition, equipment control, data exchange with remote terminals, express data processing and accumulation, operator instruction interpretation, and generation and buffering of resulting files for data output and information processing, which is realized on a multicomputer system. Further development of software for experiments is associated with improving the algorithms for automatic recognition and analysis of events with complex topology, and with standardization of applied program packages

  8. Operation Request Gatekeeper: A software system for remote access control of diagnostic instruments in fusion experiments

    International Nuclear Information System (INIS)

    Abla, G.; Schissel, D. P.; Fredian, T. W.; Stillerman, J. A.; Greenwald, M. J.; Stepanov, D. N.; Ciarlette, D. J.

    2010-01-01

    Tokamak diagnostic settings are repeatedly modified to meet the changing needs of each experiment. Enabling remote diagnostic control poses significant challenges due to security and efficiency requirements. The Operation Request Gatekeeper (ORG) is a software system that addresses the challenges of remotely but securely submitting modification requests. The ORG provides a framework for screening all requests before they enter the secure machine zone and are executed, by performing user authentication and authorization, grammar validation, and validity checks. A prototype ORG was developed for the ITER CODAC that satisfies their initial requirements for remote request submission and has been tested with remote control of the KSTAR Plasma Control System. This paper describes the software design principles and implementation of the ORG as well as worldwide test results.
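
    The screening pipeline described, in which every request must pass user authentication, authorization, grammar validation and validity checks before it reaches the secure machine zone, is naturally expressed as a chain of gate functions. The Python sketch below illustrates that structure only; the individual checks and request fields are invented, not the actual ORG code.

        class Rejected(Exception):
            pass

        def authenticate(req):
            if req.get("user") not in {"alice", "bob"}:          # stand-in user list
                raise Rejected("unknown user")

        def authorize(req):
            if req.get("diagnostic") not in {"thomson", "ece"}:  # stand-in rights table
                raise Rejected("not authorized for this diagnostic")

        def validate_grammar(req):
            if not isinstance(req.get("setting"), str):
                raise Rejected("malformed request")

        def validate_value(req):
            if not 0 <= req.get("value", -1) <= 100:             # stand-in range check
                raise Rejected("value out of range")

        GATES = [authenticate, authorize, validate_grammar, validate_value]

        def gatekeeper(req):
            """Only requests passing every gate enter the secure machine zone."""
            for gate in GATES:
                gate(req)
            return "forwarded to secure zone"

        print(gatekeeper({"user": "alice", "diagnostic": "ece",
                          "setting": "gain", "value": 42}))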

  9. Open Source Software Development with Your Mother Language : Intercultural Collaboration Experiment 2002

    DEFF Research Database (Denmark)

    Nomura, Saeko; Ishida, Saeko; Jensen, Mika Yasuoka

    2002-01-01

    “Open Source Software Development with Your Mother Language: Intercultural Collaboration Experiment 2002,” 10th International Conference on Human–Computer Interaction (HCII2003), June 2003, Crete, Greece.

  10. The CMS Data Quality Monitoring software experience and future improvements

    CERN Document Server

    De Guio, Federico

    2013-01-01

    The Data Quality Monitoring (DQM) Software proved to be a central tool in the CMS experiment. Its flexibility allowed its integration in several environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained Data Certification; Release Validation, to constantly validate the functionality and the performance of the reconstruction software; and in Monte Carlo productions. The central tool to deliver Data Quality information is a web site for browsing data quality histograms (DQM GUI). In this contribution the usage of the DQM Software in the different environments and its integration in the CMS Reconstruction Software Framework and in all production workflows are presented.

  11. Software Engineering Researchers' Attitudes on Case Studies and Experiments: an Exploratory Survey

    OpenAIRE

    Tofan, Dan; Galster, Matthias; Avgeriou, Paris; Weyns, Danny

    2011-01-01

    Background: Case studies and experiments are research methods frequently applied in empirical software engineering. Experiments are well-understood and their value as an empirical method is recognized. On the other hand, there seem to be different opinions on what constitutes a case study, and about the value of case studies as a thorough research method. Aim: We aim at exploring the attitudes of software engineering researchers on case studies and experiments. Furthermore, we investigate ho...

  12. Safe Software for Space Applications: Building on the DO-178 Experience

    Science.gov (United States)

    Dorsey, Cheryl A.; Dorsey, Timothy A.

    2013-09-01

    DO-178, Software Considerations in Airborne Systems and Equipment Certification, is the well-known international standard dealing with the assurance of software used in airborne systems [1,2]. Insights into the DO-178 experiences, strengths and weaknesses can benefit the international space community. As DO-178 is an excellent standard for safe software development when used appropriately, this paper provides lessons learned and suggestions for using it effectively.

  13. The CMS Data Quality Monitoring software experience and future improvements

    CERN Document Server

    Batinkov, Atanas Ivanov

    2013-01-01

    The Data Quality Monitoring Software proved to be a central tool in the Compact Muon Solenoid experiment. Its flexibility allowed its integration in several environments: online, for real-time detector monitoring; offline, for the final, fine-grained data certification. The usage of the Data Quality Monitoring software in the different environments and its integration in the Compact Muon Solenoid reconstruction software framework and in all production workflows are presented. The main technical challenges and the adopted solutions to them will also be discussed, with emphasis on functionality, long-term robustness and performance.

  14. Effective UI The Art of Building Great User Experience in Software

    CERN Document Server

    Anderson, Jonathan; Wilson, Robb

    2010-01-01

    People expect effortless, engaging interaction with desktop and web applications, but producing software that generates enjoyable user experiences is much harder than many companies anticipate. With Effective UI, you'll learn proven user-experience strategies that will satisfy your clients and customers, drive business value, and increase brand strength. This book shows you how to capture the collaborative and cooperative spirit among designers, engineers, and management required for building engaging software. You'll also learn valuable methods for maintaining focus throughout the process -

  15. Clinical experience of quantex coordinate software for CT guided stereotactic surgery

    International Nuclear Information System (INIS)

    Yabashi, Toshitake; Ichikawa, Hideo; Yasuda, Eisuke; Tsuruta, Hatsuo; Ishikawa, Yoshihisa; Kimura, Tokuji; Kanamori, Isao

    1991-01-01

    Recently, the Quantex Coordinate Software was developed for CT-guided stereotactic surgery. We have had the opportunity of using this software in 6 cases of intracerebral hematoma evacuation and 2 cases of brain tumor needle biopsy using CT-guided stereotactic surgery. The following are its features, together with a little clinical experience. One of the biggest features is that this software can simulate the best expected route of the puncture needle from burr hole to target point before insertion. Also, compared with the CT 9000 series software, it has many new functions for more advanced hardware as well as an advanced standard software. Two cases of intracerebral hematoma evacuation and 1 case of brain tumor biopsy were carried out using this software mainly as a simulation. In all cases, this software proved to be very useful. (author)

  16. Identifying criteria for multimodel software process improvement solutions : based on a review of current problems and initiatives

    NARCIS (Netherlands)

    Kelemen, Z.D.; Kusters, R.J.; Trienekens, J.J.M.

    2012-01-01

    In this article, we analyze current initiatives in multimodel software process improvement and identify criteria for multimodel solutions. By multimodel, we mean the simultaneous usage of more than one quality approach (e.g. standards, methods, techniques) to improve software processes. This paper

  17. Fourth Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE4)

    NARCIS (Netherlands)

    Katz, Daniel S; Niemeyer, Kyle E; Gesing, Sandra; Hwang, Lorraine; Bangerth, Wolfgang; Hettrick, Simon; Idaszak, Ray; Salac, Jean; Chue Hong, Neil; Núñez-Corrales, Santiago; Allen, Alice; Geiger, R Stuart; Miller, Jonah; Chen, Emily; Dubey, Anshu; Lago, Patricia

    2018-01-01

    This article summarizes motivations, organization, and activities of the Fourth Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE4). The WSSSPE series promotes sustainable research software by positively impacting principles and best practices, careers, learning, and

  18. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    International Nuclear Information System (INIS)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves; Kruzelecki, Karol

    2010-01-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments, such as ROOT, POOL and COOL, which are developed in house, and also a set of 'external' software packages (70) which are needed in addition, such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments can also profit from the set of consistent libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focus on a second set of tools provided for experiments outside the LHC to deploy a stable set of HEP-related software packages, both as binary distribution and from source.

  19. Servicing HEP experiments with a complete set of ready-integrated and configured common software components

    Energy Technology Data Exchange (ETDEWEB)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves [CERN, CH-1211 Geneva 23, PH Department, SFT Group (Switzerland); Kruzelecki, Karol, E-mail: stefan.roiser@cern.c, E-mail: ana.gaspar@cern.c, E-mail: yves.perrin@cern.c, E-mail: karol.kruzelecki@cern.c [CERN, CH-1211 Geneva 23, PH Department, LBC Group (Switzerland)

    2010-04-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments, such as ROOT, POOL and COOL, which are developed in house, and also a set of 'external' software packages (70) which are needed in addition, such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments can also profit from the set of consistent libraries provided and receive a stable and well-tested foundation on which to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages, and then focus on a second set of tools provided for experiments outside the LHC to deploy a stable set of HEP-related software packages, both as binary distribution and from source.

  20. Infusing Software Engineering Technology into Practice at NASA

    Science.gov (United States)

    Pressburger, Thomas; Feather, Martin S.; Hinchey, Michael; Markosia, Lawrence

    2006-01-01

    We present an ongoing effort of the NASA Software Engineering Initiative to encourage the use of advanced software engineering technology on NASA projects. Technology infusion is in general a difficult process, yet this effort seems to have found a modest approach that is successful for some types of technologies. We outline the process and describe the experience of the technology infusions that occurred over a two-year period. We also present some lessons from the experiences.

  1. Agile methods in biomedical software development: a multi-site experience report

    Directory of Open Access Journals (Sweden)

    Kuhlmman Karl F

    2006-05-01

    Full Text Available Abstract Background Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods.

  2. EXPERIENCES WITH IDEA PROMOTING INITIATIVES

    DEFF Research Database (Denmark)

    Gish, Liv

    2011-01-01

    In new product development a central activity is to provide new ideas. Over the last decades experiences with stimulating employee creativity and establishing idea promoting initiatives have been made in industrial practice. Such initiatives are often labeled Idea Management – a research field...... with a growing interest. In this paper I examine three different idea promoting initiatives carried out in Grundfos, a leading pump manufacturer. In the analysis I address what understandings of idea work are inscribed in the initiatives and what role these initiatives play in the organization with respect...... understandings of idea work are inscribed in the idea promoting initiatives as they to some degree have to fit with the understandings embedded in practice in order to work....

  3. CMS software deployment on OSG

    International Nuclear Information System (INIS)

    Kim, B; Avery, P; Thomas, M; Wuerthwein, F

    2008-01-01

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools, which mainly target deployment on the OSG, feature instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with a Grid Security Infrastructure login mechanism. We have performed over 500 installations and found the tools reliable and adaptable in coping with problems arising from changes in the Grid computing environment and in the software releases. We present the design of the tools, statistics gathered during their operation, and our experience with CMS software deployment on the OSG Grid computing environment.

  4. CMS software deployment on OSG

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B; Avery, P [University of Florida, Gainesville, FL 32611 (United States); Thomas, M [California Institute of Technology, Pasadena, CA 91125 (United States); Wuerthwein, F [University of California at San Diego, La Jolla, CA 92093 (United States)], E-mail: bockjoo@phys.ufl.edu, E-mail: thomas@hep.caltech.edu, E-mail: avery@phys.ufl.edu, E-mail: fkw@fnal.gov

    2008-07-15

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools, which mainly target deployment on the OSG, feature instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with a Grid Security Infrastructure login mechanism. We have performed over 500 installations and found the tools reliable and adaptable in coping with problems arising from changes in the Grid computing environment and in the software releases. We present the design of the tools, statistics gathered during their operation, and our experience with CMS software deployment on the OSG Grid computing environment.
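
    The "corrective resubmission" feature described in the two records above suggests a simple retry pattern. The following is a hedged Python sketch of that idea under stated assumptions: install_release is a hypothetical stand-in for the actual CMS installation job, not part of the published tools.

    ```python
    # Sketch of corrective resubmission: retry a failed installation job,
    # feeding diagnostics from the previous attempt into the next one.
    # install_release(site, release, hints) -> (ok, diagnostics) is hypothetical.
    import time

    def deploy_with_resubmission(install_release, site, release, max_attempts=3):
        hints = None
        for attempt in range(1, max_attempts + 1):
            ok, diagnostics = install_release(site, release, hints)
            if ok:
                return True
            hints = diagnostics          # corrective input for the resubmitted job
            time.sleep(60 * attempt)     # back off before resubmitting
        return False
    ```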

  5. Potential errors when fitting experience curves by means of spreadsheet software

    International Nuclear Information System (INIS)

    Sark, W.G.J.H.M. van; Alsema, E.A.

    2010-01-01

    Progress ratios (PRs) are widely used in forecasting development of many technologies; they are derived from historical data represented in experience curves. Fitting the double logarithmic graphs is easily done with spreadsheet software like Microsoft Excel, by adding a trend line to the graph. However, it is unknown to many that these data are transformed to linear data before a fit is performed. This leads to erroneous results or a transformation bias in the PR, as we demonstrate using the experience curve for photovoltaic technology: logarithmic transformation leads to overestimates of progress ratios and underestimates of goodness of fit. Therefore, other graphing and analysis software is recommended.
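
    The transformation bias described above can be reproduced in a few lines. Below is a minimal Python sketch with synthetic data, not the authors' photovoltaic dataset; how far the two progress-ratio estimates diverge depends on the noise structure of the data.

    ```python
    # Compare a spreadsheet-style trend line (OLS on log-transformed data)
    # with a direct nonlinear fit of the experience curve y = a * x**b.
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(0)
    x = np.logspace(0, 3, 40)                                       # cumulative production
    y = 100.0 * x**-0.32 * (1 + 0.1 * rng.standard_normal(x.size))  # noisy unit cost

    b_log, log_a = np.polyfit(np.log(x), np.log(y), 1)              # log-linear fit
    (a_nl, b_nl), _ = curve_fit(lambda x, a, b: a * x**b, x, y, p0=(100.0, -0.3))

    # Progress ratio PR = 2**b: the cost multiplier per doubling of production.
    print(f"log-transformed fit:  b = {b_log:.3f}, PR = {2**b_log:.3f}")
    print(f"direct nonlinear fit: b = {b_nl:.3f}, PR = {2**b_nl:.3f}")
    ```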

  6. Initial conditions of radiative shock experiments

    International Nuclear Information System (INIS)

    Kuranz, C. C.; Drake, R. P.; Krauland, C. M.; Marion, D. C.; Grosskopf, M. J.; Rutter, E.; Torralva, B.; Holloway, J. P.; Bingham, D.; Goh, J.; Boehly, T. R.; Sorce, A. T.

    2013-01-01

    We performed experiments at the Omega Laser Facility to characterize the initial, laser-driven state of a radiative shock experiment. These experiments aimed to measure the shock breakout time from a thin, laser-irradiated Be disk. The data are then used to inform a range of valid model parameters, such as electron flux limiter and polytropic γ, used when simulating radiative shock experiments using radiation hydrodynamics codes. The characterization experiment and the radiative shock experiment use a laser irradiance of ∼7 × 10¹⁴ W cm⁻² to launch a shock in the Be disk. A velocity interferometer and a streaked optical pyrometer were used to infer the amount of time for the shock to move through the Be disk. The experimental results were compared with simulation results from the Hyades code, which can be used to model the initial conditions of a radiative shock system using the CRASH code

  7. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  8. Planning the Unplanned Experiment: Towards Assessing the Efficacy of Standards for Safety-Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. M.

    2015-01-01

    Safe use of software in safety-critical applications requires well-founded means of determining whether software is fit for such use. While software in industries such as aviation has a good safety record, little is known about whether standards for software in safety-critical applications 'work' (or even what that means). It is often (implicitly) argued that software is fit for safety-critical use because it conforms to an appropriate standard. Without knowing whether a standard works, such reliance is an experiment; without carefully collecting assessment data, that experiment is unplanned. To help plan the experiment, we organized a workshop to develop practical ideas for assessing software safety standards. In this paper, we relate and elaborate on the workshop discussion, which revealed subtle but important study design considerations and practical barriers to collecting appropriate historical data and recruiting appropriate experimental subjects. We discuss assessing standards as written and as applied, several candidate definitions for what it means for a standard to 'work,' and key assessment strategies and study techniques and the pros and cons of each. Finally, we conclude with thoughts about the kinds of research that will be required and how academia, industry, and regulators might collaborate to overcome the noted barriers.

  9. Report on the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2)

    Science.gov (United States)

    Katz, Daniel S.; Choi, Sou-Cheng T.; Wilkins-Diehr, Nancy; Chue Hong, Neil; Venters, Colin C.; Howison, James; Seinstra, Frank; Jones, Matthew; Cranston, Karen; Clune, Thomas L.; de Val-Borro, Miguel; Littauer, Richard

    2016-02-01

    This technical report records and discusses the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2). The report includes a description of the alternative, experimental submission and review process, two workshop keynote presentations, a series of lightning talks, a discussion on sustainability, and five discussions from the topic areas of exploring sustainability; software development experiences; credit & incentives; reproducibility & reuse & sharing; and code testing & code review. For each topic, the report includes a list of tangible actions that were proposed and that would lead to potential change. The workshop recognized that reliance on scientific software is pervasive in all areas of world-leading research today. The workshop participants then proceeded to explore different perspectives on the concept of sustainability. Key enablers and barriers of sustainable scientific software were identified from their experiences. In addition, recommendations with new requirements such as software credit files and software prize frameworks were outlined for improving practices in sustainable software engineering. There was also broad consensus that formal training in software development or engineering was rare among the practitioners. Significant strides need to be made in building a sense of community via training in software and technical practices, on increasing their size and scope, and on better integrating them directly into graduate education programs. Finally, journals can define and publish policies to improve reproducibility, whereas reviewers can insist that authors provide sufficient information and access to data and software to allow them to reproduce the results in the paper. Hence a list of criteria is compiled for journals to provide to reviewers so as to make it easier to review software submitted for publication as a "Software Paper."

  10. Muon Event Filter Software for the ATLAS Experiment at LHC

    CERN Document Server

    Biglietti, M; Assamagan, Ketevi A; Baines, J T M; Bee, C P; Bellomo, M; Bogaerts, J A C; Boisvert, V; Bosman, M; Caron, B; Casado, M P; Cataldi, G; Cavalli, D; Cervetto, M; Comune, G; Conde, P; Conde-Muíño, P; De Santo, A; De Seixas, J M; Di Mattia, A; Dos Anjos, A; Dosil, M; Díaz-Gómez, M; Ellis, Nick; Emeliyanov, D; Epp, B; Falciano, S; Farilla, A; George, S; Ghete, V M; González, S; Grothe, M; Kabana, S; Khomich, A; Kilvington, G; Konstantinidis, N P; Kootz, A; Lowe, A; Luminari, L; Maeno, T; Masik, J; Meessen, C; Mello, A G; Merino, G; Moore, R; Morettini, P; Negri, A; Nikitin, N V; Nisati, A; Padilla, C; Panikashvili, N; Parodi, F; Pinfold, J L; Pinto, P; Primavera, M; Pérez-Réale, V; Qian, Z; Resconi, S; Rosati, S; Santamarina-Rios, C; Scannicchio, D A; Schiavi, C; Segura, E; Sivoklokov, S Yu; Soluk, R A; Stefanidis, E; Sushkov, S; Sutton, M; Sánchez, C; Tapprogge, Stefan; Thomas, E; Touchard, F; Venda-Pinto, B; Ventura, A; Vercesi, V; Werner, P; Wheeler, S; Wickens, F J; Wiedenmann, W; Wielers, M; Zobernig, G; Computing In High Energy Physics

    2005-01-01

    At LHC the 40 MHz bunch crossing rate dictates a high selectivity of the ATLAS Trigger system, which has to keep the full physics potential of the experiment in spite of a limited storage capability. The level-1 trigger, implemented in custom hardware, will reduce the initial rate to 75 kHz and is followed by the software-based level-2 trigger and Event Filter, usually referred to as High Level Triggers (HLT), which further reduce the rate to about 100 Hz. In this paper an overview of the implementation of the offline muon reconstruction algorithms MOORE (Muon Object Oriented REconstruction) and MuId (Muon Identification) as Event Filter in the ATLAS online framework is given. The MOORE algorithm performs the reconstruction inside the Muon Spectrometer providing a precise measurement of the muon track parameters outside the calorimeters; MuId combines the measurements of all ATLAS sub-detectors in order to identify muons and provides the best estimate of their momentum at the production vertex. In the HLT implementatio...

  11. A controlled experiment on the impact of software structure on maintainability

    Science.gov (United States)

    Rombach, Dieter H.

    1987-01-01

    The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.

  12. Incorrect results in software engineering experiments: How to improve research practices

    OpenAIRE

    Jørgensen, Magne; Dybå, Tore; Liestøl, Knut; Sjøberg, Dag

    2016-01-01

    Context The trustworthiness of research results is a growing concern in many empirical disciplines. Aim The goals of this paper are to assess how much the trustworthiness of results reported in software engineering experiments is affected by researcher and publication bias, given typical statistical power and significance levels, and to suggest improved research practices. Method First, we conducted a small-scale survey to document the presence of researcher and publication biases in software...

  13. Report on the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2)

    Directory of Open Access Journals (Sweden)

    Daniel S. Katz

    2016-02-01

    Full Text Available This technical report records and discusses the Second Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE2). The report includes a description of the alternative, experimental submission and review process, two workshop keynote presentations, a series of lightning talks, a discussion on sustainability, and five discussions from the topic areas of exploring sustainability; software development experiences; credit & incentives; reproducibility & reuse & sharing; and code testing & code review. For each topic, the report includes a list of tangible actions that were proposed and that would lead to potential change. The workshop recognized that reliance on scientific software is pervasive in all areas of world-leading research today. The workshop participants then proceeded to explore different perspectives on the concept of sustainability. Key enablers and barriers of sustainable scientific software were identified from their experiences. In addition, recommendations with new requirements such as software credit files and software prize frameworks were outlined for improving practices in sustainable software engineering. There was also broad consensus that formal training in software development or engineering was rare among the practitioners. Significant strides need to be made in building a sense of community via training in software and technical practices, on increasing their size and scope, and on better integrating them directly into graduate education programs. Finally, journals can define and publish policies to improve reproducibility, whereas reviewers can insist that authors provide sufficient information and access to data and software to allow them to reproduce the results in the paper. Hence a list of criteria is compiled for journals to provide to reviewers so as to make it easier to review software submitted for publication as a "Software Paper."

  14. Software reliability assessment

    International Nuclear Information System (INIS)

    Barnes, M.; Bradley, P.A.; Brewer, M.A.

    1994-01-01

    The increased usage and sophistication of computers applied to real time safety-related systems in the United Kingdom has spurred on the desire to provide a standard framework within which to assess dependable computing systems. Recent accidents and ensuing legislation have acted as a catalyst in this area. One particular aspect of dependable computing systems is that of software, which is usually designed to reduce risk at the system level, but which can increase risk if it is unreliable. Various organizations have recognized the problem of assessing the risk imposed to the system by unreliable software, and have taken initial steps to develop and use such assessment frameworks. This paper relates the approach of Consultancy Services of AEA Technology in developing a framework to assess the risk imposed by unreliable software. In addition, the paper discusses the experiences gained by Consultancy Services in applying the assessment framework to commercial and research projects. The framework is applicable to software used in safety applications, including proprietary software. Although the paper is written with Nuclear Reactor Safety applications in mind, the principles discussed can be applied to safety applications in all industries

  15. Summary of the First Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE1)

    Directory of Open Access Journals (Sweden)

    Daniel S Katz

    2014-07-01

    Full Text Available Challenges related to development, deployment, and maintenance of reusable software for science are becoming a growing concern. Many scientists’ research increasingly depends on the quality and availability of software upon which their works are built. To highlight some of these issues and share experiences, the First Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE1) was held in November 2013 in conjunction with the SC13 Conference. The workshop featured keynote presentations and a large number (54) of solicited extended abstracts that were grouped into three themes and presented via panels. A set of collaborative notes of the presentations and discussion was taken during the workshop. Unique perspectives were captured about issues such as comprehensive documentation, development and deployment practices, software licenses and career paths for developers. Attribution systems that account for evidence of software contribution and impact were also discussed. These include mechanisms such as Digital Object Identifiers, publication of “software papers”, and the use of online systems, for example source code repositories like GitHub. This paper summarizes the issues and shared experiences that were discussed, including cross-cutting issues and use cases. It joins a nascent literature seeking to understand what drives software work in science, and how it is impacted by the reward systems of science. These incentives can determine the extent to which developers are motivated to build software for the long-term, for the use of others, and whether to work collaboratively or separately. It also explores community building, leadership, and dynamics in relation to successful scientific software.

  16. Quality of Design, Analysis and Reporting of Software Engineering Experiments:A Systematic Review

    OpenAIRE

    By Kampenes, Vigdis

    2007-01-01

    Background: Like any research discipline, software engineering research must be of a certain quality to be valuable. High quality research in software engineering ensures that knowledge is accumulated and helpful advice is given to the industry. One way of assessing research quality is to conduct systematic reviews of the published research literature. Objective: The purpose of this work was to assess the quality of published experiments in software engineering with respect to the validit...

  17. Experience Supporting the Integration of LHC Experiments Software Framework with the LCG Middleware

    CERN Document Server

    Santinelli, Roberto

    2006-01-01

    The LHC experiments are currently preparing for data acquisition in 2007 and because of the large amount of required computing and storage resources, they decided to embrace the grid paradigm. The LHC Computing Project (LCG) provides and operates a computing infrastructure suitable for data handling, Monte Carlo production and analysis. While LCG offers a set of high level services, intended to be generic enough to accommodate the needs of different Virtual Organizations, the LHC experiments software framework and applications are very specific and focused on the computing and data models. The LCG Experiment Integration Support team works in close contact with the experiments, the middleware developers and the LCG certification and operations teams to integrate the underlying grid middleware with the experiment specific components. The strategic position between the experiments and the middleware suppliers allows the EIS team to play a key role at communications level between the customers and the service provi...

  18. DAQ Software Contributions, Absolute Scale Energy Calibration and Background Evaluation for the NOvA Experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Flumerfelt, Eric Lewis [Univ. of Tennessee, Knoxville, TN (United States)

    2015-08-01

    The NOvA (NuMI Off-axis ν_e Appearance) Experiment is a long-baseline accelerator neutrino experiment currently in its second year of operations. NOvA uses the Neutrinos from the Main Injector (NuMI) beam at Fermilab, and there are two main off-axis detectors: a Near Detector at Fermilab and a Far Detector 810 km away at Ash River, MN. The work reported herein is in support of the NOvA Experiment, through contributions to the development of data acquisition software, providing an accurate, absolute-scale energy calibration for electromagnetic showers in NOvA detector elements, crucial to the primary electron neutrino search, and through an initial evaluation of the cosmic background rate in the NOvA Far Detector, which is situated on the surface without significant overburden. Additional support work for the NOvA Experiment is also detailed, including DAQ Server Administration duties and a study of NOvA’s sensitivity to neutrino oscillations into a “sterile” state.

  19. Solar Constant (SOLCON) Experiment: Ground Support Equipment (GSE) software development

    Science.gov (United States)

    Gibson, M. Alan; Thomas, Susan; Wilson, Robert

    1991-01-01

    The Solar Constant (SOLCON) Experiment, the objective of which is to determine the solar constant value and its variability, is scheduled for launch as part of the Space Shuttle/Atmospheric Laboratory for Application and Science (ATLAS) spacelab mission. The Ground Support Equipment (GSE) software was developed to monitor and analyze the SOLCON telemetry data during flight and to test the instrument on the ground. The design and development of the GSE software are discussed. The SOLCON instrument was tested during Davos International Solar Intercomparison, 1989 and the SOLCON data collected during the tests are analyzed to study the behavior of the instrument.

  20. Software Engineering Researchers' Attitudes on Case Studies and Experiments : an Exploratory Survey

    NARCIS (Netherlands)

    Tofan, Dan; Galster, Matthias; Avgeriou, Paris; Weyns, Danny

    2011-01-01

    Background: Case studies and experiments are research methods frequently applied in empirical software engineering. Experiments are well-­understood and their value as an empirical method is recognized. On the other hand, there seem to be different opinions on what constitutes a case study, and

  1. Supporting Interdisciplinary Collaboration Through Reusable Free Software. A Research Student Experience

    Science.gov (United States)

    Dimech, C.

    2013-12-01

    In this contribution, I present a critical evaluation of my experience as a research student conducting an interdisciplinary project that bridges the world of geoscience with that of astronomy. The major challenge consists in studying and modifying existing geophysical software to work with synthetic solar data not obtained by direct measurement but useful for testing and evaluation, and data released from the satellite HINODE and the Solar Dynamics Observatory. I have been fortunate to collaborate closely with multiple geoscientists keen to share their software codes and help me understand their implementations so I can extend the methodology to solve problems in solar physics. Moreover, two additional experiences have helped me develop my research and collaborative skills. First was an opportunity to involve an undergraduate student, and secondly, my participation at the GNU Hackers Meeting in Paris. Three aspects that need particular attention to enhance the collective productivity of any group of individuals keen to extend existing codes to achieve further interdisciplinary goals have been identified. (1) The production of easily reusable code that users can study and modify even when large sets of computations are involved. (2) The transformation of solutions into tools that are 100% free software. (3) The harmonisation of collaborative interactions that effectively tackle the two aforementioned tasks. Each one will be discussed in detail during this session based on my experience as a research student.

  2. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  3. Experiment to evaluate software safety

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-01-01

    The process of licensing nuclear power plants for operation consists of mandatory steps featuring detailed examination of the instrumentation and control system by the safety authorities, including its software. The criticality of this software obliges the manufacturer to develop it in accordance with the IEC 880 standard, 'Computer software in nuclear power plant safety systems', issued by the International Electrotechnical Commission. The evaluation approach, a two-stage assessment, is described in detail. In this context, the IPSN (Institute of Protection and Nuclear Safety), the technical support body of the safety authority, uses the MALPAS tool to analyse the quality of the programs. (R.P.). 4 refs

  4. Experience with highly-parallel software for the storage system of the ATLAS Experiment at CERN

    CERN Document Server

    Colombo, T; The ATLAS collaboration

    2012-01-01

    The ATLAS experiment is observing proton-proton collisions delivered by the LHC accelerator. The ATLAS Trigger and Data Acquisition (TDAQ) system selects interesting events on-line in a three-level trigger system in order to store them at a budgeted rate of several hundred Hz. This paper focuses on the TDAQ data-logging system and in particular on the implementation and performance of a novel parallel software design. In this respect, the main challenge presented by the data-logging workload is the conflict between the largely parallel nature of the event processing, especially the recently introduced event compression, and the constraint of sequential file writing and checksum evaluation. This is further complicated by the necessity of operating in a fully data-driven mode, to cope with continuously evolving trigger and detector configurations. In this paper we report on the design of the new ATLAS on-line storage software. In particular we will discuss our development experience using recent concurrency-ori...

  5. Real-time software for the fusion experiment WENDELSTEIN 7-X

    International Nuclear Information System (INIS)

    Laqua, Heike; Niedermeyer, Helmut; Schacht, Joerg; Spring, Anett

    2006-01-01

    The superconducting stellarator WENDELSTEIN 7-X will be capable of steady state operation as well as of pulsed operation. All discharge scenarios compatible with these capabilities will be supported by the control system. Each technical component and each diagnostic system will have its own control system, based on a real-time computer with the dedicated software described here, permitting autonomous operation for commissioning and testing and coordinated operation during experimental sessions. The system behaviour, as far as it is relevant for the experiment, such as parameters and algorithms, will be exclusively controlled by complex software objects. By changing references to these objects synchronously in all computers, the whole system behaviour can be changed from one cycle to the next. All data required for the construction of the software objects will be stored in one central database and constructed in the control computers well before they are required

  6. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adapting standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC-programming. Experiences gained from process control are planned to be investigated for discrete parts manufacturing.

  7. Software complex for developing dynamically packed program system for experiment automation

    International Nuclear Information System (INIS)

    Baluka, G.; Salamatin, I.M.

    1985-01-01

    A software complex for developing dynamically packed program systems for experiment automation is considered. The complex includes general-purpose programming systems, represented by the standard RT-11 operating system, and specially developed problem-oriented modules providing execution of specific jobs. The described complex is implemented in the PASCAL and MAKRO-2 languages and is flexible enough to accommodate variations in the technique of the experiment

  8. Enhancing the Student Learning Experience in Software Engineering Project Courses

    Science.gov (United States)

    Marques, Maira; Ochoa, Sergio F.; Bastarrica, Maria Cecilia; Gutierrez, Francisco J.

    2018-01-01

    Carrying out real-world software projects in their academic studies helps students to understand what they will face in industry, and to experience first-hand the challenges involved when working collaboratively. Most of the instructional strategies used to help students take advantage of these activities focus on supporting agile programming,…

  9. SOFTWARE PROCESS IMPROVEMENT: AWARENESS, USE, AND BENEFITS IN CANADIAN SOFTWARE DEVELOPMENT FIRMS

    OpenAIRE

    CHEVERS, DELROY

    2017-01-01

    ABSTRACT Since 1982, the software development community has been concerned with the delivery of quality systems. Software process improvement (SPI) is an initiative to avoid the delivery of low quality systems. However, the awareness and adoption of SPI is low. Thus, this study examines the rate of awareness, use, and benefits of SPI initiatives in Canadian software development firms. Using SPSS as the analytical tool, this study found that 59% of Canadian software development firms are aware...

  10. Return to Experience and Initial Wage Level

    DEFF Research Database (Denmark)

    Sørensen, Kenneth Lykke; Vejlin, Rune Majlund

    This paper estimates the relationship between initial wage and return to experience. We use a Mincer-like wage model to nonparametrically estimate this relationship allowing for an unobservable individual permanent effect in wages and unobservable individual return to experience. The relationship...

  11. Comparison of SOLA-FLX calculations with experiments at systems, science and software

    International Nuclear Information System (INIS)

    Dienes, J.K.; Hirt, C.W.; Stein, L.R.

    1977-03-01

    Preliminary results of a comparison between hydroelastic calculations at the Los Alamos Scientific Laboratory and experiments at Systems, Science and Software are described. The axisymmetric geometry is an idealization of a pressurized water reactor at a scale of 1/25. Reasons for some of the discrepancies are described, and suggestions for improving both experiments and calculations are discussed

  12. Return to experience and initial wage level

    DEFF Research Database (Denmark)

    Sørensen, K.L.; Vejlin, R.

    2014-01-01

    This paper estimates the relationship between initial wage and return to experience. We use a Mincer-like wage model to non-parametrically estimate this relationship allowing for an unobservable individual permanent effect in wages and unobservable individual return to experience. The relationshi...

  13. Potential errors when fitting experience curves by means of spreadsheet software

    NARCIS (Netherlands)

    van Sark, W.G.J.H.M.|info:eu-repo/dai/nl/074628526; Alsema, E.A.|info:eu-repo/dai/nl/073416258

    2010-01-01

    Progress ratios (PRs) are widely used in forecasting development of many technologies; they are derived from historical data represented in experience curves. Fitting the double logarithmic graphs is easily done with spreadsheet software like Microsoft Excel, by adding a trend line to the graph.

  14. SOA-Driven Business-Software Alignment

    NARCIS (Netherlands)

    Shishkov, Boris; van Sinderen, Marten J.; Quartel, Dick; Tsai, W.; Chung, J.; Younas, M.

    2006-01-01

    The alignment of business processes and their supporting application software is a major concern during the initial software design phases. This paper proposes a design approach addressing this problem of business-software alignment. The approach takes an initial business model as a basis in

  15. METHOD FOR SECURITY SPECIFICATION SOFTWARE REQUIREMENTS AS A MEANS FOR IMPLEMENTING A SOFTWARE DEVELOPMENT PROCESS SECURE - MERSEC

    Directory of Open Access Journals (Sweden)

    Castro Mecías, L.T.

    2015-06-01

    Full Text Available Security incidents that target or use software as a means of attack often cause serious damage with legal, economic and other consequences. Results of a survey by Kaspersky Lab indicate that vulnerabilities in software are the main cause of security incidents in enterprises: the report shows that 85% of them have reported security incidents, with software vulnerabilities as the main reason, and it is further estimated that incidents can cause significant losses, from $50,000 to $649,000. (1) In this regard, academic and industry research focuses on proposals for reducing vulnerabilities and technology failures, with a positive influence on how software is developed. A development process with improved security practices should include activities from the initial phases of the software lifecycle, so that security needs are identified, risks are managed and appropriate measures are implemented. This article discusses a method for the analysis, elicitation and specification of software security requirements, built on the basis of various existing proposals and of deficiencies identified through participant observation in software development teams. Experiments performed using the proposed method yield positive results regarding the reduction of security vulnerabilities and compliance with the security objectives of the software.

  16. Initial state radiation experiment at MAMI

    Energy Technology Data Exchange (ETDEWEB)

    Mihovilovič, M.; Merkel, H. [Institut für Kernphysik, Johannes Gutenberg-Universität Mainz, Johann-Joachim-Becher-Weg 45, 55128 Mainz (Germany); Collaboration: A1-Collaboration

    2013-11-07

    In an attempt to contribute further insight into the discrepancy between the Lamb shift and elastic scattering determinations of the proton charge radius, a new experiment at MAMI is underway, aimed at measuring proton form-factors at very low momentum transfers by using a new technique based on initial state radiation. This paper reports on first findings of the pilot measurement performed in 2010, whose main goal was to check the feasibility of the proposed experiment and to recognize and overcome potential obstacles before running the full experiment in 2013.

  17. Software-Based Wireless Power Transfer Platform for Various Power Control Experiments

    Directory of Open Access Journals (Sweden)

    Sun-Han Hwang

    2015-07-01

    Full Text Available In this paper, we present the design and evaluation of a software-based wireless power transfer platform that enables the development of a prototype involving various open- and closed-loop power control functions. Our platform is based on a loosely coupled planar wireless power transfer circuit that uses a class-E power amplifier. In conjunction with this circuit, we implement flexible control functions using a National Instruments Data Acquisition (NI DAQ) board and algorithms in MATLAB/Simulink. To verify the effectiveness of our platform, we conduct two types of power-control experiments: no-load and metal detection using open-loop power control, and output voltage regulation for different receiver positions using closed-loop power control. The use of the MATLAB/Simulink software as part of the planar wireless power transfer platform for power control experiments is shown to serve as a useful and inexpensive alternative to conventional hardware-based platforms.
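
    The closed-loop experiment described above amounts to feedback regulation of the receiver voltage. The sketch below illustrates the control idea in Python under stated assumptions; the real platform uses an NI DAQ board with MATLAB/Simulink, and read_output_voltage/set_drive_level, like the gain values, are hypothetical stand-ins.

    ```python
    # Positional PI loop: adjust the transmitter drive so the receiver output
    # voltage tracks V_REF even as the receiver position (coupling) changes.
    import time

    KP, KI = 0.05, 0.8   # assumed PI gains; real values would be tuned on hardware
    V_REF = 5.0          # desired receiver output voltage [V]
    DT = 0.01            # control period [s]

    def regulate(read_output_voltage, set_drive_level, steps=1000):
        integral = 0.0
        for _ in range(steps):
            error = V_REF - read_output_voltage()
            integral += error * DT
            drive = KP * error + KI * integral
            set_drive_level(min(max(drive, 0.0), 1.0))  # clamp to valid drive range
            time.sleep(DT)
    ```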

  18. Views of Pre-Service Teachers Following Teaching Experience on Use of Dynamic Geometry Software

    Science.gov (United States)

    Günes, Kardelen; Tapan-Broutin, Menekse Seden

    2017-01-01

    The study aims to determine the views of final-year pre-service mathematics teachers towards their experience of the use of dynamic geometry software in teaching, following the implementation processes that they carried out when using this software in a real classroom environment. The study was designed as a case study, which is one of the…

  19. Experiences from the formal specification of the integration platform and the synthesis of SDT with the software bus

    International Nuclear Information System (INIS)

    Thunem, Harald; Mohn, Peter; Sandmark, Haakon; Stoelen, Ketil

    1999-04-01

    The three-year programme 1997-1999 for the OECD Halden Reactor Project (HRP) identifies the need to gain experience from applying formal techniques in real-life system developments. This motivated the initiation of the HRP research activity Integration of Formal Specification in the Development of HAMMLAB 2000 (INT-FS). The principal objective was to experiment with formal techniques in system developments at the HRP, in particular system developments connected to HAMMLAB 2000, the computerised laboratory for man-machine-interaction experiments currently under construction. It was hoped that this experimentation with formal techniques would result in a better understanding of how such techniques should be utilised in a more industrial setting. Another objective was to obtain more knowledge of the practical effects and consequences of an increased level of formalization. This report summarises experiences, results and conclusions from a pre-study addressing INT-FS related issues connected to the development of the HAMMLAB 2000 Integration Platform (IP). The report starts by giving a brief overview of the IP. It then describes and summarises experiences from the formalization of a top-level requirements specification for the IP. Finally, it discusses various approaches to the integration of applications generated automatically through the CASE-tool SDT with the Software Bus on which the communication within HAMMLAB 2000 will be based. The report concludes that the selected formalisms and tools are well suited to describing IP-like systems, that the integration of SDT applications with the Software Bus will not be a major obstacle, and finally that a monitoring component for the IP is well suited for development within INT-FS (author) (ml)

  20. Guidelines for using empirical studies in software engineering education

    Directory of Open Access Journals (Sweden)

    Fabian Fagerholm

    2017-09-01

    Full Text Available Software engineering education is under constant pressure to provide students with industry-relevant knowledge and skills. Educators must address issues beyond exercises and theories that can be directly rehearsed in small settings. Industry training has similar requirements of relevance as companies seek to keep their workforce up to date with technological advances. Real-life software development often deals with large, software-intensive systems and is influenced by the complex effects of teamwork and distributed software development, which are hard to demonstrate in an educational environment. A way to experience such effects and to increase the relevance of software engineering education is to apply empirical studies in teaching. In this paper, we show how different types of empirical studies can be used for educational purposes in software engineering. We give examples illustrating how to utilize empirical studies, discuss challenges, and derive an initial guideline that supports teachers in including empirical studies in software engineering courses. Furthermore, we give examples that show how empirical studies contribute to high-quality learning outcomes, to student motivation, and to the awareness of the advantages of applying software engineering principles. Having awareness, experience, and understanding of the actions required, students are more likely to apply such principles under real-life constraints in their working life.

  1. Implementation of Software Configuration Management Process by Models: Practical Experiments and Learned Lessons

    Directory of Open Access Journals (Sweden)

    Bartusevics Arturs

    2014-12-01

    Full Text Available Nowadays, the software configuration management process is not only a dilemma of which system should be used for version control or how to merge changes from one source code branch to another. There are multiple tasks, such as version control, build management, deployment management, status accounting, bug tracking and many others, that should be solved to support the full configuration management process according to the most popular quality standards. The main scope of the process is to include only valid and tested software items in the final version of the product and to prepare a new version as soon as possible. To implement the different tasks of the software configuration management process, a set of different tools, scripts and utilities should be used. The current paper provides a new model-based approach to the implementation of configuration management. Using different models, the new approach helps to organize existing solutions and develop new ones in a parameterized way, thus increasing the reuse of solutions. The study provides a general description of the new model-based conception and definitions of all models needed to implement the new approach. The second part of the paper contains an overview of criteria, practical experiments and lessons learned from using the new models in software configuration management. Finally, further work is defined based on the results of the practical experiments and the lessons learned.
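
    To make the model-based idea concrete, here is a hedged Python sketch under stated assumptions: each configuration-management task is captured as a small declarative model and mapped onto a concrete tool command, so the same generic runner can be reused across projects by changing parameters only. TaskModel and the commands shown are illustrative, not the author's actual models.

    ```python
    # Parameterized task models for configuration management: the generic runner
    # stays fixed, while project specifics live entirely in the model parameters.
    from dataclasses import dataclass
    import subprocess

    @dataclass
    class TaskModel:
        name: str      # e.g. "build", "deploy"
        command: str   # tool command template with placeholders
        params: dict   # project-specific parameters

        def run(self):
            # Instantiate the template for this project and execute it.
            subprocess.run(self.command.format(**self.params), shell=True, check=True)

    # The same models serve another project by swapping the parameters only.
    build = TaskModel("build", "mvn -f {pom} package", {"pom": "app/pom.xml"})
    deploy = TaskModel("deploy", "scp {artifact} {host}:{path}",
                       {"artifact": "app/target/app.jar", "host": "stage", "path": "/opt"})
    for task in (build, deploy):
        task.run()
    ```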

  2. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  3. Quality assurance for CORAL and COOL within the LCG software stack for the LHC experiments

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    CORAL and COOL are software packages used by the LHC experiments for managing different categories of physics data using a variety of relational database technologies. The core components are written in C++, but Python bindings are also provided. CORAL is a generic relational access layer, while COOL includes the implementation of a specific relational data model and optimization of SQL queries for "conditions data". The software is the result of more than 10 years of development in collaboration between the IT department and the LHC experiments. The packages are built and released within the LCG software stack, for which automatic nightly builds and release installations are provided by PH-SFT (cmake, jenkins, cdash) for many different platforms, compilers and software version configurations. Test-driven development and functional tests of both C++ and Python components (CppUnit, unittest) have been key elements in the success of the projects. Dedicated test suites have also been prepared to commission and ma...
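
    The record above mentions unittest-driven functional tests of the Python components. As a generic illustration only (not CORAL/COOL's actual test code), a conditions-data lookup with interval-of-validity semantics might be exercised like this; ConditionsStore is a hypothetical stand-in.

    ```python
    # unittest-style functional test of an interval-of-validity lookup.
    import unittest

    class ConditionsStore:
        def __init__(self):
            self._payloads = []                    # (since, until, payload) triples
        def store(self, since, until, payload):
            self._payloads.append((since, until, payload))
        def retrieve(self, t):
            for since, until, payload in self._payloads:
                if since <= t < until:             # 'until' is exclusive
                    return payload
            raise KeyError(t)

    class TestConditionsStore(unittest.TestCase):
        def test_retrieve_within_interval(self):
            store = ConditionsStore()
            store.store(0, 100, "calib-v1")
            self.assertEqual(store.retrieve(42), "calib-v1")
        def test_retrieve_at_upper_bound_raises(self):
            store = ConditionsStore()
            store.store(0, 100, "calib-v1")
            with self.assertRaises(KeyError):
                store.retrieve(100)

    if __name__ == "__main__":
        unittest.main()
    ```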

  4. Offline Software for the Mu2e Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kutschke, Robert K. [Fermilab

    2012-01-01

    The Mu2e experiment at Fermilab is in the midst of its R&D and approval processes. To aid and inform this process, a small team has developed an end-to-end Geant4-based simulation package and has developed reconstruction code that is already at the stage of an advanced prototype. Having these tools available at an early stage allows design options and tradeoffs to be studied using high level physics quantities. A key to the success of this effort has been, as much as possible, to acquire software and customize it, rather than to build it in-house.

  5. Experiment on safety software evaluation

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-06-01

    The licensing process for nuclear plants includes compulsory steps involving a thorough examination of the command and control system. In this context the IPSN uses a tool called MALPAS to carry out an analysis of the quality of the software involved in safety control. The IPSN is also seeking to automate the generation of the test sets necessary for dynamic analysis. The MALPAS tool highlights the particularities of programming which can influence the testability and maintainability of the studied software. (TEC). 4 refs

  6. GIMS-Software for asset market experiments.

    Science.gov (United States)

    Palan, Stefan

    2015-03-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality.

  7. GIMS—Software for asset market experiments

    Science.gov (United States)

    Palan, Stefan

    2015-01-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality. PMID:26525085

  8. ACTS: from ATLAS software towards a common track reconstruction software

    Science.gov (United States)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  9. A Cooperative Coevolution Approach to Automate Pattern-based Software Architectural Synthesis

    NARCIS (Netherlands)

    Xu, Y.R.; Liang, P.

    2014-01-01

    To reuse successful experience in software architecture design, architects use architectural patterns as reusable architectural knowledge for architectural synthesis. However, it has been observed that the resulting architecture does not always conform to the initial architectural patterns employed.

  10. Automatically generated acceptance test: A software reliability experiment

    Science.gov (United States)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
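
    A minimal sketch of the idea, assuming that "empirical data about the behavior of internal states" means recording the value ranges that internal variables take on reference executions (hypothetical code, not the experiment's):

      # Hypothetical sketch: learn an envelope of internal-state values from
      # reference runs, then flag executions whose states leave the envelope.
      def learn_envelope(reference_runs):
          """reference_runs: list of dicts mapping state variable -> value."""
          envelope = {}
          for snapshot in reference_runs:
              for name, value in snapshot.items():
                  lo, hi = envelope.get(name, (value, value))
                  envelope[name] = (min(lo, value), max(hi, value))
          return envelope

      def acceptance_test(snapshot, envelope):
          """Return the state variables that fall outside the envelope."""
          return [name for name, value in snapshot.items()
                  if name in envelope
                  and not (envelope[name][0] <= value <= envelope[name][1])]

      envelope = learn_envelope([{"speed": 0.2, "angle": 1.1},
                                 {"speed": 0.4, "angle": 0.9}])
      print(acceptance_test({"speed": 5.0, "angle": 1.0}, envelope))  # ['speed']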

  11. The ATLAS online High Level Trigger framework: Experience reusing offline software components in the ATLAS trigger

    International Nuclear Information System (INIS)

    Wiedenmann, Werner

    2010-01-01

    Event selection in the ATLAS High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The ATLAS High Level Trigger (HLT) framework, based on the GAUDI and ATLAS ATHENA frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of ATLAS, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments, and especially on large multi-node trigger farms, has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking periods with cosmic events and in a short period with proton beams from the LHC. The contribution discusses the architectural aspects of the HLT framework, its performance and its software environment within the ATLAS computing, trigger and data flow projects. Emphasis is also put on the architectural implications for the software of the use of multi-core processors in the computing farms, and on the experience gained with multi-threading and multi-process technologies.

  12. Operating Experience of Digital, Software-based Components Used in I and C and Electrical Systems in German NPPs

    International Nuclear Information System (INIS)

    Blum, Stefanie; Lochthofen, Andre; Quester, Claudia; Arians, Robert

    2015-01-01

    In recent years, many components in the instrumentation and control (I and C) and electrical systems of nuclear power plants (NPPs) have been replaced by digital, software-based components. Due to their more complex structure, software-based I and C and electrical components show the potential for new failure mechanisms and an increased number of failure possibilities, including the potential for common cause failures. An evaluation of the operating experience of digital, software-based components may help to determine new failure modes of these components. In this paper, we give an overview of the results of the evaluation of the operating experience of digital, software-based components used in I and C and electrical systems in German NPPs. (authors)

  13. Introduction to the KWALON Experiment: Discussions on Qualitative Data Analysis Software by Developers and Users

    Directory of Open Access Journals (Sweden)

    Jeanine C. Evers

    2010-11-01

    In this introduction to the KWALON Experiment and related conference, we describe the motivations of the collaborating European networks in organising this joint endeavour. The KWALON Experiment consisted of five developers of Qualitative Data Analysis (QDA) software analysing a dataset regarding the financial crisis in the time period 2008-2009, provided by the conference organisers. Besides this experiment, researchers were invited to present their reflective papers on the use of QDA software. This introduction gives a description of the experiment, the "rules", research questions and reflective points, as well as a full description of the dataset and search rules used, and our reflection on the lessons learned. The related conference is described, as are the papers which are included in this FQS issue. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1101405

  14. Cyclic Voltammetry Simulations with DigiSim Software: An Upper-Level Undergraduate Experiment

    Science.gov (United States)

    Messersmith, Stephania J.

    2014-01-01

    An upper-division undergraduate chemistry experiment is described which utilizes DigiSim software to simulate cyclic voltammetry (CV). Four mechanisms were studied: a reversible electron transfer with no subsequent or preceding chemical reactions, a reversible electron transfer followed by a reversible chemical reaction, a reversible chemical…

  15. A Monte Carlo software for the 1-dimensional simulation of IBIC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Forneris, J., E-mail: jacopo.forneris@unito.it [Physics Department, NIS Centre and CNISM, University of Torino, INFN-sez. Torino, Via P. Giuria 1, 10125 Torino (Italy); Jakšić, M. [Ruđer Bošković Institute, Bijenička cesta 54, P.O. Box 180, 10002 Zagreb (Croatia); Pastuović, Ž. [Australian Nuclear Science and Technology Organization, Locked Bag 2001, Kirrawee DC, NSW 2234 (Australia); Vittone, E. [Physics Department, NIS Centre and CNISM, University of Torino, INFN-sez. Torino, Via P. Giuria 1, 10125 Torino (Italy)

    2014-08-01

    Ion beam induced charge (IBIC) microscopy is a valuable tool for the analysis of the electronic properties of semiconductors. In this work, a recently developed Monte Carlo approach for the simulation of IBIC experiments is presented, along with a self-standing software package equipped with a graphical user interface. The method is based on the probabilistic interpretation of the excess charge carrier continuity equations, and it offers the end-user full control not only of the physical properties governing the induced charge formation mechanism (i.e., mobility, lifetime, electrostatics, device geometry), but also of the relevant experimental conditions (ionization profiles, beam dispersion, electronic noise) affecting the measurement of the IBIC pulses. Moreover, the software implements a novel model for the quantitative evaluation of radiation damage effects on the charge collection efficiency degradation of ion-beam-irradiated devices. The reliability of the model implementation is then validated against a benchmark IBIC experiment.
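
    As a rough illustration of such a Monte Carlo approach (a hypothetical sketch, not the published package), a 1-D planar device can be simulated by drifting each carrier in small time steps, accumulating induced charge through the Ramo weighting field (1/d for a planar geometry) and trapping carriers with probability dt/tau per step:

      # Hypothetical 1-D sketch: electrons generated at depth x0 drift toward
      # the collecting electrode; each step induces charge dQ/Q0 = dx/d and
      # the carrier may be trapped with probability dt/tau.
      import random

      def cce_monte_carlo(x0, d=1e-2, mu=1350.0, tau=1e-6, e_field=1e3,
                          dt=1e-10, n_carriers=5000):
          """Charge collection efficiency for electrons starting at x0 [cm]."""
          v = mu * e_field            # drift velocity [cm/s], assumed values
          total = 0.0
          for _ in range(n_carriers):
              x, q = x0, 0.0
              while x < d:
                  if random.random() < dt / tau:   # trapping event
                      break
                  step = min(v * dt, d - x)
                  q += step / d                    # Ramo: planar weighting field
                  x += step
              total += q
          return total / n_carriers

      print(cce_monte_carlo(x0=0.0))  # near 1 when mu*tau*E >> d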

  16. Classroom Live: a software-assisted gamification tool

    Science.gov (United States)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an aesthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  17. Software methodologies for the SSC

    International Nuclear Information System (INIS)

    Loken, S.C.

    1990-01-01

    This report describes some of the considerations that will determine how software is developed for the SSC. The author begins with a review of the general computing problem for SSC experiments and of recent experiences in software engineering for the present generation of experiments. This leads to a discussion of the software technologies that will be critical for the SSC experiments. He describes the emerging software standards and commercial products that may be useful in addressing SSC needs, and concludes with some comments on how collaborations and the SSC Lab should approach the software development issue

  18. A Half-Day Workshop on ``Smarter Investment by Aligning SPI Initiatives, Capabilities and Stakeholder Values''

    Science.gov (United States)

    Selioukova, Yana; Frühwirth, Christian

    Software companies that want to improve their software process capabilities (SPCs) need a systematic method for making informed investment decisions on software process improvement (SPI) initiatives. Such decisions should aim at creating maximum stakeholder value. To address this problem, we present a method with tool support that may help companies align stakeholder values with SPCs and SPI initiatives. The proposed method has been developed based on the well-established “Quality Function Deployment” (QFD) approach. Experience with the proposed method suggests that it particularly helps to reduce the risk of misalignment by identifying those SPI initiatives that are most beneficial to stakeholders. The tool support provided with the method has also generated positive experiences, increasing the usability of the method and helping companies in the elicitation and prioritization of stakeholder values. We therefore propose a workshop for working out the method, named “Smarter Investment by Aligning SPI Initiatives, Capabilities and Stakeholder Values”, in a hypothetical case company.

  19. Evolving impact of Ada on a production software environment

    Science.gov (United States)

    Mcgarry, F.; Esker, L.; Quimby, K.

    1988-01-01

    Many aspects of software development with Ada have evolved as our Ada development environment has matured and personnel have become more experienced in the use of Ada. The Software Engineering Laboratory (SEL) has seen differences in the areas of cost, reliability, reuse, size, and use of Ada features. A first Ada project can be expected to cost about 30 percent more than an equivalent FORTRAN project. However, the SEL has observed significant improvements over time as a development environment progresses to second and third uses of Ada. The reliability of Ada projects is initially similar to what is expected in a mature FORTRAN environment. However, with time, one can expect to gain improvements as experience with the language increases. Reuse is one of the most promising aspects of Ada. The proportion of reusable Ada software on our Ada projects exceeds the proportion of reusable FORTRAN software on our FORTRAN projects. This result was noted fairly early in our Ada projects, and experience shows an increasing trend over time.

  20. Ontario Hydro experience in the identification and mitigation of potential failures in safety critical software systems

    International Nuclear Information System (INIS)

    Huget, R.G.; Viola, M.; Froebel, P.A.

    1995-01-01

    Ontario Hydro has had experience in designing and qualifying safety critical software used in the reactor shutdown systems of its nuclear generating stations. During software design, an analysis of system-level hazards and potential hardware failure effects provides input to determining what safeguards will be needed. One form of safeguard, called software self-checks, continually monitors the health of the computer on-line. The design of self-checks is usually a trade-off between the amount of computing resources required, the software complexity, and the level of safeguarding provided. As part of the software verification activity, a software hazards analysis is performed, which identifies any failure modes that could lead to the software causing an unsafe state, and which recommends changes to mitigate that potential. These recommendations may involve restructuring the software to be more resistant to failure, or introducing other safeguarding measures. This paper discusses how Ontario Hydro has implemented these aspects of software design and verification in safety critical software used in reactor shutdown systems
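
    For illustration only (not Ontario Hydro's code, and shown in Python rather than a safety-critical language), one widespread self-check of this kind recomputes a checksum over a protected code or constant region and compares it to a reference value, trading computing resources for fault coverage as described above:

      # Illustrative sketch of a memory-integrity self-check: recompute a CRC
      # over a protected region and compare it to the reference taken at
      # startup (at build time in a real system).
      import zlib

      REFERENCE_CRC = None  # set once at startup in this sketch

      def compute_crc(memory_region):
          return zlib.crc32(memory_region)

      def self_check(memory_region):
          """True if the protected region still matches its reference CRC."""
          return compute_crc(memory_region) == REFERENCE_CRC

      code_image = open(__file__, "rb").read()   # stand-in for a ROM region
      REFERENCE_CRC = compute_crc(code_image)
      assert self_check(code_image)              # healthy: CRCs agree
      assert not self_check(code_image + b"x")   # a corrupted region is caught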

  1. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model. Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models; Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  2. Initial Educational Experiences of Tertiary Students. LSAY Briefing Number 14

    Science.gov (United States)

    Hillman, Kylie

    2008-01-01

    This "Briefing" presents information about the initial tertiary education experiences, such as satisfaction with aspects of student life and changes to initial enrolments, of two groups of young people, based on two recent Longitudinal Surveys of Australian Youth (LSAY) research reports. One study focused on the first year experiences of…

  3. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    Science.gov (United States)

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  4. Detailed review and analysis of complex radiotherapy clinical trial planning data: Evaluation and initial experience with the SWAN software system

    International Nuclear Information System (INIS)

    Ebert, Martin A.; Haworth, Annette; Kearvell, Rachel; Hooton, Ben; Coleman, Rhonda; Spry, Nigel; Bydder, Sean; Joseph, David

    2008-01-01

    Aim: Contemporary radiotherapy clinical trials typically require complex three-dimensional (3D) treatment planning. This produces large amounts of data relating technique and dose delivery for correlation with patient outcomes. Assessment of the quality of this information is required to ensure protocol compliance, to quantify the variation in treatments given to patients and to enhance the power of studies to determine correlates of patient outcomes. Materials and methods: A software system ('SWAN') was developed to facilitate the objective analysis, quality-assurance and review of digital treatment planning data from multi-centre radiotherapy trials. The utility of this system was assessed on the basis of its functionality and our experience of its use in the context of multi-centre clinical trials and trials-support activities. Results: The SWAN system has been shown to have the functionality required for use in several multi-centre trials, including automated review and archive processes. Approximately 800 treatment plans from over 30 participating institutions have so far been assessed with the system for several treatment planning scenarios. To illustrate this we include a description of the use of the system for a large-recruitment prostate radiotherapy trial being undertaken in Australasia, including examples of how the review process has changed clinical practice. Conclusion: The successful implementation of SWAN has been demonstrated in a number of clinical trials. The software provides an opportunity for comprehensive review of treatment parameters that could impact on clinical outcomes and trial results. Such quality-assurance (QA) has previously been difficult or impossible to achieve, particularly for a clinical trial involving large numbers of patients. Such reviews have highlighted inconsistencies in clinical practice that have since been addressed through feedback from the review process. The process of data collection and review should be

  5. Report on the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3)

    Directory of Open Access Journals (Sweden)

    Daniel S. Katz

    2016-10-01

    This report records and discusses the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3. The report includes a description of the keynote presentation of the workshop, which served as an overview of sustainable scientific software. It also summarizes a set of lightning talks in which speakers highlighted to-the-point lessons and challenges pertaining to sustaining scientific software. The final and main contribution of the report is a summary of the discussions, future steps, and future organization for a set of self-organized working groups on topics including developing pathways to funding scientific software; constructing useful common metrics for crediting software stakeholders; identifying principles for sustainable software engineering design; reaching out to research software organizations around the world; and building communities for software sustainability. For each group, we include a point of contact and a landing page that can be used by those who want to join that group’s future activities. The main challenge left by the workshop is to see if the groups will execute these activities that they have scheduled, and how the WSSSPE community can encourage this to happen.

  6. Report on the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3)

    Science.gov (United States)

    Katz, Daniel S.; Choi, Sou-Cheng T.; Niemeyer, Kyle E.; Hetherington, James; Löffler, Frank; Gunter, Dan; Idaszak, Ray; Brandt, Steven R.; Miller, Mark A.; Gesing, Sandra; Jones, Nick D.; Weber, Nic; Marru, Suresh; Allen, Gabrielle; Penzenstadler, Birgit; Venters, Colin C.; Davis, Ethan; Hwang, Lorraine; Todorov, Ilian; Patra, Abani; de Val-Borro, Miguel

    2016-02-01

    This report records and discusses the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3). The report includes a description of the keynote presentation of the workshop, which served as an overview of sustainable scientific software. It also summarizes a set of lightning talks in which speakers highlighted to-the-point lessons and challenges pertaining to sustaining scientific software. The final and main contribution of the report is a summary of the discussions, future steps, and future organization for a set of self-organized working groups on topics including developing pathways to funding scientific software; constructing useful common metrics for crediting software stakeholders; identifying principles for sustainable software engineering design; reaching out to research software organizations around the world; and building communities for software sustainability. For each group, we include a point of contact and a landing page that can be used by those who want to join that group's future activities. The main challenge left by the workshop is to see if the groups will execute these activities that they have scheduled, and how the WSSSPE community can encourage this to happen.

  7. Initial deuterium pellet experiments on FTU

    International Nuclear Information System (INIS)

    Snipes, J.A.

    1993-01-01

    Initial experiments have been performed with the Single Pellet INjector (SPIN) on FTU. SPIN is a two-stage cryogenic deuterium pellet injector capable of injecting pellets with velocities up to 2.5 km/s. The nominal pellet mass for these experiments was approximately 1 x 10^20 atoms. These initial pellet experiments concentrated on studying pellet penetration under a variety of plasma conditions, to compare with code predictions and to examine toroidal particle transport. The principal diagnostics used were two fast (∼1 μs) photomultiplier tubes at nearly opposite toroidal locations with Hα (Dα) interference filters (λ = 656 nm), a microwave cavity for pellet mass and velocity, a vertical array of unfiltered soft x-ray diodes looking down onto the pellet, a DCN interferometer for electron density profiles, and a Michelson ECE system for electron temperature profiles. The time integral of the absolutely calibrated fast Hα signal appears to give reasonable agreement with the expected pellet mass. Toroidal transport of deuterium ions from the pellet to nearly the opposite side of the tokamak agrees with calculated thermal deuterium velocities near the plasma edge. Comparison of the experimental results with code calculations using the Neutral Gas Shielding model shows good agreement for the post-pellet electron temperature and density profiles and, in some cases, the Hα profiles. Calculated penetration distances agree within 20%

  8. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  9. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  10. High energy physics experiment triggers and the trustworthiness of software

    International Nuclear Information System (INIS)

    Nash, T.

    1991-10-01

    For all the time and frustration that high energy physicists expend interacting with computers, it is surprising that more attention is not paid to the critical role computers play in the science. With large, expensive colliding beam experiments now dependent on complex programs working at startup, questions of reliability -- the trustworthiness of software -- need to be addressed. This issue is most acute in triggers, which are used to select the data to record -- and the data to discard -- in the real-time environment of an experiment. High level triggers are built on codes that now exceed 2 million source lines, and for the first time experiments are truly dependent on them. This dependency will increase at the accelerators planned for the new millennium (SSC and LHC), where cost and other pressures will reduce the tolerance for first-run problems, and where the high luminosities will make this on-line data selection essential. A sense of this incipient crisis motivated the unusual juxtaposition of topics in these lectures. 37 refs., 1 fig

  11. Improving Software Engineering on NASA Projects

    Science.gov (United States)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    The Software Engineering Initiative reduces the risk of software failure and increases mission safety; produces more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.

  12. HL-1 tokamak data acquisition system and its initial application in the physical experiment

    International Nuclear Information System (INIS)

    Deng Huichen; Fu Bo; Dong Jiafu

    1989-11-01

    An HL-1 tokamak data acquisition system has been developed and used in physics experiments. The hardware and software configuration of the system, as well as typical data acquired in the HL-1 experiment, are introduced

  13. [Mandibular reconstruction with fibula free flap. Experience of virtual reconstruction using Osirix®, a free and open source software for medical imagery].

    Science.gov (United States)

    Albert, S; Cristofari, J-P; Cox, A; Bensimon, J-L; Guedon, C; Barry, B

    2011-12-01

    Free tissue transfer is the main technique used by specialized surgical teams for mandibular reconstruction. This type of reconstruction is mostly performed for head and neck cancers invading the mandibular bone and requiring wide surgical resection with interruption of the mandible. To shorten the operation, the surgical procedure generally involves two teams, one devoted to the cancer resection and the other to raising the fibular flap and performing the reconstruction. To better prepare this surgical procedure, we propose the use of a medical imaging software package enabling three-dimensional reconstruction of the mandible from the CT scan performed during the initial disease-staging workup. The software used is Osirix®, developed since 2004 by a team of radiologists from Geneva and UCLA; it runs on Apple® computers and can be downloaded free of charge in its basic version. We report here our experience with this software in 17 patients, with preoperative three-dimensional modelling of the mandible and of the mandibular segment to be removed. It also predicts the number of fibula fragments needed and the locations of the osteotomies. Copyright © 2009 Elsevier Masson SAS. All rights reserved.

  14. Sci—Fri PM: Topics — 05: Experience with linac simulation software in a teaching environment

    International Nuclear Information System (INIS)

    Carlone, Marco; Harnett, Nicole; Jaffray, David; Norrlinger, Bern; Prooijen, Monique van; Milne, Emily

    2014-01-01

    Medical linear accelerator education is usually restricted to use of academic textbooks and supervised access to accelerators. To facilitate the learning process, simulation software was developed to reproduce the effect of medical linear accelerator beam adjustments on resulting clinical photon beams. The purpose of this report is to briefly describe the method of operation of the software as well as the initial experience with it in a teaching environment. To first and higher orders, all components of medical linear accelerators can be described by analytical solutions. When appropriate calibrations are applied, these analytical solutions can accurately simulate the performance of all linear accelerator sub-components. Grouped together, an overall medical linear accelerator model can be constructed. Fifteen expressions in total were coded using MATLAB v 7.14. The program was called SIMAC. The SIMAC program was used in an accelerator technology course offered at our institution; 14 delegates attended the course. The professional breakdown of the participants was: 5 physics residents, 3 accelerator technologists, 4 regulators and 1 physics associate. The course consisted of didactic lectures supported by labs using SIMAC. At the conclusion of the course, eight of thirteen delegates were able to successfully perform advanced beam adjustments after two days of theory and use of the linac simulator program. We suggest that this demonstrates good proficiency in understanding of the accelerator physics, which we hope will translate to a better ability to understand real world beam adjustments on a functioning medical linear accelerator

  15. Sci—Fri PM: Topics — 05: Experience with linac simulation software in a teaching environment

    Energy Technology Data Exchange (ETDEWEB)

    Carlone, Marco; Harnett, Nicole; Jaffray, David [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, ON (Canada); Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada); Norrlinger, Bern; Prooijen, Monique van; Milne, Emily [Radiation Medicine Program, Princess Margaret Cancer Centre, Toronto, ON (Canada)

    2014-08-15

    Medical linear accelerator education is usually restricted to use of academic textbooks and supervised access to accelerators. To facilitate the learning process, simulation software was developed to reproduce the effect of medical linear accelerator beam adjustments on resulting clinical photon beams. The purpose of this report is to briefly describe the method of operation of the software as well as the initial experience with it in a teaching environment. To first and higher orders, all components of medical linear accelerators can be described by analytical solutions. When appropriate calibrations are applied, these analytical solutions can accurately simulate the performance of all linear accelerator sub-components. Grouped together, an overall medical linear accelerator model can be constructed. Fifteen expressions in total were coded using MATLAB v 7.14. The program was called SIMAC. The SIMAC program was used in an accelerator technology course offered at our institution; 14 delegates attended the course. The professional breakdown of the participants was: 5 physics residents, 3 accelerator technologists, 4 regulators and 1 physics associate. The course consisted of didactic lectures supported by labs using SIMAC. At the conclusion of the course, eight of thirteen delegates were able to successfully perform advanced beam adjustments after two days of theory and use of the linac simulator program. We suggest that this demonstrates good proficiency in understanding of the accelerator physics, which we hope will translate to a better ability to understand real world beam adjustments on a functioning medical linear accelerator.
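
    The flavour of such first-order analytical sub-component models can be suggested with a single, assumed example (an illustration only, not one of SIMAC's fifteen expressions): a space-charge-limited electron gun, where beam current follows the Child-Langmuir law I = k * V^(3/2) with the perveance k as a calibration constant.

      # Hedged illustration, not SIMAC code: one analytic sub-model of the
      # kind described, a space-charge-limited gun obeying I = k * V**1.5.
      # The perveance k below is an assumed calibration constant.
      def gun_current(voltage_kv, perveance=0.005):
          """Beam current [A] for a gun voltage [kV] (assumed perveance)."""
          return perveance * voltage_kv ** 1.5

      # Chaining such expressions (gun, accelerating structure, bending
      # magnet, target) yields an end-to-end linac model whose inputs mimic
      # real beam-adjustment knobs.
      for v_kv in (10, 20, 30):
          print(f"{v_kv} kV -> {gun_current(v_kv) * 1e3:.0f} mA")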

  16. Software engineering knowledge at your fingertips: Experiences with a software engineering-portal

    OpenAIRE

    Punter, T.; Kalmar, R.

    2003-01-01

    In order to keep pace with technology development, knowledge of Software Engineering (SE) methods, techniques, and tools is required. For effective and efficient knowledge transfer, especially Small and Medium-sized Enterprises (SMEs) might benefit from Software Engineering Portals (SE-Portals). This paper provides an analysis of SE-Portals, distinguishing two types: 1) the Knowledge Portal and 2) the Knowledge & Community Portal. On the basis of the analysis we conclude that most SE

  17. Reflections on Software Engineering Education

    NARCIS (Netherlands)

    van Vliet, H.

    2006-01-01

    In recent years, the software engineering community has focused on organizing its existing knowledge and finding opportunities to transform that knowledge into a university curriculum. SWEBOK (the Guide to the Software Engineering Body of Knowledge) and Software Engineering 2004 are two initiatives

  18. Multidisciplinary Optimization Branch Experience Using iSIGHT Software

    Science.gov (United States)

    Padula, S. L.; Korte, J. J.; Dunn, H. J.; Salas, A. O.

    1999-01-01

    The Multidisciplinary Optimization (MDO) Branch at NASA Langley is investigating frameworks for supporting multidisciplinary analysis and optimization research. A framework provides software and system services to integrate computational tasks and allows the researcher to concentrate more on the application and less on the programming details. A framework also provides a common working environment and a full range of optimization tools, and so increases the productivity of multidisciplinary research teams. Finally, a framework enables staff members to develop applications for use by disciplinary experts in other organizations. This year, the MDO Branch has gained experience with the iSIGHT framework. This paper describes experiences with four aerospace applications, including: (1) reusable launch vehicle sizing, (2) aerospike nozzle design, (3) low-noise rotorcraft trajectories, and (4) acoustic liner design. Brief overviews of each problem are provided, including the number and type of disciplinary codes and computation time estimates. In addition, the optimization methods, objective functions, design variables, and constraints are described for each problem. For each case, discussions on the advantages and disadvantages of using the iSIGHT framework are provided as well as notes on the ease of use of various advanced features and suggestions for areas of improvement.

  19. Belle II Software

    International Nuclear Information System (INIS)

    Kuhr, T; Ritter, M

    2016-01-01

    Belle II is a next generation B factory experiment that will collect 50 times more data than its predecessor, Belle. The higher luminosity at the SuperKEKB accelerator leads to higher background levels and requires a major upgrade of the detector. As a consequence, the simulation, reconstruction, and analysis software must also be upgraded substantially. Most of the software has been redesigned from scratch, taking into account the experience from Belle and other experiments and utilizing new technologies. The large amount of experimental and simulated data requires a high level of reliability and reproducibility, even in parallel environments. Several technologies, tools, and organizational measures are employed to evaluate and monitor the performance of the software during development. (paper)

  20. Initial experience with AcQsim CT simulator

    International Nuclear Information System (INIS)

    Michalski, Jeff M.; Gerber, Russell; Bosch, Walter R.; Harms, William; Matthews, John W.; Purdy, James A.; Perez, Carlos A.

    1995-01-01

    Purpose: We recently replaced our university-developed CT simulator prototype with a commercial-grade spiral CT simulator (Picker AcQsim) that is networked with three independent virtual simulation workstations and with the multiple workstations of our 3D radiation therapy planning (3D-RTP) system. This presentation reports our initial experience with this CT simulation device, defines criteria for optimum clinical use, and describes some potential drawbacks of the current system. Methods and Materials: Over a 10-month period, 210 patients underwent CT simulation using the AcQsim. An additional 127 patients had a volumetric CT scan done on the device, with their CT data and target and normal tissue contours ultimately transferred to our 3D-RTP system. We currently perform the initial patient localization and immobilization in the CT simulation suite using CT topograms and a fiducial laser marking system. Immobilization devices, required for all patients undergoing CT simulation, are constructed and registered to a device that defines the treatment table coordinates. Orthogonal anterior and lateral CT topograms document patient alignment and the position of a reference coordinate center. The volumetric CT scan, with appropriate CT contrast materials administered, is obtained while the patient is in the immobilization device. On average, more than 100 CT slices are obtained per study. Contours defining tumor, target, and normal tissues are drawn on a slice-by-slice basis. The isocenter can be automatically defined within the target volume and marked on the patient and immobilization device before the patient leaves the initial CT simulation session. Virtual simulation is then performed on the patient data set with the assistance of predefined target volumes and normal tissue contours displayed on rapidly computed digitally reconstructed radiographs (DRRs), in a manner similar to a conventional fluoroscopic radiotherapy simulator. Lastly, a verification simulation is

  1. The Structure of Design Theories, and an Analysis of their Use in Software Engineering Experiments

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Daneva, Maia; Condori-Fernandez, Nelly

    In this paper we analyse possible reasons for the relatively low use of theories in software engineering (SE) papers found by Hannay et al. [Hannay07]. We provide an initial explanation in terms of properties of theories, test this by analyzing 32 of the 40 theories reviewed by Hannay et al.,

  2. Initial validation of ATLAS software on the ARM architecture

    Energy Technology Data Exchange (ETDEWEB)

    Kawamura, Gen; Quadt, Arnulf; Smith, Joshua Wyatt [II. Physikalisches Institut, Georg-August Universitaet Goettingen (Germany); Seuster, Rolf [TRIUMF (Canada); Stewart, Graeme [University of Glasgow (United Kingdom)

    2016-07-01

    In the early 2000s, the introduction of the multi-core era of computing helped industry and experiments such as ATLAS realize even more computing power. This was necessary, as the limits of what a single-core processor could do were quickly being reached. Our current model of computing is to increase the number of multi-core nodes in a server farm in order to handle the increased influx of data. As power costs and our need for more computing power increase, this model will eventually become unrealistic. Once again a paradigm shift has to take place. One option is to look at alternative architectures for large-scale server farms. ARM processors are one such example. Making up approximately 95% of the smartphone and tablet market, these processors are widely available, very power-conservative and constantly becoming faster. The ATLAS software code base (Athena) is extremely complex, comprising more than 6.5 million lines of code. It has very recently been ported to the ARM 64-bit architecture. The porting process as well as the first validation plots are presented and compared to the traditional x86 architecture.

  3. Software Agents Applications Using Real-Time CORBA

    Science.gov (United States)

    Fowell, S.; Ward, R.; Nielsen, M.

    This paper describes current projects being performed by SciSys in the area of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper addresses the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say taking account of its real-time and resource-constrained nature.

  4. Initial Results from Coaxial Helicity Injection Experiments in NSTX

    International Nuclear Information System (INIS)

    Raman, R.; Jarboe, T.R.; Mueller, D.; Schaffer, M.J.; Maqueda, R.; Nelson, B.A.; Sabbagh, S.; Bell, M.; Ewig, R.; Fredrickson, E.; Gates, D.; Hosea, J.; Ji, H.; Kaita, R.; Kaye, S.M.; Kugel, H.; Maingi, R.; Menard, J.; Ono, M.; Orvis, D.; Paolette, F.; Paul, S.; Peng, M.; Skinner, C.H.; Wilgen, W.; Zweben, S.

    2001-01-01

    Coaxial Helicity Injection (CHI) has been investigated on the National Spherical Torus Experiment (NSTX). Initial experiments produced 130 kA of toroidal current without the use of the central solenoid. The corresponding injector current was 20 kA. Discharges with pulse lengths up to 130 ms have been produced

  5. Software-defined Quantum Networking Ecosystem

    Energy Technology Data Exchange (ETDEWEB)

    2017-01-01

    The software enables a user to perform modeling and simulation of software-defined quantum networks. It addresses the problem of how to synchronize the transmission of quantum and classical signals through multi-node networks and how to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information to each other. This includes writing application scripts in Python that make use of the software library interfaces. The user then runs the application scripts, which invoke the software simulation, and uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
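
    A minimal sketch of the described workflow, using networkx and simpy as stand-ins for the unnamed graph model and discrete-event simulators (all node names and delays below are invented for illustration):

      # Hypothetical sketch: a network graph with per-link delays, plus a
      # discrete-event process that keeps a quantum signal and its classical
      # herald in lockstep hop by hop.
      import networkx as nx
      import simpy

      g = nx.Graph()
      g.add_edge("alice", "repeater", quantum_delay=5e-6, classical_delay=5e-6)
      g.add_edge("repeater", "bob", quantum_delay=7e-6, classical_delay=7e-6)

      def teleport(env, graph, path):
          for a, b in zip(path, path[1:]):
              edge = graph[a][b]
              # Wait for whichever channel is slower before the next hop, so
              # the quantum and classical transmissions stay synchronized.
              yield env.timeout(max(edge["quantum_delay"],
                                    edge["classical_delay"]))
              print(f"{env.now * 1e6:.1f} us: state forwarded {a} -> {b}")

      env = simpy.Environment()
      env.process(teleport(env, g, ["alice", "repeater", "bob"]))
      env.run()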

  6. Software engineering experience from the LEP experiment OPAL

    International Nuclear Information System (INIS)

    Schaile, O.

    1990-01-01

    This contribution describes some of the activities within the OPAL collaboration at LEP to apply software engineering techniques to program development and data documentation. It concentrates on two aspects: structured analysis techniques and a data documentation system developed within OPAL. Where evaluations are given, they are the author's view and opinion

  7. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for the failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the component. In the evaluation, both the effect of the shape of the prior distribution and the effect of one's prior confidence in the software component were investigated. In addition, different choices of prior probability distribution are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models. However, different types of failure dependency between software components must be modelled differently. Identifying the different types of failure dependency is therefore an important precondition for choosing a prior probability distribution that correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc). (Author)
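
    The single-component case can be made concrete with a short worked sketch (assumed numbers, not the report's): with a Beta(a, b) prior on the failure probability p, n failure-free tests yield a Beta(a, b + n) posterior, so the confidence that p <= p0 is the posterior CDF at p0, and one can scan for the number of tests required.

      # Worked sketch under a Beta prior; the target bound p0 = 1e-3 and the
      # flat Beta(1, 1) prior are assumed for illustration.
      from scipy.stats import beta

      def confidence_after(n_tests, p0, a=1.0, b=1.0):
          """P(p <= p0 | n failure-free tests), with a Beta(a, b) prior."""
          return beta.cdf(p0, a, b + n_tests)

      def tests_required(p0, target=0.99, a=1.0, b=1.0):
          """Smallest n reaching the target confidence (linear scan)."""
          n = 0
          while confidence_after(n, p0, a, b) < target:
              n += 1
          return n

      # With a flat prior, bounding p below 1e-3 at 99% confidence takes
      # roughly 4600 failure-free tests; a more optimistic prior (larger b)
      # reduces that number, as the report's evaluation discusses.
      print(tests_required(p0=1e-3))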

  8. The U.S./IAEA Workshop on Software Sustainability for Safeguards Instrumentation

    Energy Technology Data Exchange (ETDEWEB)

    Pepper S. E.; .; Worrall, L.; Pickett, C.; Bachner, K.; Queirolo, A.

    2014-08-08

    The U.S. National Nuclear Security Administration's Next Generation Safeguards Initiative, the U.S. Department of State, and the International Atomic Energy Agency (IAEA) organized a workshop on the subject of "Software Sustainability for Safeguards Instrumentation." The workshop was held at the Vienna International Centre in Vienna, Austria, May 6-8, 2014. The workshop participants included software and hardware experts from national laboratories, industry, government, and IAEA member states who were specially selected by the workshop organizers based on their experience with software developed for the control and operation of safeguards instrumentation. The workshop included presentations, to orient the participants to the IAEA Department of Safeguards software activities related to instrumentation data collection and processing, and case studies that were designed to inspire discussion of software development, use, maintenance, and upgrades in breakout sessions and to result in recommendations for effective software practices and management. This report summarizes the results of the workshop.

  9. The ATLAS online High Level Trigger framework experience reusing offline software components in the ATLAS trigger

    CERN Document Server

    Wiedenmann, W

    2009-01-01

    Event selection in the Atlas High Level Trigger is accomplished to a large extent by reusing software components and event selection algorithms developed and tested in an offline environment. Many of these offline software modules are not specifically designed to run in a heavily multi-threaded online data flow environment. The Atlas High Level Trigger (HLT) framework, based on the Gaudi and Atlas Athena frameworks, forms the interface layer which allows the execution of the HLT selection and monitoring code within the online run control and data flow software. While such an approach provides a unified environment for trigger event selection across all of Atlas, it also poses strict requirements on the reused software components in terms of performance, memory usage and stability. Experience of running the HLT selection software in the different environments, and especially on large multi-node trigger farms, has been gained in several commissioning periods using preloaded Monte Carlo events, in data taking peri...

  10. Modelling software failures of digital I and C in probabilistic safety analyses based on the TELEPERM® XS operating experience

    International Nuclear Information System (INIS)

    Jockenhoevel-Barttfeld, Mariana; Taurines Andre; Baeckstroem, Ola; Holmberg, Jan-Erik; Porthin, Markus; Tyrvaeinen, Tero

    2015-01-01

    Digital instrumentation and control (I and C) systems appear as upgrades in existing nuclear power plants (NPPs) and in new plant designs. In order to assess the impact of digital system failures, quantifiable reliability models are needed, along with data for digital systems that are compatible with existing probabilistic safety assessments (PSA). The paper focuses on the modelling of software failures of digital I and C systems in probabilistic assessments. An analysis of software faults, failures and effects is presented to derive the relevant failure modes of system and application software for the PSA. The estimations of software failure probabilities are based on an analysis of the operating experience of TELEPERM® XS (TXS). For the assessment of application software failures, the analysis combines the use of TXS operating experience at the application function level with conservative engineering judgement. Failure probabilities of failure to actuate on demand and of spurious actuation are estimated for a typical reactor protection application. Moreover, the paper gives guidelines for the modelling of software failures in the PSA. The strategy presented in this paper is generic and can be applied to different software platforms and their applications.

  11. Magnetohydrodynamic simulation of solid-deuterium-initiated Z-pinch experiments

    International Nuclear Information System (INIS)

    Sheehey, P.T.

    1994-02-01

    Solid-deuterium-initiated Z-pinch experiments are numerically simulated using a two-dimensional resistive magnetohydrodynamic model, which includes many important experimental details, such as "cold-start" initial conditions, thermal conduction, radiative energy loss, actual discharge current vs. time, and grids of sufficient size and resolution to allow realistic development of the plasma. The alternating-direction-implicit numerical technique used meets the substantial demands presented by such a computational task. Simulations of fiber-initiated experiments show that when the fiber becomes fully ionized, rapidly developing m=0 instabilities, which originate in the coronal plasma generated from the ablating fiber, drive intense non-uniform heating and rapid expansion of the plasma column. The possibility that inclusion of additional physical effects would improve stability is explored. Finite-Larmor-radius-ordered Hall and diamagnetic pressure terms in the magnetic field evolution equation, corresponding energy equation terms, and separate ion and electron energy equations are included; these do not change the basic results. Model diagnostics, such as shadowgrams and interferograms, generated from simulation results are in good agreement with experiment. Two alternative experimental approaches are explored: high-current magnetic implosion of hollow cylindrical deuterium shells, and "plasma-on-wire" (POW) implosion of low-density plasma onto a central deuterium fiber. By minimizing instability problems, these techniques may allow attainment of higher temperatures and densities than possible with bare-fiber-initiated Z-pinches. Conditions for significant D-D or D-T fusion neutron production may be realizable with these implosion-based approaches
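
    The alternating-direction-implicit (ADI) technique mentioned above can be sketched on a toy problem (plain 2-D diffusion rather than the full resistive-MHD equations, with deliberately crude boundary handling): each half step treats one grid axis implicitly, reducing the 2-D solve to cheap tridiagonal systems.

      # Toy ADI sketch for du/dt = alpha * laplacian(u) on a square grid,
      # unit spacing, dt = 1; not the dissertation's production scheme.
      import numpy as np
      from scipy.linalg import solve_banded

      def adi_step(u, alpha):
          n = u.shape[0]                    # assumes a square grid
          r = alpha / 2.0
          ab = np.zeros((3, n))             # (I - r*D2) in banded form
          ab[0, 1:] = -r                    # superdiagonal
          ab[1, :] = 1 + 2 * r              # diagonal
          ab[2, :-1] = -r                   # subdiagonal
          # Half step 1: implicit along axis 0, explicit along axis 1.
          rhs = u + r * (np.roll(u, 1, axis=1) - 2 * u + np.roll(u, -1, axis=1))
          u = solve_banded((1, 1), ab, rhs)
          # Half step 2: implicit along axis 1, explicit along axis 0.
          rhs = u + r * (np.roll(u, 1, axis=0) - 2 * u + np.roll(u, -1, axis=0))
          return solve_banded((1, 1), ab, rhs.T).T

      u = np.zeros((64, 64)); u[32, 32] = 1.0   # initial point of heat
      for _ in range(10):
          u = adi_step(u, alpha=0.1)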

  12. Accuracy and initial clinical experience with measurement software (advanced vessel analysis) in three-dimensional imaging

    International Nuclear Information System (INIS)

    Abe, Toshi; Hirohata, Masaru; Tanigawa, Hitoshi

    2002-01-01

    Recently, the clinical benefits of three-dimensional (3D) imaging, such as 3D-CTA and 3D-DSA, in cerebrovascular disease have been widely recognized. Software for quantitative analysis of vascular structure in 3D imaging (advanced vessel analysis: AVA) has been developed. We evaluated AVA with both phantom studies and a few clinical cases. In spiral and curved aluminum tube phantom studies, the accuracy of diameter measurements was good in 3D images produced from data sets generated by multi-detector row CT or rotational angiography. The measurement error was less than 0.03 mm on aluminum tube phantoms 3 mm and 5 mm in diameter. In the clinical studies, the difference in carotid artery diameter measurements between 2D-DSA and 3D-DSA was less than 0.3 mm. The measurement of length, diameter and angle by AVA should provide useful information for planning surgical and endovascular treatments of cerebrovascular disease. (author)

  13. Analytical software design : introduction and industrial experience report

    NARCIS (Netherlands)

    Osaiweran, A.A.H.; Boosten, M.; Mousavi, M.R.

    2010-01-01

    Analytical Software Design (ASD) is a design approach that combines formal and empirical methods for developing mathematically verified software systems. Unlike conventional design methods, ASD extends the design phase with more formal techniques, so that flaws are detected earlier, thereby reducing

  14. SDI (Strategic Defense Initiative) Software Technology Program Plan

    Science.gov (United States)

    1987-06-01

    station control, and defense. c. Simulation Display Generator (SDG) [Patterson 83]: SDG supports the creation, display, modification, storage, and ... Proceedings Trends and Applications 1981, IEEE (May 28, 1981). [Parnas 86] Parnas, D.L., "When can Software be Trustworthy?" Keynote Address to COMPASS '86

  15. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    Science.gov (United States)

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.
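
    As a hedged sketch of the kind of automatic image analysis such QC software performs (not the actual QC_DR code), the snippet below computes a simple SNR and ROI-uniformity figure from a synthetic flat-field image:

      # Hypothetical sketch: quality-control metrics from a flat-field image,
      # evaluated over a regular grid of regions of interest (ROIs).
      import numpy as np

      def flat_field_metrics(image, grid=5):
          """Mean, SNR and ROI uniformity [%] over a grid x grid pattern."""
          h, w = image.shape
          rois = [image[i*h//grid:(i+1)*h//grid, j*w//grid:(j+1)*w//grid]
                  for i in range(grid) for j in range(grid)]
          means = np.array([roi.mean() for roi in rois])
          return {"mean": image.mean(),
                  "snr": image.mean() / image.std(),
                  "uniformity_pct": 100.0 * (means.max() - means.min())
                                    / means.mean()}

      rng = np.random.default_rng(0)
      flat = rng.normal(1000.0, 10.0, size=(1024, 1024))  # synthetic flat field
      print(flat_field_metrics(flat))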

  16. Systems and software variability management concepts, tools and experiences

    CERN Document Server

    Capilla, Rafael; Kang, Kyo-Chul

    2013-01-01

    The success of product line engineering techniques in the last 15 years has popularized the use of software variability as a key modeling approach for describing the commonality and variability of systems at all stages of the software lifecycle. Software product lines enable a family of products to share a common core platform, while allowing for product specific functionality being built on top of the platform. Many companies have exploited the concept of software product lines to increase the resources that focus on highly differentiating functionality and thus improve their competitiveness

  17. Software quality assurance and software safety in the Biomed Control System

    International Nuclear Information System (INIS)

    Singh, R.P.; Chu, W.T.; Ludewigt, B.A.; Marks, K.M.; Nyman, M.A.; Renner, T.R.; Stradtner, R.

    1989-01-01

    The Biomed Control System is a hardware/software system used for the delivery, measurement and monitoring of heavy-ion beams in the patient treatment and biology experiment rooms in the Bevalac at the Lawrence Berkeley Laboratory (LBL). This paper describes some aspects of this system, including historical background and philosophy, configuration management, hardware features that facilitate software testing, software testing procedures, the release of new software, quality assurance, safety and operator monitoring. 3 refs.

  18. Adaptation of Software Entities for Synchronous Exogenous Coordination - An Initial Approach

    NARCIS (Netherlands)

    N.K. Diakov (Nikolay); F. Arbab (Farhad)

    2005-01-01

    In this paper we present ongoing work on a framework for the adaptation of heterogeneous software entities to allow their integration with the help of synchronous connectors. By using synchronous connectors for software integration, we intend to make it possible

  19. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix

  20. Design for software: a playbook for developers

    CERN Document Server

    Klimczak, Erik

    2013-01-01

    A unique resource to help software developers create a desirable user experience Today, top-flight software must feature a desirable user experience. This one-of-a-kind book creates a design process specifically for software, making it easy for developers who lack design background to create that compelling user experience. Appealing to both tech-savvy designers and creative-minded technologists, it establishes a hybrid discipline that will produce first-rate software. Illustrated in full color, it shows how to plan and visualize the design to create software that works on every l

  1. The experiences of employees participating in organisational corporate social responsibility initiatives

    Directory of Open Access Journals (Sweden)

    Gretha Cook

    2018-04-01

    Full Text Available Orientation: This article is about the experiences of employees who actively participate in organisational corporate social responsibility (CSR initiatives.   Research purpose: The general aim of this study was to explore the experiences of employees who participate in CSR initiatives within an organisation where a well-developed framework exists.   Motivation for the study: Whilst an emergent number of studies have considered the various dimensions of CSR initiatives, the focus appears to be on stakeholders such as the recipients of CSR, organisations, consumers and shareholders but not the perspective of the employees who actively participate in CSR initiatives.   Research design, approach and method: A qualitative research approach was employed with the intent of exploring the experiences of employees participating in organisational CSR initiatives. Data were collected and analysed from a purposive sample of 12 employees, by means of interactive qualitative analysis.   Main findings: The study revealed that the primary driver that motivates employees to participate in CSR is love. Love sparks a sense of compassion. Compassion, coupled with an enabling environment, stimulates generosity. By being generous, a feeling of hope and inspiration is induced in both the givers and receivers of generosity. A secondary outcome of generosity and hope and inspiration is bringing about change to others, and whilst going through this journey and making a difference in the lives of others, participants experience a progressive change within themselves. This change evokes a feeling of fulfilment, and ultimately a feeling of complete joy.   Contributions or value-add: This research complements existing CSR literature by focussing and reporting on the experiences of the employee as an important stakeholder.

  2. Lightweight and Continuous Architectural Software Quality Assurance using the aSQA Technique

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Lindstrøm, Bo

    2010-01-01

    In this paper, we present a novel technique for assessing and prioritizing architectural quality in large-scale software development projects. The technique can be applied with relatively little effort by software architects and is thus suited for agile development, in which quality attributes can be assessed and prioritized, e.g., within each development sprint. We outline the processes and metrics embodied in the technique, and report initial experiences on the benefits and liabilities. In conclusion, the technique is considered valuable and a viable tool, and has benefits in an architectural...

  3. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal -- high-quality software products. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  4. BASTILLE - Better Analysis Software to Treat ILL Experiments - a unified, unifying approach to data reduction and analysis

    International Nuclear Information System (INIS)

    Johnson, M.

    2011-01-01

    Data reduction and analysis is a key component in the production of scientific results. If this component, like any other in the chain, is weak, the final output is compromised. The current situation for data reduction and analysis may be regarded as adequate, but it is variable, depending on the instrument, and should be improved. In particular the delivery of new and upgraded instruments in Millennium Phase I and those proposed for Phase II will bring new demands and challenges for software development. Failure to meet these challenges will hamper the exploitation of higher data rates and the delivery of new science. The proposed project is to provide a single, underpinning software infrastructure for data analysis, which would ensure: 1) a clear vision of software provision at ILL; 2) a clear role for the 'Computing for Science' Group (CS) in maintaining and developing the infrastructure and the codes; 3) a well-defined framework for recruiting and training CS staff; 4) ease and efficiency of development within a common, well-defined software environment; 5) safeguarding of key, existing software; and 6) ease of communication with other software like instrument control software to allow real-time data analysis and experiment control, or software from other institutes or sources

  5. Nuclear medicine software: safety aspects

    International Nuclear Information System (INIS)

    Anon.

    1989-01-01

    A brief editorial discusses the safety aspects of nuclear medicine software. Topics covered include specific features which should be incorporated into a well-written piece of software, specific points regarding software testing, and legal liability if inappropriate medical treatment were initiated as a result of information derived from a piece of clinical apparatus incorporating a malfunctioning computer program. (U.K.)

  6. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not need the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  7. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    Directory of Open Access Journals (Sweden)

    Dominic Waithe

    Full Text Available We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for
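
    As a rough illustration of the train-then-batch workflow described above, egg counting of this kind is often cast as density-estimation regression: a model is fitted to a few dot-annotated images, and the summed predicted density over a new image gives its count. The sketch below is a minimal stand-in under that assumption; the patch features, the ridge regressor and all names are invented for the example and are not QuantiFly's actual algorithm or API.

```python
# Minimal density-regression counting sketch (illustrative, not QuantiFly).
import numpy as np
from sklearn.linear_model import Ridge

def patch_features(image, patch=8):
    """Raveled pixel patches -- a crude stand-in for real texture features."""
    h, w = image.shape
    feats, cells = [], []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            feats.append(image[y:y + patch, x:x + patch].ravel())
            cells.append((y, x))
    return np.array(feats), cells

def train_counter(images, dot_annotations, patch=8):
    """Fit a regressor mapping patch features to local egg density."""
    X, y = [], []
    for img, dots in zip(images, dot_annotations):
        feats, cells = patch_features(img, patch)
        density = [sum(cy <= dy < cy + patch and cx <= dx < cx + patch
                       for dy, dx in dots) for cy, cx in cells]
        X.append(feats)
        y.extend(density)
    model = Ridge(alpha=1.0)
    model.fit(np.vstack(X), np.array(y))
    return model

def count_eggs(model, image, patch=8):
    feats, _ = patch_features(image, patch)
    return float(model.predict(feats).sum())  # summed density ~ egg count

# Toy demo: one synthetic 32x32 "image" with two annotated eggs.
img = np.random.default_rng(0).random((32, 32))
model = train_counter([img], [[(5, 6), (20, 22)]])
print(round(count_eggs(model, img), 2))  # ~2.0 on the training image
```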

  8. A Web-based modeling tool for the SEMAT Essence theory of software engineering

    Directory of Open Access Journals (Sweden)

    Daniel Graziotin

    2013-09-01

    Full Text Available As opposed to more mature subjects, software engineering lacks general theories that establish its foundations as a discipline. The Essence Theory of software engineering (Essence has been proposed by the Software Engineering Methods and Theory (SEMAT initiative. The goal of Essence is to develop a theoretically sound basis for software engineering practice and its wide adoption. However, Essence is far from reaching academic- and industry-wide adoption. The reasons for this include a struggle to foresee its utilization potential and a lack of tools for implementation. SEMAT Accelerator (SematAcc is a Web-positioning tool for a software engineering endeavor, which implements the SEMAT’s Essence kernel. SematAcc permits the use of Essence, thus helping to understand it. The tool enables the teaching, adoption, and research of Essence in controlled experiments and case studies.

  9. The mythical man-month essays on software engineering

    CERN Document Server

    Brooks, Frederick Phillips

    1995-01-01

    Few books on software project management have been as influential and timeless as The Mythical Man-Month. With a blend of software engineering facts and thought-provoking opinions, Fred Brooks offers insight for anyone managing complex projects. These essays draw from his experience as project manager for the IBM System/360 computer family and then for OS/360, its massive software system. Now, 20 years after the initial publication of his book, Brooks has revisited his original ideas and added new thoughts and advice, both for readers already familiar with his work and for readers discovering it for the first time. The added chapters contain (1) a crisp condensation of all the propositions asserted in the original book, including Brooks' central argument in The Mythical Man-Month: that large programming projects suffer management problems different from small ones due to the division of labor; that the conceptual integrity of the product is therefore critical; and that it is difficult but possible to achi...

  10. Practical experience with software tools to assess and improve the quality of existing nuclear analysis and safety codes

    International Nuclear Information System (INIS)

    Marshall, N.H.; Marwil, E.S.; Matthews, S.D.; Stacey, B.J.

    1990-01-01

    Within the constraints of schedule and budget, software tools and techniques were applied to existing FORTRAN codes determining software quality metrics and improving the code quality. Specifically discussed are INEL experiences in applying pretty printers, cross-reference analyzers, and computer aided software engineering (CASE) tools and techniques. These have provided management with measures of the risk potential for individual program modules so that rational decisions can be made on resource allocation. Selected program modules have been modified to reduce the complexity, achieve higher functional independence, and improve the code vectorization. (orig.)

  11. Object Oriented Approach to Software Development for LHC Experiments

    CERN Multimedia

    Tummers, B J; Day, C; Innocente, V; Mount, R; Visser, E; Burnett, T H; Balke, C

    2002-01-01

    RD41: We propose to study the viability of the Object Oriented (OO) approach for developing the code for LHC experiments. The authors of this proposal will learn the key issues of this approach: OO analysis and design. Several methodologies will be studied to select the most appropriate for the High Energy Physics case. Some Computer Aided Software Engineering tools and implementation languages will be evaluated. These studies will be carried out with various well-defined prototypes, some of which have been defined in a preceding study and some of which will be defined in the course of this R&D project. We propose to also study in this project how the OO approach enhances a different, and hopefully better, project management. Management tools will be tried and professional training will be organized.

  12. The software development process in worldwide collaborations

    International Nuclear Information System (INIS)

    Amako, K.

    1998-01-01

    High energy physics experiments in future colliders are inevitably large scale international collaborations. In these experiments, software development has to be done by a large number of physicists, software engineers and computer scientists, dispersed all over the world. The major subject of this paper is to discuss on various aspects of software development in the worldwide environment. These include software engineering and methodology, software development process and management. (orig.)

  13. Software engineering principles applied to large healthcare information systems--a case report.

    Science.gov (United States)

    Nardon, Fabiane Bizinella; de A Moura, Lincoln

    2007-01-01

    São Paulo is the largest city in Brazil and one of the largest cities in the world. In 2004, the São Paulo City Department of Health decided to implement a Healthcare Information System to support managing healthcare services and provide an ambulatory health record. The resulting information system is one of the largest public healthcare information systems ever built, with more than 2 million lines of code. Although statistics show that most software projects fail, and the risks for the São Paulo initiative were enormous, the information system was completed on time and on budget. In this paper, we discuss the software engineering principles adopted that allowed the project's goals to be accomplished, hoping that sharing the experience of this project will help other healthcare information system initiatives to succeed.

  14. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than ... and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage...

  15. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    Science.gov (United States)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  16. An experience of qualified preventive screening: Shiraz Smart Screening Software.

    Science.gov (United States)

    Islami Parkoohi, Parisa; Zare, Hashem; Abdollahifard, Gholamreza

    2015-01-01

    Computerized preventive screening software is a cost-effective intervention tool to address non-communicable chronic diseases. Shiraz Smart Screening Software (SSSS) was developed as an innovative tool for qualified screening. It allows simultaneous smart screening of several high-burden chronic diseases and supports reminder notification functionality. The extent to which SSSS affects screening quality is also described. Following software development, preventive screening and annual health examinations of 261 school staff (Medical School of Shiraz, Iran) were carried out in a software-assisted manner. To evaluate the quality of the software-assisted screening, we used a quasi-experimental study design and determined coverage, irregular attendance and inappropriateness proportions in relation to the manual and software-assisted screening, as well as the corresponding number of requested tests. In the manual screening method, 27% of employees were covered (with 94% irregular attendance), while with software-assisted screening the coverage proportion was 79% (attendance status will become clear after the specified time). The frequency of inappropriate screening test requests, before the software implementation, was 41.37% for fasting plasma glucose, 41.37% for lipid profile, 0.84% for occult blood, 0.19% for flexible sigmoidoscopy/colonoscopy, 35.29% for Pap smear, 19.20% for mammography and 11.2% for prostate-specific antigen. All of the above were corrected by the software application. In total, 366 manual screening and 334 software-assisted screening tests were requested. SSSS is an innovative tool to improve the quality of preventive screening plans in terms of increased screening coverage and reduction in inappropriateness and in the total number of requested tests.

  17. Data Acquisition Software for Experiments at the MAMI-C Tagged Photon Facility

    Science.gov (United States)

    Oussena, Baya; Annand, John

    2013-10-01

    Tagged-photon experiments at Mainz use the electron beam of the MAMI (Mainzer MIcrotron) accelerator, in combination with the Glasgow Tagged Photon Spectrometer. The AcquDAQ DAQ system is implemented in the C++ language and makes use of CERN ROOT software libraries and tools. Electronic hardware is characterized in C++ classes based on a general-purpose class TDAQmodule, and implementation in an object-oriented framework makes the system very flexible. The DAQ system provides slow control and event-by-event readout of the Photon Tagger, the Crystal Ball 4-pi electromagnetic calorimeter, the central MWPC tracker and plastic-scintillator particle-ID systems, and the TAPS forward-angle calorimeter. A variety of front-end controllers running Linux are supported, reading data from VMEbus, FASTBUS and CAMAC systems. More specialist hardware, based on optical communication systems and developed for the COMPASS experiment at CERN, is also supported. AcquDAQ also provides an interface to configure and control the Mainz programmable trigger system, which uses FPGA-based hardware developed at GSI. Currently the DAQ system runs at data rates of up to 3 MB/s and, with upgrades to both hardware and software later this year, we anticipate a doubling of that rate. This work was supported in part by the U.S. DOE Grant No. DE-FG02-99ER41110.
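
    The general-purpose base class mentioned above is the classic DAQ hardware-abstraction pattern: electronics modules specialise a common interface that the DAQ core drives. AcquDAQ itself is written in C++; the sketch below transposes the pattern to Python for brevity, and every name apart from TDAQmodule is invented for the example rather than taken from AcquDAQ.

```python
# Hardware-abstraction sketch: a general-purpose module base class
# (cf. AcquDAQ's TDAQmodule) specialised by concrete electronics.
class TDAQmodule:
    """Common interface every piece of readout electronics implements."""
    def __init__(self, name, base_address):
        self.name = name
        self.base_address = base_address

    def initialise(self):
        """Configure the hardware: registers, thresholds, firmware."""
        raise NotImplementedError

    def read_event(self):
        """Return the raw data this module contributes to one event."""
        raise NotImplementedError

class VMEScaler(TDAQmodule):
    """Hypothetical VMEbus scaler specialising the common base class."""
    def initialise(self):
        print(f"{self.name}: mapping VME base address 0x{self.base_address:08X}")

    def read_event(self):
        return b"\x00" * 16  # placeholder payload

# The DAQ core sees only the base-class interface, so VMEbus, FASTBUS and
# CAMAC modules can be mixed freely in one readout list.
modules = [VMEScaler("scaler0", 0x00AA0000)]
for m in modules:
    m.initialise()
event = b"".join(m.read_event() for m in modules)
```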

  18. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long Base Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage and adapt to the rapidly evolving processor landscape, and the difficulty in developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to simultaneously address both problems. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running GPUs of HPCs. The second is to classify and reconstru...

  19. The Data Quality Monitoring Software for the CMS experiment at the LHC

    CERN Document Server

    AUTHOR|(CDS)2071602

    2016-01-01

    The Data Quality Monitoring (DQM) Software is a central tool in the CMS experiment. Its flexibility allows for integration in several key environments: Online, for real-time detector monitoring; Offline, for the final, fine-grained Data Certification; Release-Validation, to constantly validate the functionalities and the performance of the reconstruction software; and in Monte Carlo productions. Since the end of data taking at a center-of-mass energy of 8 TeV, the environment in which the DQM lives has undergone fundamental changes. In turn, the DQM system has made significant upgrades in many areas to respond to not only the changes in infrastructure, but also the growing specialized needs of the collaboration, with an emphasis on more sophisticated methods for evaluating data quality, as well as advancing the DQM system to provide quality assessments of various Monte Carlo simulations versus data distributions, monitoring changes in physical effects due to modifications of algorithms or framework, and enabling reg...

  20. Modernization of tank floor scanning system (TAFLOSS) software

    International Nuclear Information System (INIS)

    Mohd Fitri Abdul Rahman; Jaafar Abdullah; Susan Maria Sipaun

    2002-01-01

    Tank Floor Scanning System (TAFLOSS) is a portable nucleonic device based on the scattering and moderation phenomena of neutrons. TAFLOSS, which was developed by MINT, can precisely and non-destructively measure the gap and hydrogen content in the foundation of a gigantic industrial tank in a practical and cost-effective manner. In recording and analysing measured data, three different pieces of computer software were used. For analysing the initial data, a Disk Operating System (DOS)-based program called MesTank 3.0 was developed. The system also used commercial software such as Table Curve 2D and SURFER for graphics purposes: Table Curve 2D was used to plot and evaluate curve fits, whereas SURFER was used to draw contours. Switching from one piece of software to another for the different tasks of this system is neither user-friendly nor time-efficient. Therefore, the main objective of the project is to develop new user-friendly software that combines the old and commercial software into a single package. The computer programming language used to develop the software is Microsoft Visual C++ ver. 6.0. The process of developing this software involves complex mathematical calculation, curve fitting and contour plotting. This paper describes the initial development of a computer program for analysing the initial data and plotting exponential curve fits. (Author)
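
    As a small illustration of the curve-fitting task being folded into the single package, an exponential fit of the kind previously delegated to Table Curve 2D takes only a few lines with standard scientific libraries. The model form and the synthetic data below are assumptions for the example, not the actual TAFLOSS calibration curve (the new tool itself was written in Visual C++).

```python
# Illustrative exponential curve fit; model and data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def exp_model(x, a, b, c):
    # Generic decaying-exponential response: a*exp(-b*x) + c
    return a * np.exp(-b * x) + c

x = np.linspace(0.0, 10.0, 50)
rng = np.random.default_rng(1)
y = exp_model(x, 2.5, 0.8, 0.3) + 0.05 * rng.normal(size=x.size)

params, cov = curve_fit(exp_model, x, y, p0=(1.0, 1.0, 0.0))
print("fitted (a, b, c):", params)            # close to (2.5, 0.8, 0.3)
print("1-sigma uncertainties:", np.sqrt(np.diag(cov)))
```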

  1. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object orientation, the development of software is getting more complex than ever. Based on that, this article presents a methodology for software documentation and analyses our experience of how this methodology can aid software maintenance

  2. Analysing the control software of the Compact Muon Solenoid Experiment at the Large Hadron Collider

    NARCIS (Netherlands)

    Hwong, Y.L.; Kusters, V.J.J.; Willemse, T.A.C.; Arbab, F.; Sirjani, M.

    2012-01-01

    The control software of the CERN Compact Muon Solenoid experiment contains over 30,000 finite state machines. These state machines are organised hierarchically: commands are sent down the hierarchy and state changes are sent upwards. The sheer size of the system makes it virtually impossible to

  3. Initially curved microplates under electrostatic actuation: theory and experiment

    KAUST Repository

    Saghir, Shahid; Bellaredj, Mohammed Lamine Faycal; Ramini, Abdallah; Younis, Mohammad I.

    2016-01-01

    Microplates are the building blocks of many micro-electro-mechanical systems. It is common for them to experience initial curvature imperfection due to residual stresses caused by the micro fabrication process. Such plates are essentially different

  4. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al., J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000 from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold - firstly, to use best practices in software engineering and new hardware like multi-core and graphic processing units; secondly, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. Among the software engineering practices we follow are open sourcing to facilitate community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits etc. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted the use of workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  5. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  6. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  7. Toward objective software process information : experiences from a case study

    NARCIS (Netherlands)

    Samalikova, J.; Kusters, R.J.; Trienekens, J.J.M.; Weijters, A.J.M.M.; Siemons, P.

    2011-01-01

    A critical problem in software development is the monitoring, control and improvement of the processes of software developers. Software processes are often not explicitly modeled, and manuals to support the development work contain abstract guidelines and procedures. Consequently, there are huge

  8. Proceedings of the Ninth Annual Software Engineering Workshop

    Science.gov (United States)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  9. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  10. Guide to software export

    CERN Document Server

    Philips, Roger A

    2014-01-01

    An ideal reference source for CEOs, marketing and sales managers, sales consultants, and students of international marketing, Guide to Software Export provides a step-by-step approach to initiating or expanding international software sales. It teaches you how to examine critically your candidate product for exportability; how to find distributors, agents, and resellers abroad; how to identify the best distribution structure for export; and much, much more! Not content with providing just the guidelines for setting up, expanding, and managing your international sales channels, Guide to Software

  11. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  12. The IceCube Data Acquisition Software: Lessons Learned during Distributed, Collaborative, Multi-Disciplined Software Development.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Keith S; Beattie, Keith; Day Ph.D., Christopher; Glowacki, Dave; Hanson Ph.D., Kael; Jacobsen Ph.D., John; McParland, Charles; Patton Ph.D., Simon

    2007-09-21

    In this experiential paper we report on lessons learned during the development of the data acquisition software for the IceCube project - specifically, how to effectively address the unique challenges presented by a distributed, collaborative, multi-institutional, multi-disciplined project such as this. While development progress in software projects is often described solely in terms of technical issues, our experience indicates that non- and quasi-technical interactions play a substantial role in the effectiveness of large software development efforts. These include: selection and management of multiple software development methodologies, the effective use of various collaborative communication tools, project management structure and roles, and the impact and apparent importance of these elements when viewed through the differing perspectives of hardware, software, scientific and project office roles. Even in areas clearly technical in nature, success is still influenced by non-technical issues that can escape close attention. In particular we describe our experiences on software requirements specification, development methodologies and communication tools. We make observations on what tools and techniques have and have not been effective in this geographically dispersed (including the South Pole) collaboration and offer suggestions on how similarly structured future projects may build upon our experiences.

  13. Teaching Agile Software Development: A Case Study

    Science.gov (United States)

    Devedzic, V.; Milenkovic, S. R.

    2011-01-01

    This paper describes the authors' experience of teaching agile software development to students of computer science, software engineering, and other related disciplines, and comments on the implications of this and the lessons learned. It is based on the authors' eight years of experience in teaching agile software methodologies to various groups…

  14. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  15. SWiFT Software Quality Assurance Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Jonathan Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). The plan (SAND2016-0765) was approved by the Department Manager, Dave Minster (6121); the SWiFT Site Lead, Jonathan White (6121); and the SWiFT Controls Engineer, Jonathan Berg (6121). Change history: issue A, 2016/01/27, Jon Berg (06121) - initial release of the SWiFT Software Quality Assurance Plan.

  16. Development of workflow planning software and a tracking study of the decay B± → J/ψK± at the D0 Experiment

    International Nuclear Information System (INIS)

    Evans, David Edward

    2003-01-01

    A description of the development of the mc_runjob software package used to manage large-scale computing tasks for the D0 Experiment at Fermilab is presented, along with a review of the Digital Front End Trigger electronics and the software used to control them. A tracking study is performed on detector data to establish that the D0 Experiment can detect charged B mesons and that the results are in accordance with current measurements. B mesons are found by searching for the decay channel B± → J/ψK±

  17. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable approach to high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process differs significantly from a conventional process in terms of its rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying the software requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 fig., 10 refs. (Author)

  18. Intellectual property in software: what can we learn from international experience? (Propriedade Intelectual em Software: o que podemos apreender da experiência internacional?)

    Directory of Open Access Journals (Sweden)

    Arlan Souza

    2007-01-01

    Full Text Available This article, produced within a study group on intellectual property (IP) in information and sensitive technologies, analyses the protection regime for computer programs in Brazil in the light of the European and North American experiences. The review of the evolution of international and Brazilian legislation, as well as of its practical effects on the protection of computer programs, reveals that technological change in information technologies has been raising complex legal dilemmas throughout the world. In the United States, legislation is more flexible, favouring the interests of software companies. In the European Union, the greater diversity of actors is an obstacle to the development of a regional directive. Brazil follows the rules of the TRIPS Agreement and also faces many difficulties in harmonizing the stimulus to technological development with conducts that promote economic and social benefit. This paper reviews the legal framework for intellectual property protection for software as a product in the USA, the European Union and Brazil. Rapid technological change in the world software industry poses new challenges for existing intellectual property regimes. The USA has responded with a flexible interpretation of property rights which in fact favors the software industry. In the European Union, the larger heterogeneity of actors and interests resulted in the failure of the attempt to approve a unified directive. Brazil follows the rules agreed in TRIPS but also faces difficulties in harmonizing incentives for innovation with conducts which lead to social and economic development.

  19. Organization of the STAR experiment software framework at JINR. Results and experience from the first two years of work

    International Nuclear Information System (INIS)

    Arkhipkin, D.A.; Zul'karneeva, Yu.R.

    2004-01-01

    The organization of the STAR experiment software framework at JINR is described. The approach, based on the distributed file system ASF, was implemented on the NEOSTAR minicluster at LPP, JINR. The operating principle of the cluster, a description of its operation and samples of the performed analysis are also given. The results of the NEOSTAR minicluster performance have demonstrated the broad capabilities of the distributed computing concept for experimental data analysis and high-energy physics modeling

  20. System for inspection and quality assurance of software - A knowledge-based experiment with code understanding

    International Nuclear Information System (INIS)

    Das, B.K.

    1989-01-01

    This paper describes a knowledge-based prototype that inspects and quality-assures software components. The prototype model, which offers a singular representation of these components, is used to automate both the mechanical and nonmechanical activities in the quality assurance (QA) process. It is shown that the prototype, in addition to automating the QA process, provides a novel approach to understanding code. These approaches are compared with recent approaches to code understanding. The paper also presents the results of an experiment with several classes of nonsyntactic bugs. It is argued that a structured environment, as facilitated by this unique architecture, along with software development standards used in the QA process, is essential for meaningful analysis of code. 8 refs

  1. Software testing for evolutionary iterative rapid prototyping

    OpenAIRE

    Davis, Edward V., Jr.

    1990-01-01

    Approved for public release; distribution unlimited. Rapid prototyping is emerging as a promising software development paradigm. It provides a systematic and automatable means of developing a software system under circumstances where initial requirements are not well known or where requirements change frequently during development. Providing high software quality assurance requires sufficient software testing. The unique nature of evolutionary iterative prototyping is not well-suited for ...

  2. SQIMSO: Quality Improvement for Small Software Organizations

    OpenAIRE

    Rabih Zeineddine; Nashat Mansour

    2005-01-01

    The software quality improvement process remains incomplete if it is not initiated and conducted through a broad improvement program that considers process quality improvement, product quality improvement and the evolution of human resources. However, small software organizations are not capable of bearing the cost of establishing software process improvement programs. In this work, we propose a new software quality improvement model for small organizations, SQIMSO, based on three ...

  3. Software Development as Music Education Research

    Science.gov (United States)

    Brown, Andrew R.

    2007-01-01

    This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…

  4. Experiences with Integrating Simulation into a Software Engineering Curriculum

    Science.gov (United States)

    Bollin, Andreas; Hochmuller, Elke; Mittermeir, Roland; Samuelis, Ladislav

    2012-01-01

    Software Engineering education must account for a broad spectrum of knowledge and skills software engineers will be required to apply throughout their professional life. Covering all the topics in depth within a university setting is infeasible due to curricular constraints as well as due to the inherent differences between educational…

  5. Essence: Facilitating Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2008-01-01

    This paper suggests ways to facilitate creativity and innovation in software development. The paper applies four perspectives – Product, Project, Process, and People – to identify an outlook for software innovation. The paper then describes a new facility – the Software Innovation Research Lab (SIRL) – and a new method concept for software innovation – Essence – based on views, modes, and team roles. Finally, the paper reports from an early experiment using SIRL and Essence and identifies further research.

  6. Technical report on the Korea-Japan software collaboration

    International Nuclear Information System (INIS)

    Inamura, Yasuhiro; Nakajima, Kenji; Nakatani, Takeshi; Kajimoto, Ryoichi; Arai, Masatoshi; So, Ji-Yong; Moon, Myung-Kook; Lee, Chang-Hee; Suzuki, Jiro; Otomo, Toshiya; Yasu, Yoshiji; Nakayoshi, Kazuo; Sendai, Hiroshi; Nam, Uk-Won; Park, Je-Geun

    2011-02-01

    Both the Materials and Life Science Experimental Facility (MLF) of J-PARC and HANARO of KAERI started new neutron facility projects, in 2002 and 2003 respectively. As part of these projects, both institutes began development of new Time-of-Flight (ToF) spectrometers, including DC-TOF at HANARO and 4SEASONS and AMATERAS at MLF. With this new instrument development, we saw an opportunity for collaboration between Korea and Japan regarding ToF software. This Korea-Japan collaboration officially started in 2007 with an initial set of 6 items as its final goal. The 6 items include 1) basic data reduction software, 2) informative visualization software, 3) data visualization software, 4) decision making and optimization software, 5) single crystal alignment software, and 6) advanced analysis software. Using the Manyo library developed at J-PARC as our software framework, we developed our software as a combination of Python and C++ wrapped under SWIG. In August 2008 we successfully released a beta version of the basic data reduction software, which has been tested at the 2 beamlines of MLF, 4SEASONS and AMATERAS, and regularly updated. Two other beta versions, of the informative visualization software and the data visualization software, have also been released and are successfully used during experiments at 4SEASONS and AMATERAS. Although we have had several discussions on the 3 remaining topics of the original goal of this collaboration, progress on these items has been rather limited. Therefore, we decided to consider them as subjects of the next Korea-Japan collaboration. This report summarizes the two years (2007-2009) of activities of the Korea-Japan collaboration on chopper software development. Here we describe the background of the collaboration and the main part of our work. We also briefly discuss a future plan for our collaboration starting in 2010. Some detailed descriptions of the collaboration activities, as well as related information, are given in the appendix. (author)

  7. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    SOFTWARE, LIKE ALL industry products, is the result of complex multinational supply chains with many partners from concept to development to production and maintenance. Global software engineering (GSE), IT outsourcing, and business process outsourcing during the past decade have shown growth rates of 10 to 20 percent per year. This instalment of Practitioner's Digest summarizes experiences and guidance from industry to facilitate knowledge and technology transfer for GSE. It's based on industry feedback from the annual IEEE International Conference on Global Software Engineering, which had

  8. Survey on Projects at DLR Simulation and Software Technology with Focus on Software Engineering and HPC

    OpenAIRE

    Schreiber, Andreas; Basermann, Achim

    2013-01-01

    We introduce the DLR institute “Simulation and Software Technology” (SC) and present current activities regarding software engineering and high performance computing (HPC) in German or international projects. Software engineering at SC focusses on data and knowledge management as well as tools for studies and experiments. We discuss how we apply software configuration management, validation and verification in our projects. Concrete research topics are traceability of (software devel...

  9. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows multiple authors to collaborate easily. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett is the author of the framework discussed here, the General Geometry Description (GGD).
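
    To make the output format concrete: GDML is plain XML, so any geometry-generation layer ultimately emits documents shaped like the one below. This trivial world volume, built with the Python standard library, is only a sketch of the format (a loadable file would also need a <materials> block defining "Air"); it is not GGD's actual API or an LBNE geometry.

```python
# Emit a minimal GDML document -- a sketch of the format GGD targets.
import xml.etree.ElementTree as ET

gdml = ET.Element("gdml")

solids = ET.SubElement(gdml, "solids")
ET.SubElement(solids, "box", name="WorldBox",
              x="1000", y="1000", z="1000", lunit="cm")

structure = ET.SubElement(gdml, "structure")
world_vol = ET.SubElement(structure, "volume", name="volWorld")
ET.SubElement(world_vol, "materialref", ref="Air")   # assumes Air is defined
ET.SubElement(world_vol, "solidref", ref="WorldBox")

setup = ET.SubElement(gdml, "setup", name="Default", version="1.0")
ET.SubElement(setup, "world", ref="volWorld")

ET.ElementTree(gdml).write("world.gdml", encoding="utf-8",
                           xml_declaration=True)
```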

  10. Developing Control and Monitoring Software for the Data Acquisition System of the COMPASS Experiment at CERN

    Directory of Open Access Journals (Sweden)

    Martin Bodlák

    2013-01-01

    Full Text Available This paper focuses on the analysis, design and development of software for the new data acquisition system of the COMPASS experiment at CERN. In this system, the data flow is controlled by custom hardware; the software will therefore be used only for run control and for monitoring. The requirements on the software have been analyzed, and the functionality of the system has been defined. The system consists of several distributed nodes; communication between the nodes is based on a custom protocol and a DIM library. A minimal version of the system has already been implemented. Preliminary results of performance and stability tests have shown that the system fulfills the defined requirements, and is stable. In the next phase of development, the system will be tested on the real hardware. It is expected that the system will be ready for deployment in 2014.
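
    At its core, run control of this kind drives a state machine on each node and distributes commands between them. The toy sketch below shows only that generic shape; the states, commands and transitions are invented examples and are not the COMPASS protocol or the DIM library API.

```python
# Toy run-control state machine (generic example, not the COMPASS protocol).
TRANSITIONS = {
    ("idle", "configure"): "configured",
    ("configured", "start"): "running",
    ("running", "stop"): "configured",
    ("configured", "reset"): "idle",
}

class RunControl:
    def __init__(self):
        self.state = "idle"

    def handle(self, command):
        """Apply a command if the transition table allows it."""
        try:
            self.state = TRANSITIONS[(self.state, command)]
        except KeyError:
            raise ValueError(f"{command!r} not allowed in state {self.state!r}")
        return self.state

rc = RunControl()
for cmd in ("configure", "start", "stop"):
    print(cmd, "->", rc.handle(cmd))
```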

  11. The LHCb Starterkit initiative

    CERN Document Server

    Puig Navarro, Albert

    2017-01-01

    The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired on the go and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The initiative, combining courses and online tutorials, focuses on teaching basic skills for research computing, as well as LHCb software specifics. Unlike traditional tutorials we focus on starting with basics, performing all the material live, with a high degree of interactivity, giving priority to understanding the tools as opposed to handing out recipes that work “as if by magic”. The LHCb Starterkit was started by young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as two advanced ones, have taken place since the start of the initiative i...

  12. Experiment Software and Projects on the Web with VISPA

    Science.gov (United States)

    Erdmann, M.; Fischer, B.; Fischer, R.; Geiser, E.; Glaser, C.; Müller, G.; Rieger, M.; Urban, M.; von Cube, R. F.; Welling, C.

    2017-10-01

    The Visual Physics Analysis (VISPA) project defines a toolbox for accessing software via the web. It is based on the latest web technologies and provides a powerful extension mechanism that enables a wide range of applications to be interfaced. Beyond basic applications such as a code editor, a file browser, or a terminal, it meets the demands of sophisticated experiment-specific use cases that focus on physics data analyses and typically require a high degree of interactivity. As an example, we developed a data inspector that is capable of browsing interactively through the event content of several data formats, e.g. MiniAOD, which is utilized by the CMS collaboration. The VISPA extension mechanism can also be used to embed external web-based applications that benefit from dynamic allocation of user-defined computing resources via SSH. For example, by wrapping the JSROOT project, ROOT files located on any remote machine can be inspected directly through a VISPA server instance. We introduced domains that combine groups of users and role-based permissions. Thereby, tailored projects are enabled, e.g. for teaching, where access to students' homework is restricted to a team of tutors, or for experiment-specific data that may only be accessible to members of the collaboration. We present the extension mechanism including corresponding applications and give an outlook onto the new permission system.

  13. Analyzing, Modelling, and Designing Software Ecosystems

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    as the software development and distribution by a set of actors dependent on each other and the ecosystem. We commence on the hypothesis that the establishment of a software ecosystem on the telemedicine services of Denmark would address these issues and investigate how a software ecosystem can foster...... the development, implementation, and use of telemedicine services. We initially expand the theory of software ecosystems by contributing to the definition and understanding of software ecosystems, providing means of analyzing existing and designing new ecosystems, and defining and measuring the qualities...... of software ecosystems. We use these contributions to design a software ecosystem in the telemedicine services of Denmark with (i) a common platform that supports and promotes development from different actors, (ii) high software interaction, (iii) strong social network of actors, (iv) robust business...

  14. Beginning software engineering

    CERN Document Server

    Stephens, Rod

    2015-01-01

    Beginning Software Engineering demystifies the software engineering methodologies and techniques that professional developers use to design and build robust, efficient, and consistently reliable software. Free of jargon and assuming no previous programming, development, or management experience, this accessible guide explains important concepts and techniques that can be applied to any programming language. Each chapter ends with exercises that let you test your understanding and help you elaborate on the chapter's main concepts. Everything you need to understand waterfall, Sashimi, agile, RAD, Scrum, Kanban, Extreme Programming, and many other development models is inside!

  15. Initial experiences of family caregivers of survivors of a traumatic brain injury

    Directory of Open Access Journals (Sweden)

    Mandi Broodryk

    2015-08-01

    Full Text Available Background: There seems to be a paucity of research on the initial subjective experiences of family caregivers of survivors of a traumatic brain injury (TBI. Objective: To explore the challenges that family caregivers face during the initial stages of recovery of a relative who has sustained a TBI. Methods: Thematic analysis was used to explore the findings from semi-structured interviews that were conducted with 12 female family caregivers of relatives who had sustained a TBI. Results: Family caregivers recalled their initial experiences of the shock at hearing the news about their relative’s TBI, negative experiences in hospital and frustrating interactions with healthcare professionals as particularly challenging. Conclusion: The findings of this study emphasise caregivers’ need for support, information and psycho-education, especially from healthcare professionals, from the very beginning stages of recovery from a TBI. Practical and physical needs with regard to admission to and care in the hospital were also highlighted. This research will hopefully contribute to creating awareness amongst healthcare professionals on how they can contribute to improvement of the services provided by the healthcare system based on the experiences of the caregivers who participated in this study.

  16. Experience Report: Introducing Kanban Into Automotive Software Project

    Directory of Open Access Journals (Sweden)

    Marek Majchrzak

    2017-03-01

    Full Text Available The boundaries between traditional and agile methods are disappearing. A significant number of software projects require a continuous implementation of tasks without dividing them into sprints or strict project phases. Customers expect more flexibility and responsiveness from software vendors in response to the ever-changing business environment. To achieve better results in this field, Capgemini has begun using the Lean philosophy and Kanban techniques. The following article illustrates examples of different uses of Kanban and the main stakeholders of the process. The article presents the main advantages of transparency and ways to improve customer co-operation as well as stakeholder relationships. The authors try to visualise all of these elements in the context of the project. There is also a discussion of different approaches in two software projects. The article focuses on the main challenges and the evolutionary approach used. An attempt is made to answer the question of how to convince both the team and the customer, and how to optimise ways of achieving great results.

  17. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  18. A Framework of the Use of Information in Software Testing

    Science.gov (United States)

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  19. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience

    International Nuclear Information System (INIS)

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Joergen; Nyholm, Tufve; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach.
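
    As a rough illustration of the tolerance logic quoted above, the following sketch checks a point-dose deviation against the stated confidence limits. Only the 3%/6 cGy and 5%/10 cGy limits come from the abstract; the function and variable names are our own.

```python
# Hedged sketch of the tolerance check described in the abstract; the
# thresholds (3% / 6 cGy, relaxed to 5% / 10 cGy off-axis or in low-dose
# regions) are quoted from the study, everything else is assumed.

def verification_passes(dose_tps_cgy, dose_muv_cgy, prescribed_cgy,
                        off_axis_cm=0.0, low_dose=False):
    """True if the independent calculation agrees with the TPS within limits."""
    relaxed = off_axis_cm > 5.0 or low_dose
    pct_limit = 5.0 if relaxed else 3.0      # percent of prescribed dose
    abs_limit = 10.0 if relaxed else 6.0     # cGy
    deviation = abs(dose_tps_cgy - dose_muv_cgy)
    return (deviation <= abs_limit
            or 100.0 * deviation / prescribed_cgy <= pct_limit)

# Example: a 2 cGy disagreement on a 200 cGy point dose passes easily.
print(verification_passes(200.0, 198.0, 200.0))  # True
```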

  20. British Isles Field Experience: An Initiative in International Education.

    Science.gov (United States)

    Martin, William J.

    The British Isles Field Experience (BIFE) program was initiated at Williamsport Area Community College (WACC) to provide a group of WACC faculty and staff members with individual and group activities of a personal, professional, and cultural nature in order to promote an international perspective that can be infused into student, collegiate, and…

  1. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...... an extra specific expertise that they offer as service to other teams, thus, fostering cross-team collaboration. The paper outlines the general course setup, topics addressed, and it provides initial lessons learned....

  2. The initial experience of trans-rectal ultrasound and biopsy in ...

    African Journals Online (AJOL)

    The initial experience of trans-rectal ultrasound and biopsy in diagnosis of carcinoma prostate in Gezira Hospital for Renal Disease and Surgery (GHRDS). Walaa Eldin Ibraheem, Sami Mahjoub Taha, Mustafa Omran Mansour, Mohammed El Imam Mohamed Ahmed ...

  3. Modernization of tank floor scanning system (TAFLOSS) Software

    International Nuclear Information System (INIS)

    Mohd Fitri Abd Rahman; Jaafar Abdullah; Zainul A Hassan

    2002-01-01

    The main objective of the project is to develop new user-friendly software that combines the second-generation software (developed in-house) with commercial software. This paper describes the development of computer codes for analysing the initial data and fitting exponential curves. The method used for curve fitting is the least-squares technique. The software that has been developed gives results comparable to those of the commercial software. (Author)
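
    As a sketch of the least-squares exponential fit mentioned above, the fit can be linearized with a logarithm. The model form y = A·exp(b·x) and the data values are assumptions for illustration, not taken from the paper.

```python
# Minimal least-squares exponential fit, assuming the model y = A*exp(b*x).
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([10.0, 6.1, 3.6, 2.2, 1.3])   # invented measurement values

# Taking logarithms turns the model into ln(y) = ln(A) + b*x,
# which ordinary linear least squares (numpy.polyfit) can handle.
b, ln_A = np.polyfit(x, np.log(y), 1)
A = np.exp(ln_A)
print(f"fit: y = {A:.2f} * exp({b:.3f} * x)")
```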

  4. Software packages for food engineering needs

    OpenAIRE

    Abakarov, Alik

    2011-01-01

    The graphic user interface (GUI) software packages “ANNEKs” and “OPT-PROx” are developed to meet food engineering needs. “OPT-PROx” (OPTimal PROfile) is software developed to carry out thermal food processing optimization based on variable retort temperature processing and a global optimization technique. “ANNEKs” (Artificial Neural Network Enzyme Kinetics) is software designed for determining the kinetics of enzyme hydrolysis of protein at different initial reaction parameters based on the...

  5. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, reviewing and analysing books, journals and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  6. Software engineering in industry

    Science.gov (United States)

    Story, C. M.

    1989-12-01

    Can software be "engineered"? Can a few people with limited resources and a negligible budget produce high quality software solutions to complex software problems? Is it possible to resolve the conflict between research activities and the necessity to view software development as a means to an end rather than as an end in itself? The aim of this paper is to encourage further thought and discussion on various topics which, in the author's experience, are becoming increasingly critical in current large software production and development projects, inside and outside high energy physics (HEP). This is done by briefly exploring some of the software engineering ideas and technologies now used in the information industry, using, as a case study, a project with many similarities to those currently under way in HEP.

  7. Initial experience with the NRC significance determination process

    International Nuclear Information System (INIS)

    Madison, A.L.

    2001-01-01

    The U.S. Nuclear Regulatory Commission (NRC) has revamped its inspection, assessment, and enforcement programs for commercial nuclear power plants. The new oversight process uses more objective, timely, and safety-significant criteria in assessing performance, while seeking to more effectively and efficiently regulate the industry. The NRC tested the new process at thirteen reactors at nine sites across the country on a pilot basis in 1999 to identify what things worked well and what improvements were called for before beginning Initial Implementation at all US nuclear power plants on April 2, 2000. After a year of experience has been gained with the new oversight process at all US plants, the NRC anticipates making further improvements based on this wider experience. (author)

  9. The BaBar Software Architecture and Infrastructure

    International Nuclear Information System (INIS)

    Cosmo, Gabriele

    2003-01-01

    The BaBar experiment has had in place since 1995 a software release system (SRT, Software Release Tools) based on CVS (Concurrent Versions System), which is common to all the software developed for the experiment, online or offline, simulation or reconstruction. A software release is a snapshot of all BaBar code (online, offline, utilities, scripts, makefiles, etc.). This set of code is tested to work together, and is indexed by a release number (e.g., 6.8.2) so a user can refer to a particular release and get reproducible results. A release will involve particular versions of packages. A package generally consists of a set of code for a particular task, together with a GNU makefile, scripts and documentation. All BaBar software is maintained in AFS (Andrew File System) directories, so the code is accessible worldwide within the collaboration. The combination of SRT, CVS and AFS has proven to be a valid, powerful and efficient way of organizing the software infrastructure of a modern HEP experiment with collaborating institutes distributed worldwide, in both the development and production phases.

  10. Scientific Software - the role of best practices and recommendations

    Science.gov (United States)

    Fritzsch, Bernadette; Bernstein, Erik; Castell, Wolfgang zu; Diesmann, Markus; Haas, Holger; Hammitzsch, Martin; Konrad, Uwe; Lähnemann, David; McHardy, Alice; Pampel, Heinz; Scheliga, Kaja; Schreiber, Andreas; Steglich, Dirk

    2017-04-01

    In Geosciences - like in most other communities - scientific work strongly depends on software. For big data analysis, existing (closed or open source) program packages are often mixed with newly developed codes. Different versions of software components and varying configurations can influence the result of data analysis. This often makes reproducibility of results and reuse of codes very difficult. Policies for publication and documentation of used and newly developed software, along with best practices, can help tackle this problem. Within the Helmholtz Association a Task Group "Access to and Re-use of scientific software" was implemented by the Open Science Working Group in 2016. The aim of the Task Group is to foster the discussion about scientific software in the Open Science context and to formulate recommendations for the production and publication of scientific software, ensuring open access to it. As a first step, a workshop gathered interested scientists from institutions across Germany. The workshop brought together various existing initiatives from different scientific communities to analyse current problems, share established best practices and come up with possible solutions. The subjects in the working groups covered a broad range of themes, including technical infrastructures, standards and quality assurance, citation of software and reproducibility. Initial recommendations are presented and discussed in the talk. They are the foundation for further discussions in the Helmholtz Association and the Priority Initiative "Digital Information" of the Alliance of Science Organisations in Germany. The talk aims to inform about the activities and to link with other initiatives on the national or international level.

  11. Development of a visualized software for tokamak experiment data processing

    International Nuclear Information System (INIS)

    Cao Jianyong; Ding Xuantong; Luo Cuiwen

    2004-01-01

    With VBA programming in Microsoft Excel, the authors have developed post-processing software for experimental data from tokamaks. The standard-format data from the HL-1M and HL-2A tokamaks can be read, displayed in Excel, and transmitted directly into the MATLAB workspace for plotting. The authors have also developed data post-processing software in the MATLAB environment, which can read standard-format data, display plots, supply a visual graphical user interface and provide some advanced signal-processing capabilities.

  12. Development of a software for a multi-processor system aimed at the on-line control of nuclear physics experiments

    International Nuclear Information System (INIS)

    Poggioli, Jean Renaud

    1984-01-01

    This research thesis reports on the development of software for an acquisition computer aimed at the on-line control of nuclear physics experiments. An original architecture, based on assigning a processor to each fundamental task, enables the implementation of a high-performance system. In order to free the user from programming constraints, the author developed software for the dynamic generation of acquisition and processing codes. These codes are created from a database which the user programs in a language close to the physical reality. Procedures for interactive control of the experiment are thus simplified by displaying function menus on the operator terminal. The author discusses possible hardware improvements and possible extensions of the system [fr

  13. Formalising and analysing the control software of the Compact Muon Solenoid Experiment at the Large Hadron Collider

    NARCIS (Netherlands)

    Hwong, Y.L.; Keiren, J.J.A.; Kusters, V.J.J.; Leemans, S.J.J.; Willemse, T.A.C.

    2013-01-01

    The control software of the CERN Compact Muon Solenoid experiment contains over 27 500 finite state machines. These state machines are organised hierarchically: commands are sent down the hierarchy and state changes are sent upwards. The sheer size of the system makes it virtually impossible to

  14. Verification and validation of control system software

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.; Kisner, R.A.; Bhadtt, S.C.

    1991-01-01

    The following guidelines are proposed for verification and validation (V&V) of nuclear power plant control system software: (a) use risk management to decide what and how much V&V is needed; (b) classify each software application using a scheme that reflects what type and how much V&V is needed; (c) maintain a set of reference documents with current information about each application; (d) use Program Inspection as the initial basic verification method; and (e) establish a deficiencies log for each software application. The following additional practices are strongly recommended: (a) use a computer-based configuration management system to track all aspects of development and maintenance; (b) establish reference baselines of the software, associated reference documents, and development tools at regular intervals during development; (c) use object-oriented design and programming to promote greater software reliability and reuse; (d) provide a copy of the software development environment as part of the package of deliverables; and (e) initiate an effort to use formal methods for preparation of Technical Specifications. The paper provides background information and reasons for the guidelines and recommendations. 3 figs., 3 tabs
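
    To make guideline (e) concrete, here is a minimal sketch of what a per-application deficiencies log might hold. All field names and the example entry are assumptions, not part of the proposed guidelines.

```python
# Hypothetical deficiencies log per software application (guideline (e)).
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Deficiency:
    found_on: date
    description: str
    severity: str          # e.g. "critical", "major", "minor" (assumed scale)
    found_by: str          # e.g. "Program Inspection", per guideline (d)
    resolved: bool = False

@dataclass
class DeficiencyLog:
    application: str
    entries: List[Deficiency] = field(default_factory=list)

    def open_items(self):
        """Entries still awaiting resolution."""
        return [d for d in self.entries if not d.resolved]

log = DeficiencyLog("feedwater-control")   # invented application name
log.entries.append(Deficiency(date(1991, 3, 1), "Unchecked sensor range",
                              "major", "Program Inspection"))
print(len(log.open_items()))  # 1
```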

  15. The Software Management Environment (SME)

    Science.gov (United States)

    Valett, Jon D.; Decker, William; Buell, John

    1988-01-01

    The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.

  16. Lessons Learned in Software Testing A Context-Driven Approach

    CERN Document Server

    Kaner, Cem; Pettichord, Bret

    2008-01-01

    Decades of software testing experience condensed into the most important lessons learned. The world's leading software testing experts lend you their wisdom and years of experience to help you avoid the most common mistakes in testing software. Each lesson is an assertion related to software testing, followed by an explanation or example that shows you the how, when, and why of the testing lesson. More than just tips, tricks, and pitfalls to avoid, Lessons Learned in Software Testing speeds you through the critical testing phase of the software development project without the extensive trial an

  17. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to achieve good software quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be desirable. In this paper, we propose function-oriented cohesion metrics based on the analysis of live variables, live span and the visualization of a processing-element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. A well-defined, well-normalized, well-visualized and experimentally validated cohesion metric is thus proposed to indicate and thus enhance software cohesion strength. Furthermore, these cohesion metrics can easily be incorporated into software CASE tools to help software engineers improve software quality.
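
    The paper's own metric relies on live-variable analysis and processing-element dependency graphs; as a much simpler stand-in, the sketch below scores cohesion as the average fraction of a function's statements in which each variable appears. The scoring rule and names are our own, not the authors' metric.

```python
# Toy cohesion score: 1.0 means every variable is used in every statement
# (single purpose); lower values suggest the function mixes concerns.

def cohesion(statements_vars):
    """statements_vars: list of sets, the variables used per statement."""
    all_vars = set().union(*statements_vars) if statements_vars else set()
    n = len(statements_vars)
    if not all_vars or n == 0:
        return 1.0
    usage = sum(sum(v in s for s in statements_vars) for v in all_vars)
    return usage / (len(all_vars) * n)

# A function whose statements all work on one variable is more cohesive
# than one juggling two unrelated variables.
print(cohesion([{"x"}, {"x"}, {"x"}]))       # 1.0
print(cohesion([{"x"}, {"y"}, {"x", "y"}]))  # ~0.67
```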

  18. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  19. Software Process Improvement: Where Is the Evidence?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2015-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what is the current state of SPI and related research. In this paper, we present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models like CMMI and ISO...

  20. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines...... the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides...... essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  1. Enabling software defined networking experiments in networked critical infrastructures

    Directory of Open Access Journals (Sweden)

    Béla Genge

    2014-05-01

    Full Text Available Nowadays, the fact that Networked Critical Infrastructures (NCI, e.g., power plants, water plants, oil and gas distribution infrastructures, and electricity grids, are targeted by significant cyber threats is well known. Nevertheless, recent research has shown that specific characteristics of NCI can be exploited in the enabling of more efficient mitigation techniques, while novel techniques from the field of IP networks can bring significant advantages. In this paper we explore the interconnection of NCI communication infrastructures with Software Defined Networking (SDN-enabled network topologies. SDN provides the means to create virtual networking services and to implement global networking decisions. It relies on OpenFlow to enable communication with remote devices and has been recently categorized as the “Next Big Technology”, which will revolutionize the way decisions are implemented in switches and routers. Therefore, the paper documents the first steps towards enabling an SDN-NCI and presents the impact of a Denial of Service experiment over traffic resulting from an XBee sensor network which is routed across an emulated SDN network.
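
    To illustrate the flow-rule style of control that SDN/OpenFlow enables, the sketch below models a flow table in memory; a drop rule installed ahead of a forwarding rule filters attacking traffic of the kind used in the Denial of Service experiment. This is a toy model, not a real OpenFlow controller, and the addresses are invented.

```python
# Toy in-memory model of an OpenFlow-style flow table.
flow_table = []  # ordered list of (match_dict, action)

def install_rule(match, action):
    flow_table.append((match, action))

def handle_packet(pkt):
    # First matching rule wins, mimicking priority-ordered flow entries.
    for match, action in flow_table:
        if all(pkt.get(k) == v for k, v in match.items()):
            return action
    return "send-to-controller"   # table miss, as in OpenFlow

# Drop everything from the (invented) attacking source before it reaches
# the sensor-network gateway; forward legitimate traffic on port 1.
install_rule({"src": "10.0.0.99"}, "drop")
install_rule({"dst": "10.0.0.1"}, "forward:1")

print(handle_packet({"src": "10.0.0.99", "dst": "10.0.0.1"}))  # drop
print(handle_packet({"src": "10.0.0.5", "dst": "10.0.0.1"}))   # forward:1
```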

  2. Multi-threaded software framework development for the ATLAS experiment

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00226135; Baines, John; Bold, Tomasz; Calafiura, Paolo; Dotti, Andrea; Farrell, Steven; Leggett, Charles; Malon, David; Ritsch, Elmar; Snyder, Scott; Tsulaia, Vakhtang; van Gemmeren, Peter; Wynne, Benjamin

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. ATLAS examined the requirements on an updated multi-threaded framework and laid out plans for a new framework, including better support for high level trigger (HLT) use cases, in 2014. In this paper we report on our progress in developing the new multi-threaded task parallel extension of Athena, AthenaMT. Implementing AthenaMT has required many significant code changes. Progress has been made in updating key concepts of the framework, to allow the incorporation of different levels of thread safety in algorithmic code (from un-migrated thread-unsafe code, to thread safe copyable code to reentrant co...
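
    As a language-neutral illustration of the reentrant end of that thread-safety spectrum (a Python sketch of our own, not ATLAS code), an algorithm that never mutates its own state can be shared safely by a task-parallel pool:

```python
# Reentrant-style algorithm: configuration is set once, all working data
# stays local to each call, so one instance can process events in parallel.
from concurrent.futures import ThreadPoolExecutor

class ReentrantAlgorithm:
    def __init__(self, threshold):
        self.threshold = threshold   # read-only after construction

    def execute(self, event):
        # No state is stored on self, so concurrent calls cannot interfere.
        hits = [h for h in event["hits"] if h > self.threshold]
        return {"event_id": event["id"], "n_selected": len(hits)}

algo = ReentrantAlgorithm(threshold=5.0)
events = [{"id": i, "hits": [1.0 * i, 6.0, 7.5]} for i in range(8)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(algo.execute, events))
print(results[0])   # {'event_id': 0, 'n_selected': 2}
```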

  3. Multi-threaded Software Framework Development for the ATLAS Experiment

    CERN Document Server

    Stewart, Graeme; The ATLAS collaboration; Baines, John; Calafiura, Paolo; Dotti, Andrea; Farrell, Steven; Leggett, Charles; Malon, David; Ritsch, Elmar; Snyder, Scott; Tsulaia, Vakhtang; van Gemmeren, Peter; Wynne, Benjamin

    2016-01-01

    ATLAS's current software framework, Gaudi/Athena, has been very successful for the experiment in LHC Runs 1 and 2. However, its single threaded design has been recognised for some time to be increasingly problematic as CPUs have increased core counts and decreased available memory per core. Even the multi-process version of Athena, AthenaMP, will not scale to the range of architectures we expect to use beyond Run2. ATLAS examined the requirements on an updated multi-threaded framework and laid out plans for a new framework, including better support for high level trigger (HLT) use cases, in 2014. In this paper we report on our progress in developing the new multi-threaded task parallel extension of Athena, AthenaMT. Implementing AthenaMT has required many significant code changes. Progress has been made in updating key concepts of the framework, to allow the incorporation of different levels of thread safety in algorithmic code (from un-migrated thread-unsafe code, to thread safe copyable code to reentrant c...

  4. Software for the alignment of the CMS experiment at CERN

    International Nuclear Information System (INIS)

    Arce, P.

    1999-01-01

    In the CMS experiment the position of the muon chambers has to be known with a precision of the order of 100 μm. With this aim a complex optical alignment system has been designed, which is composed of three parts that correspond to the main parts of the detector: the alignment of the barrel muon chambers, the alignment of the end cap muon chambers, and the link between both chamber systems and the inner tracker sub-detector. The total number of elements in the three systems is around seven thousand. The purpose of the CMS optical alignment software is to analyze the data taken by all these elements, reconstruct the position of the muon chambers with respect to each other and with respect to the inner tracker reference system, and propagate the errors of the measurements to the errors in the positions of the chambers. (author)

  5. Software complex AS (automation of spectrometry). User interface of experiment automation system implementation

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Bogdzel', A.A.; Butorin, P.E.; Vasilovskij, S.G.; Gundorin, N.A.; Zlokazov, V.B.; Kutuzov, S.A.; Salamatin, I.M.; Shvetsov, V.N.

    2003-01-01

    An instrumental software complex for the automation of spectrometry (AS) has been developed; it enables the prompt realization of experiment automation systems for spectrometers that use data buffering. New programming methods, new ways of building automation systems, and novel network technologies were employed in the development. It is suggested that programs to schedule and conduct experiments be based on a parametric model of the spectrometer, an approach that makes it possible to write programs suitable for any FLNP (Frank Laboratory of Neutron Physics) spectrometer and experimental technique and to use different hardware interfaces for feeding the spectrometric data into the data acquisition system. The article describes the possibilities provided to the user for scheduling and controlling the experiment, viewing data, and controlling the spectrometer parameters. The current spectrometer state, the programs and the experimental data can be presented on the Internet in the form of dynamically generated protocols and graphs, and the experiment can be controlled via the Internet. No application programs are needed on the client side to use these Internet facilities; it suffices to know how to use the two programs for carrying out experiments in automated mode. The package is designed for experiments in condensed matter and nuclear physics and is ready for use. (author)

  6. Intercomparison of alpha particle spectrometry software packages

    International Nuclear Information System (INIS)

    1999-08-01

    Software has reached an important level as the 'logical controller' at different levels, from a single instrument to an entire computer-controlled experiment. This is also the case for software packages in nuclear instruments and experiments. In particular, because of the range of applications of alpha-particle spectrometry, software packages in this field are often used. It is the aim of this intercomparison to test and describe the abilities of four such software packages. The main objectives of the intercomparison were the ability of the programs to determine the peak areas and the peak area uncertainties, and the statistical control and stability of reported results. In this report, the task, methods and results of the intercomparison are presented in order to assist potential users of such software and to stimulate the development of even better alpha-particle spectrum analysis software
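
    As a sketch of the central quantity being compared, a net peak area and its counting uncertainty can be computed under Poisson statistics with a flat-background subtraction. The spectrum values and the flat-background assumption are ours, for illustration only.

```python
# Net peak area with Poisson counting uncertainty (illustrative).
import math

def net_peak_area(counts, lo, hi, bkg_per_channel):
    """Sum channels lo..hi and subtract an estimated flat background."""
    gross = sum(counts[lo:hi + 1])
    bkg = bkg_per_channel * (hi - lo + 1)
    net = gross - bkg
    # Var(net) = Var(gross) + Var(bkg) for independent Poisson counts.
    sigma = math.sqrt(gross + bkg)
    return net, sigma

spectrum = [3, 4, 2, 15, 42, 57, 38, 12, 3, 2]   # invented channel counts
area, unc = net_peak_area(spectrum, 3, 7, bkg_per_channel=3.0)
print(f"net area = {area:.0f} +/- {unc:.1f} counts")
```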

  7. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  8. Top 10 metrics for life science software good practices.

    Science.gov (United States)

    Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel

    2016-01-01

    Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices, we assess their feasibility for implementation and publish them here.

  9. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  10. Proceedings of the Thirteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  11. Editorial Management serials online: construction process, publication and administration with free software solutions

    OpenAIRE

    Andrés Vuotto; María Carolina Rojas; Gladys Vanesa Fernández

    2013-01-01

    The paper initially raises the main points to consider when planning and building an online publication of a scientific nature, emphasizing the editorial process and functions, document preservation, access management, indexing and visibility. The second part of the paper presents a proposed solution for every aspect previously described, highlighting the work of the information professional and the optimization of time, cost and results offered by free software, based on a concrete experience with...

  12. Preliminary results from initial in-pile debris bed experiments

    International Nuclear Information System (INIS)

    Rivard, J.B.

    1977-01-01

    An accident in a liquid metal fast breeder reactor (LMFBR) in which molten core material is suddenly quenched with subcooled liquid sodium could result in extensive fragmentation and dispersal of fuel as subcritical beds of frozen particulate debris within the reactor vessel. Since this debris will continue to generate power due to decay of retained fission products, containment of the debris is threatened if the generated heat is not removed. Therefore, the initial safety question is the capacity which debris beds may have for transfer of the decay heat to overlying liquid sodium by natural processes--i.e., without the aid of forced circulation of the coolant. Up to the present time, all experiments on debris bed behavior either have used substitute materials (e.g., sand and water) or have employed actual materials, but atypical heating methods. Increased confidence in the applicability of debris bed simulations is afforded if the heat is generated within the fuel component of the appropriate fast reactor materials. The initial series of in-pile tests reported on herein constitutes the first experiments in which the internal heating mode has been produced in particulate oxide fuel immersed in liquid sodium. Fission heating of the fully-enriched UO₂ in the experiment while it is contained within Sandia Laboratories Annular Core Pulse Reactor (ACPR), operating in its steady-state mode, approximates the decay heating of debris. Preliminary results are discussed

  13. Analysis of molten fuel-coolant interaction during a reactivity-initiated accident experiment

    International Nuclear Information System (INIS)

    El-Genk, M.S.; Hobbins, R.R.

    1981-01-01

    The results of a reactivity-initiated accident experiment, designated RIA-ST-4, are discussed and analyzed with regard to molten fuel-coolant interaction (MFCI). In this experiment, extensive amounts of molten UO₂ fuel and zircaloy cladding were produced and fragmented upon mixing with the coolant. Coolant pressurization up to 35 MPa and coolant overheating in excess of 940 K occurred after fuel rod failure. The initial coolant conditions were similar to those in boiling water reactors during a hot startup (that is, coolant pressure of 6.45 MPa, coolant temperature of 538 K, and coolant flow rate of 85 cm³/s). It is concluded that the high coolant pressure recorded in the RIA-ST-4 experiment was caused by an energetic MFCI and was not due to gas release from the test rod at failure, Zr/water reaction, or to UO₂ fuel vapor pressure. The high coolant temperature indicated the presence of superheated steam, which may have formed during the expansion of the working fluid back to the initial coolant pressure; yet, the thermal-to-mechanical energy conversion ratio is estimated to be only 0.3%.

  14. Building Spaces of Exclusivity: An Ethnographic Approach to Indian and Colombian Women’s Role and Experience in Local Free Software Communities

    Directory of Open Access Journals (Sweden)

    Tania Pérez-Bustos

    2010-01-01

    Full Text Available This paper aims to account for the ways in which women in the free software community in two countries of the global South negotiate the feminizing paradigms imposed on them by the collectives interested in popularizing free technologies. Through an ethnographic approach to the lived experiences of women in the Indian collective Linux-Chix, in dialogue with the experiences of non-organized women in the free software community in Colombia, this paper suggests that these negotiations materialize primarily in the constitution of survival strategies through which certain civilizing projects are vindicated, some of which seem to promote a Western paradigm of female subjectivity.

  15. IsoDesign: a software for optimizing the design of 13C-metabolic flux analysis experiments.

    Science.gov (United States)

    Millard, Pierre; Sokol, Serguei; Letisse, Fabien; Portais, Jean-Charles

    2014-01-01

    The growing demand for ¹³C-metabolic flux analysis (¹³C-MFA) in the field of metabolic engineering and systems biology is driving the need to rationalize expensive and time-consuming ¹³C-labeling experiments. Experimental design is a key step in improving both the number of fluxes that can be calculated from a set of isotopic data and the precision of flux values. We present IsoDesign, a software that enables these parameters to be maximized by optimizing the isotopic composition of the label input. It can be applied to ¹³C-MFA investigations using a broad panel of analytical tools (MS, MS/MS, ¹H NMR, ¹³C NMR, etc.) individually or in combination. It includes a visualization module to intuitively select the optimal label input depending on the biological question to be addressed. Applications of IsoDesign are described, with an example of the entire ¹³C-MFA workflow from the experimental design to the flux map including important practical considerations. IsoDesign makes the experimental design of ¹³C-MFA experiments more accessible to a wider biological community. IsoDesign is distributed under an open source license at http://metasys.insa-toulouse.fr/software/isodes/ © 2013 Wiley Periodicals, Inc.

  16. Article I. Multi-platform Automated Software Building and Packaging

    International Nuclear Information System (INIS)

    Rodriguez, A Abad; Gomes Gouveia, V E; Meneses, D; Capannini, F; Aimar, A; Di Meglio, A

    2012-01-01

    One of the major goals of the EMI (European Middleware Initiative) project is the integration of several components of the pre-existing middleware (ARC, gLite, UNICORE and dCache) into a single consistent set of packages with uniform distributions and repositories. Those individual middleware projects have been developed over the last decade by tens of development teams and, before EMI, were all built and tested using different tools and dedicated services. The software, millions of lines of code, is written in several programming languages and supports multiple platforms. Therefore a viable solution ought to be able to build and test applications in multiple programming languages using common dependencies on all selected platforms. It should, in addition, package the resulting software in formats compatible with the popular Linux distributions, such as Fedora and Debian, and store them in repositories from which all EMI software can be accessed and installed in a uniform way. Despite this highly heterogeneous initial situation, a single common solution, with the aim of quickly automating the integration of the middleware products, had to be selected and implemented within a few months of the beginning of the EMI project. Because of the previous knowledge and the short time available in which to provide this common solution, the ETICS service, where the gLite middleware had already been built for years, was selected. This contribution describes how the team in charge of providing a common EMI build and packaging infrastructure to the whole project has developed a homogeneous solution for releasing and packaging the EMI components, starting from the initial set of tools used by the earlier middleware projects. An important element of the presentation is the developers' experience and feedback on converging on ETICS, and the ongoing work to add the more widely used and supported build and packaging solutions of the Linux platforms.

  17. Introducing Software Engineering by means of Extreme Programming

    DEFF Research Database (Denmark)

    Hedin, G.; Bendix, Lars Gotfred; Magnusson, B.

    2003-01-01

    This paper reports on experience from teaching basic software engineering concepts by using Extreme Programming in a second year undergraduate course taken by 107 students. We describe how this course fits into a wider programme on software engineering and technology and report our experience from...... running and improving the course. Particularly important aspects of our set-up includes team coaching (by older students) and "team-in-one-room". Our experience so far is very positive and we see that students get a good basic understanding of the important concepts in software engineering, rooted...

  18. Testing Object-Oriented Software

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Madsen, Ole Lehrmann; Skov, Stefan H.

    The report is a result of an activity within the project Centre for Object Technology (COT), case 2. In case 2 a number of pilot projects have been carried out to test the feasibility of using object technology within embedded software. Some of the pilot projects have resulted in prototypes that are currently being developed into production versions. To assure a high quality in the product it was decided to carry out an activity regarding issues in testing OO software. The purpose of this report is to discuss the issues of testing object-oriented software. It is often claimed that testing of OO software is radically different from testing traditional software developed using imperative/procedural programming. Other authors claim that there is no difference. In this report we will attempt to give an answer to these questions (or at least initiate a discussion).

  19. Two‐year experience with the commercial Gamma Knife Check software

    Science.gov (United States)

    Bhatnagar, Jagdish; Bednarz, Greg; Novotny, Josef; Flickinger, John; Lunsford, L. Dade; Huq, M. Saiful

    2016-01-01

    The Gamma Knife Check software is an FDA approved second check system for dose calculations in Gamma Knife radiosurgery. The purpose of this study was to evaluate the accuracy and the stability of the commercial software package as a tool for independent dose verification. The Gamma Knife Check software version 8.4 was commissioned for a Leksell Gamma Knife Perfexion and a 4C unit at the University of Pittsburgh Medical Center in May 2012. Independent dose verifications were performed using this software for 319 radiosurgery cases on the Perfexion and 283 radiosurgery cases on the 4C units. The cases on each machine were divided into groups according to their diagnoses, and an averaged absolute percent dose difference for each group was calculated. The percentage dose difference for each treatment target was obtained as the relative difference between the Gamma Knife Check dose and the dose from the tissue maximum ratio algorithm (TMR 10) from the GammaPlan software version 10 at the reference point. For treatment plans with imaging skull definition, results obtained from the Gamma Knife Check software using the measurement‐based skull definition method are used for comparison. The collected dose difference data were also analyzed in terms of the distance from the treatment target to the skull, the number of treatment shots used for the target, and the gamma angles of the treatment shots. The averaged percent dose differences between the Gamma Knife Check software and the GammaPlan treatment planning system are 0.3%, 0.89%, 1.24%, 1.09%, 0.83%, 0.55%, 0.33%, and 1.49% for the trigeminal neuralgia, acoustic neuroma, arteriovenous malformation (AVM), meningioma, pituitary adenoma, glioma, functional disorders, and metastasis cases on the Perfexion unit. The corresponding averaged percent dose differences for the 4C unit are 0.33%, 1.2%, 2.78% 1.99%, 1.4%, 1.92%, 0.62%, and 1.51%, respectively. The dose difference is, in general, larger for treatment targets in the

  20. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not entail the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  1. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    International Nuclear Information System (INIS)

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B.Jr.; Penaflor, B.G.

    1999-01-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson “raw” data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: Initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: Obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: Provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters.
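
    Schematically, the processing step maps ratios of scattered-light signals in different spectral channels to an electron temperature via a calibration curve. The sketch below illustrates the idea with an invented calibration, not the actual DIII-D processing.

```python
# Illustrative Te extraction from a two-channel signal ratio.
import numpy as np

# Assumed monotonic calibration: channel ratio as a function of Te (eV).
te_grid    = np.array([50., 100., 300., 1000., 3000.])
ratio_grid = np.array([0.05, 0.12, 0.35, 0.80, 1.40])

def electron_temperature(sig_ch1, sig_ch2):
    """Map the measured channel ratio back to Te by interpolation."""
    return float(np.interp(sig_ch2 / sig_ch1, ratio_grid, te_grid))

print(f"Te ~ {electron_temperature(1.0, 0.35):.0f} eV")   # ~300 eV
```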

  2. Code Forking, Governance, and Sustainability in Open Source Software

    Directory of Open Access Journals (Sweden)

    Juho Lindman

    2013-01-01

    Full Text Available The right to fork open source code is at the core of open source licensing. All open source licenses grant the right to fork their code, that is to start a new development effort using an existing code as its base. Thus, code forking represents the single greatest tool available for guaranteeing sustainability in open source software. In addition to bolstering program sustainability, code forking directly affects the governance of open source initiatives. Forking, and even the mere possibility of forking code, affects the governance and sustainability of open source initiatives on three distinct levels: software, community, and ecosystem. On the software level, the right to fork makes planned obsolescence, versioning, vendor lock-in, end-of-support issues, and similar initiatives all but impossible to implement. On the community level, forking impacts both sustainability and governance through the power it grants the community to safeguard against unfavourable actions by corporations or project leaders. On the business-ecosystem level forking can serve as a catalyst for innovation while simultaneously promoting better quality software through natural selection. Thus, forking helps keep open source initiatives relevant and presents opportunities for the development and commercialization of current and abandoned programs.

  3. Control and test software for IRAM WideX correlator

    International Nuclear Information System (INIS)

    Blanchet, S.; Broguiere, D.; Chavatte, P.; Morel, F.; Perrigouard, A.; Torres, M.

    2012-01-01

    IRAM is an international research institute for radio astronomy. It has designed a new correlator called WideX for the Plateau de Bure interferometer (an array of six 15-meter telescopes) in the French Alps. The device started its official service in February 2010. This correlator must be driven in real time at 32 Hz, both for sending parameters and for data acquisition. With 3.67 million channels distributed over 1792 dedicated chips, producing a 1.87 Gbit/s output data rate, the data acquisition and processing as well as the automatic hardware-failure detection are big challenges for the software. This article presents the software that has been developed to drive and test the correlator. In particular it presents an innovative use of a high-speed optical link, initially developed for the CERN ALICE experiment, associated with real-time Linux (RTAI) to achieve these goals. (authors)
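
    The quoted figures are mutually consistent, as a quick back-of-the-envelope check shows. The channel count, cycle rate and data rate come from the abstract; the bits-per-channel estimate is our own inference.

```python
# Rough consistency check on the correlator figures quoted above.
channels   = 3.67e6     # correlator channels
frame_rate = 32.0       # real-time cycles per second
data_rate  = 1.87e9     # output, bits per second

bits_per_frame   = data_rate / frame_rate      # ~58.4 Mbit per cycle
bits_per_channel = bits_per_frame / channels   # ~16 bits per channel
print(f"{bits_per_frame / 1e6:.1f} Mbit per cycle, "
      f"~{bits_per_channel:.0f} bits per channel")
```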

  4. Initial acceptance test experience with FFTF plant equipment

    International Nuclear Information System (INIS)

    Brown, R.K.; Coleman, K.A.; Mahaffey, M.K.; McCargar, C.G.; Young, M.W.

    1978-09-01

    The purpose of this paper is to examine the initial acceptance test experience of certain pieces of auxiliary equipment of the Fast Flux Test Facility (FFTF). The scope focuses on the DHX blowers and drive train, inert gas blowers, H and V containment isolation valves, and the Surveillance and In-service Inspection (SISI) transporter and trolley. For each type of equipment, the discussion includes a summary of the design and system function, installation history, preoperational acceptance testing procedures and results, and unusual events and resolutions

  5. Expert software for accident identification

    International Nuclear Information System (INIS)

    Dobnikar, M.; Nemec, T.; Muehleisen, A.

    2003-01-01

    Each type of accident in a Nuclear Power Plant (NPP) causes, immediately after its onset, variations in physical parameters that are typical of that accident type, thus enabling its identification. Examples of these parameters are: decrease of reactor coolant system pressure, increase of radiation level in the containment, increase of pressure in the containment. An expert software package enabling fast preliminary identification of the type of accident in the Krsko NPP has been developed. Selected typical parameters from the Emergency Response Data System (ERDS) of the Krsko NPP are used as input data. Based on these parameters, the expert software identifies the type of the accident and also provides the user with appropriate references (past analyses and other documentation of such an accident). The expert software is to be used as a support tool by the expert team that forms in case of an emergency at the Slovenian Nuclear Safety Administration (SNSA), with the task of determining the cause of the accident, its most probable scenario and the source term. The expert software should provide initial identification of the event, while the final identification is still to be made after appropriate assessment of the event by the expert group, considering the possibility of non-typical events, multiple causes, initial conditions, influences of operators' actions etc. The expert software can also be used as an educational/training tool and even as a simple database of available accident analyses. (author)
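
    The underlying idea, mapping characteristic early trends of a few parameters to an accident type, can be pictured as a small rule table. The sketch below is purely illustrative; the parameter names, thresholds, and rules are hypothetical, not those of the Krsko tool.

```python
# Toy rule-based accident identifier: each accident type is characterized by
# the direction of change of a few typical parameters shortly after onset.
# Rules, names, and thresholds are hypothetical.
RULES = {
    "LOCA (loss-of-coolant accident)": {
        "rcs_pressure": "falling",
        "containment_pressure": "rising",
        "containment_radiation": "rising",
    },
    "Main steam line break": {
        "rcs_pressure": "falling",
        "containment_pressure": "rising",
        "containment_radiation": "steady",
    },
}

def trend(samples, eps=0.01):
    delta = samples[-1] - samples[0]
    return "rising" if delta > eps else "falling" if delta < -eps else "steady"

def identify(parameters):
    """parameters: dict mapping parameter name -> list of recent samples."""
    observed = {name: trend(values) for name, values in parameters.items()}
    return [acc for acc, rule in RULES.items()
            if all(observed.get(name) == t for name, t in rule.items())]

print(identify({"rcs_pressure": [155.0, 120.0],
                "containment_pressure": [1.0, 1.6],
                "containment_radiation": [0.1, 0.9]}))  # matches the LOCA rule
```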

  6. Physics Validation of the LHC Software

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    The LHC Software will be confronted with unprecedented challenges as soon as the LHC turns on. We summarize the main software requirements coming from the LHC detectors, triggers and physics, and we discuss several examples of software components developed by the experiments and the LCG project (simulation, reconstruction, etc.), their validation, and their adequacy for LHC physics.

  7. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    Science.gov (United States)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

    The market is becoming more and more competitive, many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal carefully with software, whether developing or acquiring it. One perspective that can help to take advantage of software, supporting the business effectively, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents experience from its use and the results obtained.

  8. The Community Reclaims Control? Learning Experiences from Rural Broadband Initiatives in the Netherlands

    NARCIS (Netherlands)

    Salemink, Koen; Strijker, Dirk; Bosworth, Gary

    2017-01-01

    Based on four illustrative case studies from the Netherlands, this article discusses learning experiences gained from rural broadband initiatives. As an example of the 'big society' (or 'participatiesamenleving' in Dutch), initiatives try to step in where the market and the government fail. The main

  9. A communication-channel-based representation system for software

    NARCIS (Netherlands)

    Demirezen, Zekai; Tanik, Murat M.; Aksit, Mehmet; Skjellum, Anthony

    We observed that before initiating software development the objectives are minimally organized and developers introduce comparatively higher organization throughout the design process. To be able to formally capture this observation, a new communication channel representation system for software is

  10. "Elite" Career-Changers and Their Experience of Initial Teacher Education

    Science.gov (United States)

    Wilkins, Chris

    2017-01-01

    This study explores the motivation of "high-status" professionals to change career and enter teaching, and their experience of undertaking initial teacher education (ITE) programmes in England. The study builds on previous research which found that career-changers are disproportionately more likely to fail to complete their ITE studies,…

  11. Numerical and Experimental Validation of a New Damage Initiation Criterion

    Science.gov (United States)

    Sadhinoch, M.; Atzema, E. H.; Perdahcioglu, E. S.; van den Boogaard, A. H.

    2017-09-01

    Most commercial finite element software packages, like Abaqus, have a built-in coupled damage model where a damage evolution needs to be defined in terms of a single fracture energy value for all stress states. The Johnson-Cook criterion has been modified to be Lode parameter dependent, and this Modified Johnson-Cook (MJC) criterion is used as a Damage Initiation Surface (DIS) in combination with the built-in Abaqus ductile damage model. An exponential damage evolution law has been used with a single fracture energy value. Ultimately, the simulated force-displacement curves are compared with experiments to validate the MJC criterion; 7 out of 9 fracture experiments were predicted accurately. The limitations and accuracy of the failure predictions of the newly developed damage initiation criterion will be discussed briefly.
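
    For orientation, an exponential, fracture-energy-based damage evolution of the kind referred to here is commonly written as follows (generic symbols; this is the textbook Abaqus-style form in outline, not necessarily the authors' exact expression):

    $$ D = 1 - \exp\!\left(-\int_0^{\bar{u}^{pl}} \frac{\sigma_y}{G_f}\, d\bar{u}^{pl}\right), $$

    where $\bar{u}^{pl}$ is the equivalent plastic displacement accumulated after damage initiation, $\sigma_y$ the yield stress, and $G_f$ the single fracture energy value mentioned above; $D$ grows from 0 at initiation towards 1 at full failure.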

  12. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  13. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by the stimulus-response methods contains all the methods for data processing for radiotracer experiments.
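
    As an example of the kind of computation such software automates, the mean residence time and spread follow from the moments of the measured tracer response curve. A minimal sketch (assuming an already background-corrected response; not the package's actual code):

```python
# Minimal RTD moment analysis from a measured tracer response C(t).
# Assumes the response is already background-corrected.
import numpy as np

def trapezoid(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def rtd_moments(t, c):
    area = trapezoid(c, t)                     # zeroth moment (normalization)
    e = c / area                               # residence time distribution E(t)
    mrt = trapezoid(t * e, t)                  # mean residence time
    var = trapezoid((t - mrt) ** 2 * e, t)     # variance of residence times
    return mrt, var

t = np.linspace(0.0, 60.0, 601)                # s
c = t * np.exp(-t / 10.0)                      # example response (gamma-shaped)
mrt, var = rtd_moments(t, c)                   # mrt is ~20 s for this curve
```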

  14. Reflective approach for software design decision making

    NARCIS (Netherlands)

    Razavian, M.; Tang, A.; Capilla, R.; Lago, P.

    2016-01-01

    Good software design practice is difficult to define and teach. Despite the many software design methods and processes that are available, the quality of software design relies on human factors. We notice from literature and our own experiments that some of these factors concern design reasoning and

  15. Variation of the Young's modulus with plastic strain applying to elastoplastic software

    International Nuclear Information System (INIS)

    Morestin, F.; Boivin, M.

    1993-01-01

    Work hardening of steel involves modifications of the elastic properties of the material, for instance an increase of its yield stress. It may also cause an appreciable decrease of the Young's modulus, which diminishes as plastic strain increases. Experiments with a microcomputer-controlled tensile test machine indicated that the diminution could reach more than 10% of the initial value after only 5% of plastic strain. In spite of this fact, many elastoplastic software packages do not combine the decrease of the Young's modulus with plastification, though neglecting it may cause obvious differences in results. As an application we have developed a software package which computes the deformation of steel sheet in press forming, after springback. This software takes into account the decrease of the Young's modulus, and its results are very close to experimental values. Incidentally, we noticed a recovery of the Young's modulus of plastified specimens after a few days, but not for all steels tested. (author)
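
    A widely used empirical description of this effect expresses the chord modulus as a saturating function of equivalent plastic strain (this is the Yoshida-Uemori form, given here for orientation; the paper's own fit may differ):

    $$ E(\varepsilon_p) = E_0 - (E_0 - E_a)\left(1 - e^{-\xi \varepsilon_p}\right), $$

    where $E_0$ is the virgin modulus, $E_a$ its saturated value at large plastic strain, and $\xi$ a material constant; choosing $E_a \approx 0.9\,E_0$ reproduces a drop of roughly 10% of the initial value, consistent with the measurements above.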

  16. A Software Configuration Management Course

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    Software Configuration Management has been a big success in research and creation of tools. There are also many vendors in the market of selling courses to companies. However, in the education sector Software Configuration Management has still not quite made it - at least not into the university...... curriculum. It is either not taught at all or is just a minor part of a general course in software engineering. In this paper, we report on our experience with giving a full course entirely dedicated to Software Configuration Management topics and start a discussion of what ideally should be the goal...

  17. Dynamic crack initiation toughness : experiments and peridynamic modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Foster, John T.

    2009-10-01

    This dissertation presents research on the dynamic crack initiation toughness of a 4340 steel. Researchers have been conducting experimental testing of dynamic crack initiation toughness, K_Ic, for many years, using many experimental techniques, with vastly different trends in the results when reporting K_Ic as a function of loading rate. The dissertation describes a novel experimental technique for measuring K_Ic in metals using the Kolsky bar. The method borrows from improvements made in recent years in traditional Kolsky bar testing by using pulse shaping techniques to ensure a constant loading rate applied to the sample before crack initiation. Dynamic crack initiation measurements were reported on a 4340 steel at two different loading rates. The steel was shown to exhibit a rate dependence, with the recorded values of K_Ic being much higher at the higher loading rate. Using the knowledge of this rate dependence as a motivation in attempting to model the fracture events, a viscoplastic constitutive model was implemented into a peridynamic computational mechanics code. Peridynamics is a newly developed theory in solid mechanics that replaces the classical partial differential equations of motion with integro-differential equations which do not require the existence of spatial derivatives in the displacement field. This allows for the straightforward modeling of unguided crack initiation and growth. To date, peridynamic implementations have used severely restricted constitutive models. This research represents the first implementation of a complex material model and its validation. After showing results comparing deformations to experimental Taylor anvil impact for the viscoplastic material model, a novel failure criterion is introduced to model the dynamic crack initiation toughness experiments. The failure model is based on an energy criterion and uses the K_Ic values recorded experimentally as an input. The failure model
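
    For context, the bond-based peridynamic equation of motion mentioned here replaces the divergence of the stress tensor with an integral over a finite neighbourhood (the "horizon") of each material point:

    $$ \rho(x)\,\ddot{u}(x,t) = \int_{\mathcal{H}_x} f\big(u(x',t)-u(x,t),\; x'-x\big)\, dV_{x'} + b(x,t), $$

    where $f$ is the pairwise force function and $b$ a body force density. Because no spatial derivatives of the displacement $u$ appear, displacement discontinuities such as cracks need no special treatment, which is what permits the unguided crack initiation and growth described above.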

  18. Blast experiments for the derivation of initial cloud dimensions after a "Dirty Bomb" event

    International Nuclear Information System (INIS)

    Thielen, H.; Schroedl, E.

    2004-01-01

    The basis for the assessment of potential consequences of a "dirty bomb" event is the calculation of the atmospheric dispersion of airborne particles. The empirical derivation of parameters for the estimation of the initial pollutant cloud dimensions was the principal purpose of blast experiments performed at the Munster training area in summer 2003, with the participation of several highly engaged German organisations and institutions. The experiments were performed under variation of parameters like mass and kind of explosive, subsurface characteristics or meteorological conditions, and were documented by digital video recording. The blast experiments supplied significant results under reproducible conditions. The initial cloud dimension was primarily influenced by the explosive mass. The influence of other parameters was relatively small and within the range of the experimental uncertainties. Based on these experimental results, a new correlation was determined for the empirical estimation of the initial cloud dimensions as a function of explosive mass. The observed initial cloud volumes were more than an order of magnitude smaller than those calculated with other widely used formulas (e.g. HOTSPOT). As a smaller volume of the initial cloud leads to higher near-ground concentration maxima, our results support an appropriate adjustment of currently employed calculation methods. (orig.)

  19. Software for radiation protection

    International Nuclear Information System (INIS)

    Graffunder, H.

    2002-01-01

    The software products presented are universally usable programs for radiation protection. The systems were designed to establish a comprehensive database specific to radiation protection and, on this basis, to model radiation protection subjects in programs. Development initially focused on the creation of the database. Each software product was to access the same nuclide-specific data; input errors and differences in spelling were to be excluded from the outset. This makes the products more compatible with each other and able to exchange data among each other. The software products are modular in design. Functions recurring in radiation protection are always treated the same way in different programs, and also represented the same way on the program surface. The recognition effect makes it easy for users to familiarize themselves with the products quickly. All software products are written in German and are tailored to the administrative needs and codes and regulations in Germany and Switzerland. (orig.)

  20. Observation-Driven Configuration of Complex Software Systems

    Science.gov (United States)

    Sage, Aled

    2010-06-01

    The ever-increasing complexity of software systems makes them hard to comprehend, predict and tune due to emergent properties and non-deterministic behaviour. Complexity arises from the size of software systems and the wide variety of possible operating environments: the increasing choice of platforms and communication policies leads to ever more complex performance characteristics. In addition, software systems exhibit different behaviour under different workloads. Many software systems are designed to be configurable so that policies can be chosen to meet the needs of various stakeholders. For complex software systems it can be difficult to accurately predict the effects of a change and to know which configuration is most appropriate. This thesis demonstrates that it is useful to run automated experiments that measure a selection of system configurations. Experiments can find configurations that meet the stakeholders' needs, find interesting behavioural characteristics, and help produce predictive models of the system's behaviour. The design and use of ACT (Automated Configuration Tool) for running such experiments is described, in combination with a number of search strategies for deciding on the configurations to measure. Design Of Experiments (DOE) is discussed, with emphasis on Taguchi Methods. These statistical methods have been used extensively in manufacturing, but have not previously been used for configuring software systems. The novel contribution here is an industrial case study, applying the combination of ACT and Taguchi Methods to DC-Directory, a product from Data Connection Ltd (DCL). The case study investigated the applicability of Taguchi Methods for configuring complex software systems. Taguchi Methods were found to be useful for modelling and configuring DC-Directory, making them a valuable addition to the techniques available to system administrators and developers.
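
    To make the Taguchi idea concrete: rather than measuring all 2^3 = 8 combinations of three two-level configuration factors, an L4 orthogonal array estimates the main effects from four balanced runs. The sketch below illustrates the scheme only; the factor names and the measure() benchmark are hypothetical, not DC-Directory settings.

```python
# Screening three two-level configuration factors with a Taguchi L4(2^3)
# orthogonal array: 4 balanced runs instead of the full 8. Factor names and
# the measure() benchmark are hypothetical placeholders.
L4 = [  # one row per run; one column per factor (level 0 or 1)
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
FACTORS = [("cache_mb", (64, 512)), ("threads", (2, 8)), ("batching", (False, True))]

def measure(config):
    # Stand-in benchmark returning a response, e.g. mean request latency (ms).
    return 100 - 20 * config["threads"] / 8 - 5 * config["batching"]

responses = []
for row in L4:
    config = {name: levels[bit] for (name, levels), bit in zip(FACTORS, row)}
    responses.append(measure(config))

# Main effect of each factor: mean response at level 1 minus at level 0.
for i, (name, _) in enumerate(FACTORS):
    level1 = sum(r for row, r in zip(L4, responses) if row[i] == 1) / 2
    level0 = sum(r for row, r in zip(L4, responses) if row[i] == 0) / 2
    print(f"{name}: main effect {level1 - level0:+.2f}")
```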

  1. Comparison of the initial ETA gas propagation experiments with theoretical models

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, F.W.; Clark, J.C.; Fessenden, T.J.

    1982-04-20

    This report contains a description of the initial ETA propagation experiments in air at a beam current of 4.5 kA. The beam was observed to propagate at the pressures anticipated on the basis of previous theory and experiment. A comparison of measured net current waveforms with predictions of the PHOENIX code showed good agreement over the pressure range 0.1 to 200 torr. However, the beam was observed to expand with Z at a faster rate than theory predicts. Excessive transverse beam modulation at injection complicated the experiments and limited their comparison with theory.

  2. Comparison of the initial ETA gas propagation experiments with theoretical models

    International Nuclear Information System (INIS)

    Chambers, F.W.; Clark, J.C.; Fessenden, T.J.

    1982-01-01

    This report contains a description of the initial ETA propagation experiments in air at a beam current of 4.5 kA. The beam was observed to propagate at the pressures anticipated on the basis of previous theory and experiment. A comparison of measured net current waveforms with predictions of the PHOENIX code showed good agreement over the pressure range 0.1 to 200 torr. However, the beam was observed to expand with Z at a faster rate than theory predicts. Excessive transverse beam modulation at injection complicated the experiments and limited their comparison with theory

  3. Shock initiation experiments on ratchet grown PBX 9502

    Energy Technology Data Exchange (ETDEWEB)

    Gustavsen, Richard L [Los Alamos National Laboratory; Thompson, Darla G [Los Alamos National Laboratory; Olinger, Barton W [Los Alamos National Laboratory; Deluca, Racci [Los Alamos National Laboratory; Bartram, Brian D [Los Alamos National Laboratory; Pierce, Timothy H [Los Alamos National Laboratory; Sanchez, Nathaniel J [Los Alamos National Laboratory

    2010-01-01

    This study compares the shock initiation behavior of PBX 9502 pressed to less than nominal density (nominal density is 1.890 ± 0.005 g/cm³) with PBX 9502 pressed to nominal density and then "ratchet grown" to low density. PBX 9502 is an insensitive plastic bonded explosive consisting of 95 weight % dry-aminated tri-amino-tri-nitro-benzene (TATB) and 5 weight % Kel-F 800 plastic binder. "Ratchet growth" - an irreversible increase in specific volume - occurs when an explosive based on TATB is temperature cycled. The design of our study is as follows: PBX 9502, all from the same lot, received the following treatments. Samples in the first group were pressed to less than nominal density; these were not ratchet grown and were used as a baseline. Samples in the second group were pressed to nominal density and then ratchet grown by temperature cycling 30 times between -54 C and +80 C. Samples in the final group were pressed to nominal density and cut into 100 mm by 25.4 mm diameter cylinders; during thermal cycling the cylinders were axially constrained by a 100 psi load, and samples for shock initiation experiments were cut perpendicular (disks) and parallel (slabs) to the axial load. The four sample groups can be summarized with the terms pressed low, ratchet grown/no load, axial load/disks, and axial load/slabs. All samples were shock initiated with nearly identical inputs in plate impact experiments carried out on a gas gun. Wave profiles were measured after propagation through 3, 4, 5, and 6 mm of explosive. Side-by-side comparison of wave profiles from different samples is used as a measure of relative sensitivity. All reduced density samples were more shock sensitive than nominal density PBX 9502. Differences in shock sensitivity between ratchet grown and pressed-to-low-density PBX 9502 were small, but the low density pressings are slightly more sensitive than the ratchet grown samples.

  4. Software Quality Assurance for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Sparkman, D R; Lagdon, R

    2004-01-01

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate its nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software, and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: (1) ensures the software processes developed to address nuclear safety in design, operation, construction and maintenance of its facilities are safe; (2) considers the larger system that uses the software and its impacts; (3) ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  5. Worldwide collaborative efforts in plasma control software development

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Walker, M.L.; Humphreys, D.A.; Leuer, J.A.; Piglowski, D.A.; Johnson, R.D.; Xiao, B.J.; Hahn, S.H.; Gates, D.A.

    2008-01-01

    This presentation will describe the DIII-D collaborations with various tokamak experiments throughout the world which have adapted custom versions of the DIII-D plasma control system (PCS) software for their own use. Originally developed by General Atomics for use on the DIII-D tokamak, the PCS has been successfully installed and used for the NSTX experiment in Princeton, the MAST experiment in Culham UK, the EAST experiment in China, and the Pegasus experiment in the University of Wisconsin. In addition to these sites, a version of the PCS is currently being developed for use by the KSTAR tokamak in Korea. A well-defined and robust PCS software infrastructure has been developed to provide a common foundation for implementing the real-time data acquisition and feedback control codes. The PCS infrastructure provides a flexible framework that has allowed the PCS to be easily adapted to fulfill the unique needs of each site. The software has also demonstrated great flexibility in allowing for different computing, data acquisition and real-time networking hardware to be used. A description of the current PCS software architecture will be given along with experiences in developing and supporting the various PCS installations throughout the world

  6. Challenges of Implementing Free and Open Source Software (FOSS): Evidence from the Indian Educational Setting

    Science.gov (United States)

    Thankachan, Briju; Moore, David Richard

    2017-01-01

    The use of Free and Open Source Software (FOSS), a subset of Information and Communication Technology (ICT), can reduce the cost of purchasing software. Despite the benefit in the initial purchase price, deploying software incurs a total cost that goes beyond the initial purchase price. Total cost is a silent issue of FOSS and can only…

  7. Software Security and the "Building Security in Maturity" Model

    CERN Document Server

    CERN. Geneva

    2011-01-01

    Using the framework described in my book "Software Security: Building Security In" I will discuss and describe the state of the practice in software security. This talk is peppered with real data from the field, based on my work with several large companies as a Cigital consultant. As a discipline, software security has made great progress over the last decade. Of the sixty large-scale software security initiatives we are aware of, thirty-two---all household names---are currently included in the BSIMM study. Those companies among the thirty-two who graciously agreed to be identified include: Adobe, Aon, Bank of America, Capital One, The Depository Trust & Clearing Corporation (DTCC), EMC, Google, Intel, Intuit, McKesson, Microsoft, Nokia, QUALCOMM, Sallie Mae, Standard Life, SWIFT, Symantec, Telecom Italia, Thomson Reuters, VMware, and Wells Fargo. The BSIMM was created by observing and analyzing real-world data from thirty-two leading software security initiatives. The BSIMM can...

  8. HDR brachytherapy in carcinoma of cervix: initial experience at AWARE hospitals

    International Nuclear Information System (INIS)

    Rajendran, M.; Reddy, K.D.; Reddy, R.M.; Reddy, J.M.; Reddy, B.V.N.; Kiran Kumar; Gopi, S.; Dharaniraj; Janardhanan

    2002-01-01

    High dose rate (HDR) brachytherapy is well established in the management of gynaecological malignancies. A report on the initial results of one and a half years' experience with a consistent dose/fractionation schedule and a consistent procedure for planning and delivery of treatment is presented.

  9. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software package for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  10. LHCb software strategy

    CERN Document Server

    Van Herwijnen, Eric

    1998-01-01

    This document describes the software strategy of the LHCb experiment. The main objective is to reuse designs and code wherever possible; we will implement an architecturally driven design process; this architectural process will be implemented using Object Technology; we aim for platform independence; we will try to take advantage of distributed computing and will use industry standards, commercial software and profit from HEP developments; we will implement a common software process and development environment. One of the major problems that we are immediately faced with is the conversion of our current code from Fortran into an Object Oriented language and the conversion of our current developers to Object technology. Some technical terms related to OO programming are defined in Annex A.1.

  11. The Social Dynamics of Software Development

    NARCIS (Netherlands)

    Heiskanen, A.; Newman, M.; Simila, J.

    2000-01-01

    A variety of experiences in software development processes between a public sector organisation and several software vendors over a decade-long period are described and interpreted. Three information systems histories are presented as case examples and their analysis is based on detailed insider

  12. A REVIEW OF SOFTWARE-INDUCED FAILURE EXPERIENCE.

    Energy Technology Data Exchange (ETDEWEB)

    CHU, T.L.; MARTINEZ-GURIDI, G.; YUE, M.; LEHNER, J.

    2006-09-01

    We present a review of software-induced failures in commercial nuclear power plants (NPPs) and in several non-nuclear industries. We discuss the approach used for connecting operational events related to these failures and the insights gained from this review. In particular, we elaborate on insights that can be used to model this kind of failure in a probabilistic risk assessment (PRA) model. We present the conclusions reached in these areas.

  13. OntoSoft: A Software Commons for Geosciences

    Science.gov (United States)

    Gil, Y.

    2015-12-01

    The goal of the EarthCube OntoSoft project is to enable the creation of a germinal ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets in an open transparent mode that enables broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a scientific software repository that contains more than 600 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance. This training program is part of a Geoscience Papers of the Future Initiative, where scientists learn as they are writing a journal paper that can be submitted to a Special Section of the AGU Earth and Space Science Journal.

  14. KTM Tokamak operation scenarios software infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Pavlov, V.; Baystrukov, K.; Golobkov, YU.; Ovchinnikov, A.; Meaentsev, A.; Merkulov, S.; Lee, A. [National Research Tomsk Polytechnic University, Tomsk (Russian Federation); Tazhibayeva, I.; Shapovalov, G. [National Nuclear Center (NNC), Kurchatov (Kazakhstan)

    2014-10-15

    One of the largest problems for tokamak devices such as the Kazakhstan Tokamak for Material Testing (KTM) is the development and execution of operation scenarios. Operation scenarios may vary often, so a convenient hardware and software solution is required for scenario management and execution. Dozens of diagnostic and control subsystems with numerous configuration settings may be used in an experiment, so the subsystem configuration process must be automated to coordinate changes of related settings and to prevent errors. Most of the diagnostic and control subsystem software at KTM was unified using an extra software layer describing the hardware abstraction interface. The experiment sequence was described using a command language. The whole infrastructure was brought together by a universal communication protocol supporting various media, including Ethernet and serial links. The operation sequence execution infrastructure was used at KTM to carry out plasma experiments.
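
    The pattern described (subsystems unified behind a hardware-abstraction layer and driven by a scenario of commands) might be sketched as below. This is illustrative only; the class and method names are hypothetical, not the KTM API.

```python
# Illustrative hardware-abstraction layer plus a simple scenario executor.
# Names are hypothetical placeholders, not the KTM software interfaces.
from abc import ABC, abstractmethod

class Subsystem(ABC):
    """Uniform interface that every diagnostic/control subsystem implements."""
    @abstractmethod
    def configure(self, settings: dict): ...
    @abstractmethod
    def execute(self, command: str, **args): ...

class GasPuffValve(Subsystem):
    def configure(self, settings):
        self.pressure_pa = settings.get("pressure_pa", 0.0)
    def execute(self, command, **args):
        if command == "puff":
            print(f"puffing {args['ms']} ms at {self.pressure_pa} Pa")

def run_scenario(subsystems, scenario):
    """scenario: ordered list of (subsystem_name, command, args) steps."""
    for name, command, args in scenario:
        subsystems[name].execute(command, **args)

valve = GasPuffValve()
valve.configure({"pressure_pa": 120.0})
run_scenario({"gas": valve}, [("gas", "puff", {"ms": 50})])
```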

  15. How to make x-ray simulation software working on WWW: a simple recipe based on seven years of experience

    International Nuclear Information System (INIS)

    Stepanov, S.

    2004-01-01

    Attaching WWW interfaces to scientific software opens new opportunities to researchers by making their results available to the wide scientific community in a way complementary to publication. We have shown that this task may be much easier than many used to think: the amount of additional code is small, the Common Gateway Interface (CGI) can be written in any language, not necessarily Perl, and the software can be interfaced on whatever operating system it was originally written for; it does not have to be ported to UNIX. This paper provides some useful recipes resulting from seven years of the author's experience in developing and maintaining a highly successful X-ray Web server project. All these solutions are based on free public-domain software (Apache, GnuPlot, and InfoZip) and are applicable to multiple computer platforms. Some practical examples are provided.
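
    The recipe really is small: a CGI program reads the query string from its environment and writes an HTTP header plus the payload to standard output. A minimal sketch in Python (the parameter names are arbitrary examples; as the paper notes, any language works the same way):

```python
#!/usr/bin/env python3
# Minimal CGI script: parse query parameters, run a computation, emit HTML.
# Parameter names are arbitrary examples, not those of the X-ray server.
import os
from urllib.parse import parse_qs

params = parse_qs(os.environ.get("QUERY_STRING", ""))
energy = float(params.get("energy_kev", ["8.0"])[0])
angle = float(params.get("angle_deg", ["0.5"])[0])

result = energy * angle  # stand-in for the actual simulation call

print("Content-Type: text/html")
print()                  # blank line separates headers from the body
print(f"<html><body><p>Result: {result:.4f}</p></body></html>")
```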

  16. The Software Point of View

    CERN Multimedia

    Bentvelsen, Stan

    Physics was meant to be the topic of the LUND workshop, but it was software that dominated throughout the week. ATLAS' new software proves a tough nut to crack. Many of the presented physics analyses were repetitions of the Physics TDR, using (or trying to use) the new software. The main purpose was to demonstrate that results were invariant with the new software - hard to prove, since the 'new software' is not yet completed. Nevertheless, the software that exists so far is being experimented with massively throughout the entire ATLAS community. So what does this "new software" really mean? The answer depends strongly on the person who deals with this question. The scope varies from "migrating from Fortran77 to C++" and "learning the Object Oriented approach" to "implementing services and algorithms in Athena" or "feeding detector description into FADS". Clearly the new software primarily involves migrating from the old Fortran code to C++ with its object orientation paradigm. Occasionally there was a hint that this migration is not c...

  17. SISCOM imaging : an initial South African experience

    International Nuclear Information System (INIS)

    Warwick, J.; Rubow, S.; Van Heerden, B.; Ghoorun, S.; Butler, J.

    2004-01-01

    Full text: Subtraction ictal SPECT co-registered with MRI (SISCOM) is a new technique utilized for the detection and localization of epileptogenic foci in patients with refractory focal epilepsy who are candidates for surgical resection. The technique requires many challenges to be overcome, in particular in relation to the administration of the radiopharmaceutical, acquisition of brain SPECT and the conversion, co-registration and fusion of brain SPECT and MRI studies. Furthermore, the interpretation of the studies is complex and is ideally performed in a multidisciplinary context in cooperation with disciplines such as neurology, radiology, psychiatry and neurosurgery. Materials and methods: Two brain SPECT studies are performed using 99mTc-ethyl cysteinate dimer (ECD). An ictal study is performed after the administration of the 99mTc-ECD during a seizure. An interictal SPECT, performed between seizures, is then subtracted from the ictal SPECT, and the difference image fused with an MRI study to optimise localization of the epileptogenic focus. Image conversion, co-registration and fusion were performed using MRIcro and SPM software. Results: To date the Departments of Neurology and Nuclear Medicine have completed over 10 SISCOM studies. Conclusion: During this presentation this initial work will be presented. The methodology as well as the challenges involved in performing and interpreting these studies will be discussed. Individual cases will be used to illustrate the impact of this powerful technique on future patient management. (author)

  18. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  19. Negotiation and Decision Making with Collaborative Software: How MarineMap 'Changed the Game' in California's Marine Life Protected Act Initiative.

    Science.gov (United States)

    Cravens, Amanda E

    2016-02-01

    Environmental managers and planners have become increasingly enthusiastic about the potential of decision support tools (DSTs) to improve environmental decision-making processes as information technology transforms many aspects of daily life. Discussions about DSTs, however, rarely recognize the range of ways software can influence users' negotiation, problem-solving, or decision-making strategies and incentives, in part because there are few empirical studies of completed processes that used technology. This mixed-methods study-which draws on data from approximately 60 semi-structured interviews and an online survey--examines how one geospatial DST influenced participants' experiences during a multi-year marine planning process in California. Results suggest that DSTs can facilitate communication by creating a common language, help users understand the geography and scientific criteria in play during the process, aid stakeholders in identifying shared or diverging interests, and facilitate joint problem solving. The same design features that enabled the tool to aid in decision making, however, also presented surprising challenges in certain circumstances by, for example, making it difficult for participants to discuss information that was not spatially represented on the map-based interface. The study also highlights the importance of the social context in which software is developed and implemented, suggesting that the relationship between the software development team and other participants may be as important as technical software design in shaping how DSTs add value. The paper concludes with considerations to inform the future use of DSTs in environmental decision-making processes.


  1. Lean software development in action

    CERN Document Server

    Janes, Andrea

    2014-01-01

    This book illustrates how goal-oriented, automated measurement can be used to create Lean organizations and to facilitate the development of Lean software, while also demonstrating the practical implementation of Lean software development by combining tried and trusted tools. In order to be successful, a Lean orientation of software development has to go hand in hand with a company's overall business strategy. To achieve this, two interrelated aspects require special attention: measurement and experience management. In this book, Janes and Succi provide the necessary knowledge to establish "

  2. The Effects of Development Team Skill on Software Product Quality

    Science.gov (United States)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, which was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  3. OntoSoft: A Software Registry for Geosciences

    Science.gov (United States)

    Garijo, D.; Gil, Y.

    2017-12-01

    The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.

  4. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on CERN virtual machine (CernVM). Further, a HTTP-based file system, CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  5. Bonaparte: Application of new software for missing persons program

    NARCIS (Netherlands)

    van Dongen, C.J.J.; Slooten, K.; Slagter, M.; Burgers, W.; Wiegerinck, W.

    2011-01-01

    The Netherlands Forensic Institute (NFI), together with SNN at Radboud University Nijmegen, has developed new software for pedigree matching which can handle autosomal, Y-chromosomal and mitochondrial DNA profiles. Initially, this software, called Bonaparte, was developed for DNA DVI. Bonaparte

  6. Analysis and proposal of the new architecture of the selected parts of the software support of the COMPASS experiment

    CERN Document Server

    Jary, Vladimir

    This work focuses on the data acquisition system of the COMPASS experiment at CERN. First, the current database subsystem, which suffered from increased load during 2009, is analysed. The causes of the problems are identified, and a new architecture that includes replication, backups, and monitoring for achieving high availability and reliability is proposed and implemented. Several advanced database features, including partitioned tables and storage engines, are described and tested. Then, the process of implementing remote control and monitoring of the experiment is explained. As the existing data acquisition system is partly based on deprecated technologies, development of a new architecture has started. We focus on requirements analysis and the proposal of control and monitoring software for the new hardware platform based on FPGA technology. The software is to be deployed in a heterogeneous network environment. According to the proposal, the system is built on the DIM communication library. Ro...

  7. Experiences in Teaching a Graduate Course on Model-Driven Software Development

    Science.gov (United States)

    Tekinerdogan, Bedir

    2011-01-01

    Model-driven software development (MDSD) aims to support the development and evolution of software-intensive systems using the basic concepts of model, metamodel, and model transformation. In parallel with the ongoing academic research, MDSD is more and more applied in industrial practice. After being accepted both by a broad community of…

  8. 49 CFR 238.105 - Train electronic hardware and software safety.

    Science.gov (United States)

    2010-10-01

    ... and software system safety as part of the pre-revenue service testing of the equipment. (d)(1... safely by initiating a full service brake application in the event of a hardware or software failure that... 49 Transportation 4 2010-10-01 2010-10-01 false Train electronic hardware and software safety. 238...

  9. Reducing the risk of failure: Software Quality assurance standards and methods

    International Nuclear Information System (INIS)

    Elphick, J.; Cope, H.

    1992-01-01

    An effective Software Quality Assurance (SQA) program provides an overall approach to software engineering and the establishment of proven methods for the production of reliable software. In the authors' experience, the overall costs over the software life cycle are diminished with the application of quality methods, and the issues involved in implementing quality standards and practices are many. This paper addresses those issues as well as the lessons learned from developing and implementing a number of software quality assurance programs. The authors' experience includes the development and implementation of their own NRC-accepted SQA program and an SQA program for an engineering software developer, as well as developing SQA procedures, standards, and methods for utilities, medical and commercial clients. Some of the issues addressed in this paper are: setting goals and defining quality; applying the software life cycle; addressing organizational issues; providing flexibility and increasing productivity; producing effective documentation; maintaining quality records; imposing software configuration management; conducting reviews, audits, and controls; verification and validation; and controlling software procurement

  10. Using neural networks in software repositories

    Science.gov (United States)

    Eichmann, David (Editor); Srinivas, Kankanahalli; Boetticher, G.

    1992-01-01

    The first topic is an exploration of the use of neural network techniques to improve the effectiveness of retrieval in software repositories. The second topic relates to a series of experiments conducted to evaluate the feasibility of using adaptive neural networks as a means of deriving (or more specifically, learning) measures on software. Taken together, these two efforts illuminate a very promising mechanism supporting software infrastructures - one based upon a flexible and responsive technology.

  11. Commissioning and initial experimental program of the BGO-OD experiment at ELSA

    Science.gov (United States)

    Alef, S.; Bauer, P.; Bayadilov, D.; Beck, R.; Becker, M.; Bella, A.; Bielefeldt, P.; Böse, S.; Braghieri, A.; Brinkmann, K.; Cole, P.; Di Salvo, R.; Dutz, H.; Elsner, D.; Fantini, A.; Freyermuth, O.; Friedrich, S.; Frommberger, F.; Ganenko, V.; Geffers, D.; Gervino, G.; Ghio, F.; Görtz, S.; Gridnev, A.; Gutz, E.; Hammann, D.; Hannappel, J.; Hillert, W.; Ignatov, A.; Jahn, R.; Joosten, R.; Jude, T. C.; Klein, F.; Knaust, J.; Kohl, K.; Koop, K.; Krusche, B.; Lapik, A.; Levi Sandri, P.; Lopatin, I. V.; Mandaglio, G.; Messi, F.; Messi, R.; Metag, V.; Moricciani, D.; Mushkarenkov, A.; Nanova, M.; Nedorezov, V.; Novinskiy, D.; Pedroni, P.; Reitz, B.; Romaniuk, M.; Rostomyan, T.; Rudnev, N.; Schaerf, C.; Scheluchin, G.; Schmieden, H.; Stugelev, A.; Sumachev, V.; Tarakanov, V.; Vegna, V.; Walther, D.; Watts, D.; Zaunick, H.; Zimmermann, T.

    2016-11-01

    BGO-OD is a new meson photoproduction experiment at the ELSA facility of Bonn University. It aims at the investigation of non-strange and strange baryon excitations, and is especially designed to be able to detect weakly bound meson-baryon type structures. The setup of the BGO-OD experiment is presented, the characteristics of the photon beam and the detector performance are shown, and the initial experimental program is discussed.

  12. What's Happening in the Software Engineering Laboratory?

    Science.gov (United States)

    Pajerski, Rose; Green, Scott; Smith, Donald

    1995-01-01

    Since 1976 the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. This paper presents an overview of recent activities and studies in the SEL, using as a framework the SEL's organizational goals and experience-based software improvement approach. It focuses on two SEL experience areas: (1) the evolution of the measurement program and (2) an analysis of three generations of Cleanroom experiments.

  13. Optimization of Software Job Scheduling at a Software House with the Flow-Shop Problem Using Artificial Bee Colony

    Directory of Open Access Journals (Sweden)

    Muhammad Fhadli

    2016-12-01

    This research proposed an implementation of the software execution scheduling process at a software house as a Flow-Shop Problem (FSP), solved using the Artificial Bee Colony (ABC) algorithm. In FSP, a schedule must be found that completes a set of jobs/tasks with minimum overall cost. A constraint to note in this research is the uncertain completion time of the jobs. We present a solution in the form of a sequence order of project execution with minimum overall completion time. Experiments were performed with 3 attempts for each experimental condition: one varying the iteration parameter and one varying the limit parameter. From these experiments, we conclude that the algorithm explained in this paper can reduce project execution time as the total number of iterations and the colony size are increased. Keywords: optimization, flow-shop problem, artificial bee colony, swarm intelligence, meta-heuristic.
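
    To fix ideas: for a given job order, the flow-shop objective is the makespan given by the recurrence C(i,j) = max(C(i-1,j), C(i,j-1)) + p(i,j) over sequence positions i and machines j. A bare-bones ABC-style search over job permutations is sketched below (the general scheme only, with arbitrary toy data; not the paper's implementation or parameter settings):

```python
# Bare-bones ABC-style search for permutation flow-shop scheduling.
# A sketch of the general scheme, not the paper's implementation.
import random

def makespan(order, p):
    """p[j][m]: processing time of job j on machine m; order: a job permutation."""
    completion = [0.0] * len(p[0])
    for j in order:
        for m in range(len(p[0])):
            prev = completion[m - 1] if m else 0.0   # same job, previous machine
            completion[m] = max(completion[m], prev) + p[j][m]
    return completion[-1]

def neighbour(order):
    a, b = random.sample(range(len(order)), 2)       # swap two jobs
    s = list(order)
    s[a], s[b] = s[b], s[a]
    return s

def abc_search(p, n_food=10, limit=20, iterations=200):
    n = len(p)
    foods = [random.sample(range(n), n) for _ in range(n_food)]
    trials = [0] * n_food
    best = min(foods, key=lambda o: makespan(o, p))
    for _ in range(iterations):
        for i in range(n_food):                      # employed/onlooker phases, merged
            cand = neighbour(foods[i])
            if makespan(cand, p) < makespan(foods[i], p):
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > limit:                    # scout phase: abandon stale source
                foods[i], trials[i] = random.sample(range(n), n), 0
        best = min(foods + [best], key=lambda o: makespan(o, p))
    return best, makespan(best, p)

times = [[3, 2, 4], [1, 4, 2], [5, 1, 3], [2, 3, 1]]  # 4 jobs x 3 machines
order, cmax = abc_search(times)
print(order, cmax)
```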

  14. Lessons learned from development and quality assurance of software systems at the Halden Project

    International Nuclear Information System (INIS)

    Bjorlo, T.J.; Berg, O.; Pehrsen, M.; Dahll, G.; Sivertsen, T.

    1996-01-01

    The OECD Halden Reactor Project has developed a number of software systems within the research programmes. These programmes have comprised a wide range of topics, like studies of software for safety-critical applications, development of different operator support systems, and software systems for building and implementing graphical user interfaces. The systems have ranged from simple prototypes to installations in process plants. In the development of these software systems, Halden has gained much experience in quality assurance of different types of software. This paper summarises the accumulated experience at the Halden Project in quality assurance of software systems. The different software systems being developed at the Halden Project may be grouped into three categories. These are plant-specific software systems (one-of-a-kind deliveries), generic software products, and safety-critical software systems. This classification has been found convenient as the categories have different requirements to the quality assurance process. In addition, the experience from use of software development tools and proprietary software systems at Halden, is addressed. The paper also focuses on the experience gained from the complete software life cycle, starting with the software planning phase and ending with software operation and maintenance

  15. Landscape Builder: Software for the creation of initial landscapes for LANDIS from FIA data

    Directory of Open Access Journals (Sweden)

    William Dijak

    2013-06-01

    I developed Landscape Builder to create spatially explicit landscapes as starting conditions for the LANDIS Pro 7.0 and LANDIS II landscape forest simulation models from classified satellite imagery and Forest Inventory and Analysis (FIA) data collected over multiple years. LANDIS Pro and LANDIS II project future landscapes by simulating tree growth, tree species succession, disease, insects, fire, wind, and management disturbance. Landscape Builder uses inventory plot attributes from the FIA inventory database, the FIA unit map, National Forest type map, National Forest size class map, land cover map, and landform map to assign FIA plot attributes to raster pixels representing a real forest landscape. In addition to creating a detailed map of current (initial) forest landscape conditions, the software produces the specific files required for use in LANDIS Pro 7.0 or LANDIS II format. Other tools include the ability to create a dominant species and age-class map from previously created LANDIS maps, a tool to create a dominant species and age-class map from a stand map and field plot data, and a tool to convert between Esri ASCII rasters and Erdas file format types.

  16. Modular Infrastructure for Rapid Flight Software Development

    Science.gov (United States)

    Pires, Craig

    2010-01-01

    This slide presentation reviews the use of modular infrastructure to assist in the development of flight software. A feature of this program is the use of a model-based approach for application-unique software. Two programs on which this approach was used are reviewed: the development of software for the Hover Test Vehicle (HTV), and the Lunar Atmosphere and Dust Environment Experiment (LADEE).

  17. Where Does the Time Go in Software DSMs?--Experiences with JIAJIA

    Institute of Scientific and Technical Information of China (English)

    SHI Weisong; HU Weiwu; TANG Zhimin

    1999-01-01

    The performance gap between software DSM systems and message passing platforms has greatly limited the prevalence of software DSM systems, though great efforts have been devoted to this area in the past decade. In this paper, we take up the challenge of finding where we should focus our efforts in future designs. The components of the total system overhead of software DSM systems are first analyzed in detail. Based on a state-of-the-art software DSM system, JIAJIA, we measure these components on the Dawning parallel system and draw five important conclusions which differ from some traditional viewpoints. (1) The performance of the JIAJIA software DSM system is acceptable: for four of eight applications, the parallel efficiency achieved by JIAJIA is about 80%, while for two others 70% efficiency can be obtained. (2) 40.94% of interrupt service time is overlapped with waiting time. (3) Encoding and decoding diffs do not cost much time (<1%), so using hardware support to encode/decode diffs and send/receive messages is not worthwhile. (4) Great endeavours should be put into reducing the data miss penalty and optimizing synchronization operations, which occupy 11.75% and 13.65% of total execution time, respectively. (5) Communication hardware overhead occupies 66.76% of the whole communication time in the experimental environment, and communication software overhead does not take as much time as expected. Moreover, by studying the effect of CPU speed on system overhead, we find that the common speedup formula for distributed memory systems does not hold for software DSM systems. Therefore, we design a new speedup formula specific to software DSM systems, and point out that when the CPU speed increases the speedup can increase too, even if the network speed is fixed, which is impossible in message passing systems. Finally, we argue that the JIAJIA system has the desired scalability.
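
    The abstract does not reproduce the new speedup formula itself; as a hedged point of reference, the conventional distributed-memory form that the authors report breaking down can be written as follows, with the DSM-specific twist being that part of the overhead is CPU-bound software time that shrinks as the CPU gets faster, so speedup can still grow at fixed network speed.

```latex
% Conventional distributed-memory speedup (illustrative; not the paper's
% new formula). T_1: sequential time, p: processors, T_ovh: overhead.
S(p) = \frac{T_1}{T_1/p + T_{\mathrm{ovh}}}, \qquad
T_{\mathrm{ovh}} = \underbrace{T_{\mathrm{net}}}_{\text{fixed network time}}
                 + \underbrace{T_{\mathrm{sw}}}_{\propto 1/v_{\mathrm{cpu}}}
```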

  18. Initial experience of tritium exposure control at JET

    International Nuclear Information System (INIS)

    Patel, B.; Campling, D.C.; Schofield, P.A.; Macheta, P.; Sandland, K.

    1998-01-01

    Some of the safety procedures and controls in place for work with tritium are described, and initial operational experience of handling tritium is discussed. A description is given of work to rectify a water leak in a JET neutral beam heating component, which involved man-access to a confined volume to perform repairs at tritium levels of about 100 DAC (80 MBq/m³ HTO). Control measures involving the use of purge and extract ventilation, and personal protection using air-fed pressurized suits, are described. Results are given of the internal doses to project staff and of atmospheric discharges of tritium during the repair outage. (P.A.)

  19. Discrete Choice Experiments: A Guide to Model Specification, Estimation and Software.

    Science.gov (United States)

    Lancsar, Emily; Fiebig, Denzil G; Hole, Arne Risa

    2017-07-01

    We provide a user guide on the analysis of data (including best-worst and best-best data) generated from discrete-choice experiments (DCEs), comprising a theoretical review of the main choice models followed by practical advice on estimation and post-estimation. We also provide a review of standard software. In providing this guide, we endeavour not only to provide guidance on choice modelling but to do so in a way that provides a 'way in' for researchers to the practicalities of data analysis. We argue that the choice of modelling approach depends on the research questions, study design and constraints in terms of quality/quantity of data, and that decisions made in relation to analysis of choice data are often interdependent rather than sequential. Given that the core theory and estimation of choice models are common across settings, we expect the theoretical and practical content of this paper to be useful to researchers not only within but also beyond health economics.
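
    As a self-contained illustration of the kind of model the guide reviews, the sketch below estimates a conditional logit on simulated choice data by direct maximum likelihood; it is not code from the paper, and the data, names, and dimensions are invented.

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(beta, X, chosen):
    """Conditional logit: X has shape (n_sets, n_alts, n_attrs);
    chosen[i] is the index of the alternative picked in choice set i."""
    v = X @ beta                        # utilities, shape (n_sets, n_alts)
    v -= v.max(axis=1, keepdims=True)   # guard against overflow
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(chosen)), chosen].sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3, 2))   # 500 choice sets, 3 alternatives, 2 attributes
true_beta = np.array([1.0, -0.5])
utility = X @ true_beta + rng.gumbel(size=(500, 3))
chosen = utility.argmax(axis=1)    # simulated choices follow an MNL model

fit = minimize(neg_loglik, np.zeros(2), args=(X, chosen), method="BFGS")
print(fit.x)                       # estimates close to [1.0, -0.5]
```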

  20. FREE SOFTWARE IN EDUCATION: AN EXPERIENCE IN A TEACHERS’ TRAINING COURSES

    Directory of Open Access Journals (Sweden)

    Daniele da Rocha Schneider

    2016-12-01

    This paper approaches the use of free operating systems and software in a teacher training course. We problematize the need to develop the digital and technological fluency (contemporary skills, fundamental concepts and intellectual capabilities) of future teachers, considering the pedagogical application of the main educational applications. A conceptual and theoretical review was performed, followed by an analysis of the "Free Software in Education" course proposal, which is offered in undergraduate teaching degrees at the Federal University of Rio Grande do Sul. The results show the course to be a differentiated training that allows the development of digital and technological fluency in free technologies, boosting the use of different software for the development of innovative pedagogical practices.

  1. Development of workflow planning software and a tracking study of the decay B± → J/ψ at the D0 Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Evans, David Edward [Lancaster Univ. (United Kingdom)

    2003-09-01

    A description of the development of the mc_runjob software package used to manage large-scale computing tasks for the D0 Experiment at Fermilab is presented, along with a review of the Digital Front End Trigger electronics and the software used to control them. A tracking study is performed on detector data to establish that the D0 Experiment can detect charged B mesons and that the results are in accordance with current measurements. B mesons are found by searching for the decay channel B± → J/ψ K±.

  2. Fast and convenient data analysis software at the BGO-OD experiment

    Energy Technology Data Exchange (ETDEWEB)

    Freyermuth, Oliver [Physikalisches Institut, Universitaet Bonn (Germany); Collaboration: BGO-OD-Collaboration

    2016-07-01

    The BGO-OD experiment, located at the ELSA accelerator in Bonn, uses the electron beam with energies up to 3.2 GeV for the investigation of meson photoproduction off the nucleon. The setup combines a central, highly segmented BGO crystal calorimeter with a forward magnetic spectrometer complemented by ToF walls. In total, over 5000 channels of diverse detectors are connected to the readout system. The data analysis for this complex setup is handled by a modular software suite derived from the ROOT-based analysis framework ExPlORA, originally developed by the CB-ELSA/TAPS collaboration in Bonn. This framework has now been heavily extended with a set of generic tools which can be used without knowledge of its internal design or extensive programming experience, while achieving the execution speed of compiled code. The underlying concept as well as its performance will be presented. Secondly, methods of data preprocessing will be discussed. Since the analysis chain is based on object-oriented data structures, it can easily be segmented by storing intermediate preprocessed datasets. A technique to prune lower-level information was developed. Finally, it is illustrated that ExPlORA is closely entangled with recent and upcoming developments in C++ and ROOT; on this basis it is equipped with tools assisting in development and testing.

  3. Initial experience with robotic pancreatic surgery in Singapore: single institution experience with 30 consecutive cases.

    Science.gov (United States)

    Goh, Brian K P; Low, Tze-Yi; Lee, Ser-Yee; Chan, Chung-Yip; Chung, Alexander Y F; Ooi, London L P J

    2018-05-24

    Presently, the worldwide experience with robotic pancreatic surgery (RPS) is increasing although widespread adoption remains limited. In this study, we report our initial experience with RPS. This is a retrospective review of a single institution prospective database of 72 consecutive robotic hepatopancreatobiliary surgeries performed between 2013 and 2017. Of these, 30 patients who underwent RPS were included in this study of which 25 were performed by a single surgeon. The most common procedure was robotic distal pancreatectomy (RDP) which was performed in 20 patients. This included eight subtotal pancreatectomies, two extended pancreatecto-splenectomies (en bloc gastric resection) and 10 spleen-saving-RDP. Splenic preservation was successful in 10/11 attempted spleen-saving-RDP. Eight patients underwent pancreaticoduodenectomies (five hybrid with open reconstruction), one patient underwent a modified Puestow procedure and one enucleation of uncinate tumour. Four patients had extended resections including two RDP with gastric resection and two pancreaticoduodenectomies with vascular resection. There was one (3.3%) open conversion and seven (23.3%) major (>Grade II) morbidities. Overall, there were four (13.3%) clinically significant (Grade B) pancreatic fistulas of which three required percutaneous drainage. These occurred after three RDP and one robotic enucleation. There was one reoperation for port-site hernia and no 30-day/in-hospital mortalities. The median post-operative stay was 6.5 (range: 3-36) days and there were six (20%) 30-day readmissions. Our initial experience showed that RPS can be adopted safely with a low open conversion rate for a wide variety of procedures including pancreaticoduodenectomy. © 2018 Royal Australasian College of Surgeons.

  4. Emerging Pathogens Initiative (EPI)

    Data.gov (United States)

    Department of Veterans Affairs — The Emerging Pathogens Initiative (EPI) database contains emerging pathogens information from the local Veterans Affairs Medical Centers (VAMCs). The EPI software...

  5. Software for people fundamentals, trends and best practices

    CERN Document Server

    Maedche, Alexander; Neer, Ludwig

    2012-01-01

    The highly competitive and globalized software market is creating pressure on software companies. Given the current boundary conditions, it is critical to continuously shorten time-to-market and reduce development costs. In parallel, driven by private-life experiences with mobile computing devices, the World Wide Web and software-based services, people's general expectations with regard to software are growing. They expect software that is simple and joyful to use. In the light of the changes that have taken place in recent years, software companies need to fundamentally reconsider the way th

  6. Proceedings of the 14th Annual Software Engineering Workshop

    Science.gov (United States)

    1989-01-01

    Several software-related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain-directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis.

  7. FASTBUS software workshop

    International Nuclear Information System (INIS)

    1985-01-01

    FASTBUS is a standard for modular high-speed data acquisition, data processing and control, developed for use in high-energy physics experiments incorporating different types of computers and microprocessors. This Workshop brought together users from different laboratories for a review of current software activities, covering use of the standard both in experiments and for test equipment. There are also papers on interfacing and the present state of systems being developed for use in future LEP experiments. Also included is a discussion of the proposed revision of the FASTBUS Standard Routines. (orig.)

  8. A measurement system for large, complex software programs

    Science.gov (United States)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
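
    The abstract describes the cost and quality models only qualitatively; purely as an illustration of the multiplicative parametric shape such models commonly take (not the paper's calibrated models), one can write:

```latex
% Generic multiplicative parametric forms (illustrative shapes only):
% expected development effort for size S (KLOC) with criticality/environment
% multipliers c_i
E = a \, S^{\,b} \prod_i c_i
% expected error-discovery rate for size S, criticality multiplier m, and
% environment/competence factors f_j
\lambda = k \, S \, m \prod_j f_j
```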

  9. Testing of the assisting software for radiologists analysing head CT images: lessons learned.

    Science.gov (United States)

    Martynov, Petr; Mitropolskii, Nikolai; Kukkola, Katri; Gretsch, Monika; Koivisto, Vesa-Matti; Lindgren, Ilkka; Saunavaara, Jani; Reponen, Jarmo; Mäkynen, Anssi

    2017-12-11

    To assess a plan for user testing and evaluation of assisting software developed for radiologists. The test plan was assessed in experimental testing, in which users performed reporting on head computed tomography studies with the aid of the software developed. The user testing included usability tests, questionnaires, and interviews. In addition, search relevance was assessed on the basis of user opinions. The testing demonstrated weaknesses in the initial plan and enabled improvements. Results showed that the software has an acceptable usability level, but some minor fixes are needed before larger-scale pilot testing. The research also showed that it is possible even for radiologists with under a year's experience to perform reporting of non-obvious cases when assisted by the software developed. Due to the small number of test users, it was impossible to assess effects on diagnosis quality. The results of the tests performed showed that the test plan designed is useful, and answers to the key research questions should be forthcoming after testing with more radiologists. The preliminary testing revealed opportunities to improve the test plan and flow, thereby illustrating that arranging preliminary test sessions prior to any complex scenarios is beneficial.

  10. Quality assessment with the AGIR software results and experience

    International Nuclear Information System (INIS)

    Rauch, D.; Kotter, E.; Kurtz, C.; Schaefer, O.; Ehritt-Braun, C.; Burger, D.; Schaper, J.; Uhrmeister, P.

    2001-01-01

    Purpose: To evaluate whether new software from the working group for interventional radiology (AGIR) is an appropriate tool for quality assurance in interventional radiology, and presentation of results acquired within the quality improvement process in 1999. Patients and methods: AGIR-defined parameters such as patient data, risk profile, given interventions, as well as complications were registered by recently developed software. Based on monthly data analyses, possible complications were identified and discussed in morbidity and mortality conferences. Results: 1014 interventions were performed in our institution in 1999. According to criteria established by AGIR, the complication rate was 2.7%. In addition, and according to SCVIR criteria, complications were classified quantitatively in five classes and semiquantitatively into minor and major groups. The result was a minor complication rate of 1.8% and a major rate of 0.9%. There were no cases of death associated with the intervention. Further strategies were developed in order to reduce the complication rate. Conclusion: Extensive quality assurance methods can be integrated in daily routine work. These methods lead to an intensive transparency of treatment results, and allow the implementation of continuous quality improvements. The development of the software is a first step in establishing a nation-wide quality assurance system. Nevertheless, modification and additional definition of the AGIR predefined parameters are required, for example, to avoid unnecessary procedures. (orig.)

  11. The Relationship of a Pilot's Educational Background, Aeronautical Experience and Recency of Experience to Performance In Initial Training at a Regional Airline

    Science.gov (United States)

    Shane, Nancy R.

    The purpose of this study was to determine how a pilot's educational background, aeronautical experience and recency of experience relate to their performance during initial training at a regional airline. Results show that variables in pilots' educational background, aeronautical experience and recency of experience do predict performance in training. The most significant predictors include years since graduation from college, multi-engine time, total time and whether or not a pilot had military flying experience. Due to the pilot shortage, the pilots entering regional airline training classes since August 2013 have varied backgrounds, aeronautical experience and recency of experience. As explained by Edward Thorndike's law of exercise and the law of recency, pilots who are actively using their aeronautical knowledge and exercising their flying skills should exhibit strong performance in those areas and pilots who have not been actively using their aeronautical knowledge and exercising their flying skills should exhibit degraded performance in those areas. Through correlation, chi-square and multiple regression analysis, this study tests this theory as it relates to performance in initial training at a regional airline.

  12. ThermoData Engine (TDE): software implementation of the dynamic data evaluation concept. 5. Experiment planning and product design.

    Science.gov (United States)

    Diky, Vladimir; Chirico, Robert D; Kazakov, Andrei F; Muzny, Chris D; Magee, Joseph W; Abdulagatov, Ilmutdin; Kang, Jeong Won; Kroenlein, Kenneth; Frenkel, Michael

    2011-01-24

    ThermoData Engine (TDE) is the first full-scale software implementation of the dynamic data evaluation concept, as reported recently in this journal. In the present paper, we describe the development of an algorithmic approach to assist experiment planning through assessment of the existing body of knowledge, including the availability of experimental thermophysical property data, the variable ranges studied, the associated uncertainties, the state of prediction methods and the parameters needed to deploy them (and how these parameters can be obtained using targeted measurements), as well as how the intended measurement may address the underlying scientific or engineering problem under consideration. A second new feature described here is the application of the software capabilities to aid in the design of chemical products through identification of chemical systems possessing desired values of thermophysical properties within defined ranges of tolerance. The algorithms and their software implementation to achieve this are described. Finally, the implementation of a new data validation and weighting system is described for vapor-liquid equilibrium (VLE) data, and directions for future enhancements are outlined.

  13. Controlling S2 terminal using FS software

    Science.gov (United States)

    Xue, Zhuhe

    New S2FS software for controlling the S2 terminal of Sheshan station has been developed. It works under the Field System software; all S2 operation commands are incorporated in a station program. The SWT computer is connected to the S2 terminal through an RS232 interface. S2FS is written in shell script and C. It has been used in VSOP experiments.

  14. Initially curved microplates under electrostatic actuation: theory and experiment

    KAUST Repository

    Saghir, Shahid

    2016-07-01

    Microplates are the building blocks of many micro-electro-mechanical systems. It is common for them to experience initial curvature imperfection due to residual stresses caused by the microfabrication process. Such plates are essentially different from perfectly flat ones and cannot be modeled using flat plate models. In this paper, we adopt a dynamic analog of the von Karman governing equations of imperfect plates. These equations are then used to develop a reduced-order model, based on the Galerkin procedure, to simulate the static and dynamic behaviour of the microplate under electrostatic actuation. To validate the simulation results, an initially curved imperfect microplate made of silicon nitride is fabricated and tested. The static behaviour of the microplate is investigated when applying a DC voltage Vdc. Then, the dynamic behaviour of the microplate is examined under the application of a harmonic AC voltage, Vac, superimposed on Vdc. The simulation results show good agreement with the experimentally measured responses. © 2016 IOP Publishing Ltd.

  15. [Evaluation of the influence of humidity and temperature on the drug stability by initial average rate experiment].

    Science.gov (United States)

    He, Ning; Sun, Hechun; Dai, Miaomiao

    2014-05-01

    To evaluate the influence of temperature and humidity on drug stability by the initial average rate experiment, and to obtain the kinetic parameters. The effects of concentration error, extent of drug degradation, number of humidity and temperature levels, humidity and temperature ranges, and average humidity and temperature on the accuracy and precision of the kinetic parameters in the initial average rate experiment were explored. The stability of vitamin C, as a solid-state model, was investigated by an initial average rate experiment. Under the same experimental conditions, the kinetic parameters obtained from the proposed method were comparable to those from the classical isothermal experiment at constant humidity. The estimates were more accurate and precise when the extent of drug degradation was controlled, the humidity and temperature ranges were changed, or the average temperature was set closer to room temperature. Compared with isothermal experiments at constant humidity, our proposed method saves time, labor, and materials.
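
    The abstract does not state the kinetic model used; as a hedged illustration, accelerated-stability studies of this kind often assume a humidity-corrected Arrhenius dependence, with the initial average rate approximating the rate constant from the small early-time loss:

```latex
% Assumed humidity-corrected Arrhenius model (a common form; the paper's
% exact model is not given in the abstract):
\ln k = \ln A - \frac{E_a}{RT} + B \cdot RH
% Initial average rate as an estimate of k for small degradation extent:
k \approx \frac{C_0 - C(t)}{C_0\, t}
```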

  16. Certification of digital system software

    International Nuclear Information System (INIS)

    Waclo, J.; Cook, B.; Adomaitis, D.

    1991-01-01

    The successful application of digital systems to nuclear protection functions is not achieved through happenstance. At Westinghouse there has been a longstanding program to utilize state-of-the-art digital technology for protection system advancement, thereby gaining the advantages of increased system reliability, performance, ease of operation and reduced maintenance costs. This paper describes the Westinghouse background and experience in the safety system software development process, including verification and validation, and its application to protection system qualification, along with its successful use in licensing the Eagle 21 Digital Process Protection System Upgrade. In addition, the lessons learned from this experience are discussed from the perspective of improving the development process by feeding back measurements made on the process and on software product quality. The goal of this process optimization is to produce the highest possible software quality while recognizing the real-world constraints of available resources, project schedule and the regulatory policies that are customary in the nuclear industry.

  17. African American women's experiences with the initial discovery, diagnosis, and treatment of breast cancer.

    Science.gov (United States)

    Lackey, N R; Gates, M F; Brown, G

    2001-04-01

    To describe the experiences of African American women living with breast cancer following the primary diagnosis and while undergoing initial treatment. Phenomenologic. 13 African American women (ages 30-66) purposefully selected from two oncology clinics in the mid-South. Phenomenologic interviews (transcribed verbatim) and field notes were analyzed using Colaizzi's method of phenomenologic description and analysis. Experience Trajectory, Femininity, and Spirituality were the three major themes. The Experience Trajectory subthemes were finding the lump, getting the diagnosis, undergoing surgery and adjuvant treatment. The Femininity subthemes were loss of all or part of the breast, loss of hair, and sexual attractiveness to a man. Spirituality was reflected as a reliance on God. Telling the story of their experience trajectory during their breast cancer experience is valuable in assessing African American women's feelings, emotions, and fears of body changes that occur during surgery and treatment. Their spirituality helps them through this experience. Research involving both African American women and their partners would provide greater insight into specific relationship patterns and communication related to sexuality during this experience. Nurses need to listen to the stories of African American women about the initial experience of discovery, diagnosis, and treatment of breast cancer so they can be more informed advocates for these women. African American women need more information from healthcare providers regarding the whole experience trajectory.

  18. OPEN SOURCE SOFTWARE, FREE SOFTWARE?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to take open source software into consideration. There are many conceptions of open source software, ranging from software that is free of charge to software that is unlicensed. These notions are not entirely correct, so the concept of open source software needs to be introduced, from its history, to licenses and how to choose one, to the considerations in selecting among the available open source software. Keywords: license, open source, HAKI

  19. Artificial intelligence and expert systems in-flight software testing

    Science.gov (United States)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies, such as artificial intelligence, expert systems, and advanced human-computer interfaces, directly into Space Shuttle software engineering. The Reconfiguration Automation Project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into Space Shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired; there are currently three tools in use.

  20. Initial experiences of simultaneous laparoscopic resection of colorectal cancer and liver metastases

    NARCIS (Netherlands)

    Hoekstra, L. T.; Busch, O. R. C.; Bemelman, W. A.; van Gulik, T. M.; Tanis, P. J.

    2012-01-01

    Introduction. Simultaneous resection of primary colorectal carcinoma (CRC) and synchronous liver metastases (SLMs) is subject of debate with respect to morbidity in comparison to staged resection. The aim of this study was to evaluate our initial experience with this approach. Methods. Five patients

  1. Software Quality Control at Belle II

    Science.gov (United States)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high enough quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  2. AN INITIAL EVALUATION OF THE BTRACKS BALANCE PLATE AND SPORTS BALANCE SOFTWARE FOR CONCUSSION DIAGNOSIS.

    Science.gov (United States)

    Goble, Daniel J; Manyak, Kristin A; Abdenour, Thomas E; Rauh, Mitchell J; Baweja, Harsimran S

    2016-04-01

    As recently dictated by the American Medical Society, balance testing is an important component in the clinical evaluation of concussion. Despite this, previous research on the efficacy of balance testing for concussion diagnosis suggests low sensitivity (∼30%), based primarily on the popular Balance Error Scoring System (BESS). The Balance Tracking System (BTrackS, Balance Tracking Systems Inc., San Diego, CA, USA) consists of a force plate (BTrackS Balance Plate) and software (BTrackS Sport Balance) which can quickly perform balance testing with gold-standard accuracy. The present study aimed to determine the sensitivity of the BTrackS Balance Plate and Sport Balance software for concussion diagnosis. Cross-sectional study. Preseason baseline balance testing of 519 healthy Division I college athletes playing sports with a relatively high risk of concussion was performed with the BTrackS Balance Test, administered by certified athletic training staff using the BTrackS Balance Plate and Sport Balance software. Of the baselined athletes, 25 later experienced a concussion during the ensuing sport season. Post-injury balance testing was performed on these concussed athletes within 48 hours of injury, and the sensitivity of the BTrackS Balance Plate and Sport Balance software was estimated from the number of athletes showing a balance decline according to the criteria specified in the Sport Balance software. These criteria are based on the minimal detectable change statistic with a 90% confidence level (i.e., 90% specificity). Of the 25 athletes who experienced concussions, 16 had balance declines relative to baseline testing results according to the BTrackS Sport Balance software criteria. This corresponds to an estimated concussion sensitivity of 64%, twice as great as that reported previously for the BESS. The BTrackS Balance Plate and Sport Balance software has the greatest concussion sensitivity of any balance testing instrument reported to date. Level 2.
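
    For reference, one standard formulation of the minimal detectable change statistic at 90% confidence, which the decline criterion is described as using, is shown below; this is the generic definition, not necessarily the vendor's exact computation.

```latex
% Generic minimal detectable change at 90% confidence:
MDC_{90} = 1.645 \times \sqrt{2} \times SEM, \qquad SEM = SD\,\sqrt{1 - ICC}
```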

  3. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
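
    As a hedged sketch of the map-style module structure and JSON event flow described above (the birth/process/death naming and all fields are illustrative assumptions, not the actual MAUS API):

```python
import json

class ThresholdMapper:
    """Hedged sketch of a map-style analysis module in a MAUS-like framework.
    The birth/process/death structure and all field names are illustrative
    assumptions, not the actual MAUS API."""

    def birth(self, config_json):
        # one-time setup from a JSON configuration document
        self.threshold = json.loads(config_json).get("adc_threshold", 100)
        return True

    def process(self, event_json):
        # transform one JSON event; independent events parallelize naturally
        event = json.loads(event_json)
        event["over_threshold"] = any(
            hit["adc"] > self.threshold for hit in event.get("hits", []))
        return json.dumps(event)

    def death(self):
        return True

mapper = ThresholdMapper()
mapper.birth('{"adc_threshold": 120}')
print(mapper.process('{"hits": [{"adc": 90}, {"adc": 150}]}'))
```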

  4. Initial experience with software system JODNEW for evaluation biophysical characteristics related to treatment of carcinoma of thyroid gland by 131I

    International Nuclear Information System (INIS)

    Hermanska, J.; Zimak, J.; Vosminkova, K.; Nemec, J.; Blazek, T.; Jirsa, L.; Karny, M.

    1998-01-01

    Our research tries to exploit sophisticated methods for balancing the positive and negative consequences of radionuclide applications. We have tailored Bayesian data processing to support decision making during the treatment of thyroid diseases with 131I. After a successful experimental phase we have implemented these methods. This novel, in-house developed software system, JODNEW, is now being tested. It aims at: (1) increasing the quality of the raw biophysical data exploited in diagnostics and therapy of thyroid diseases; (2) estimating cumulated activity so that the MIRD methodology can be well used; (3) decreasing the working load on staff. JODNEW is an extensive database system co-operating with advanced estimation algorithms coded in C++. The Bayesian methodology adopted allows us to exploit expert knowledge, models of the observed processes, and measured data in a consistent way. This is important in the case considered, where the number of measurements is quite limited and the influence of biological and physical variations is high. Moreover, all estimates are qualified by the remaining uncertainty. During diagnostics, the (functioning) volume of the thyroid gland and the body mass are measured. A diagnostic amount of 131I is administered. Three whole-body measurements of the elimination rate by urine (excretions) are made within 2 days after administration. The accumulated activities above the thyroid gland and other lesions are registered within several days. Evaluation and measurements during therapy are: the accumulation ability is evaluated using diagnostic data; the consequences of 131I administration are judged; then the therapeutic activity is selected and administered; the accumulation dynamics is supervised, and the time of reaching the radio-hygienic limits governing the patient regime is predicted. The common features of these steps are: (1) individual measurements are corrupted by high and varying uncertainty; (2) the number of measurements is limited; (3) significant expert experience is available.

  5. Reflecting Indigenous Culture in Educational Software Design.

    Science.gov (United States)

    Fleer, Marilyn

    1989-01-01

    Discusses research on Australian Aboriginal cognition which relates to the development of appropriate educational software. Describes "Tinja," a software program using familiar content and experiences, Aboriginal characters and cultural values, extensive graphics and animation, peer and group work, and open-ended design to help young…

  6. Software Development Management: Empirical and Analytical Perspectives

    Science.gov (United States)

    Kang, Keumseok

    2011-01-01

    Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…

  7. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts, in real time, for resources consumed on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  8. OPM: The Open Porous Media Initiative

    Science.gov (United States)

    Flemisch, B.; Flornes, K. M.; Lie, K.; Rasmussen, A.

    2011-12-01

    The principal objective of the Open Porous Media (OPM) initiative is to develop a simulation suite that is capable of modeling industrially and scientifically relevant flow and transport processes in porous media and bridge the gap between the different application areas of porous media modeling, including reservoir mechanics, CO2 sequestration, biological systems, and product development of engineered media. The OPM initiative will provide a long-lasting, efficient, and well-maintained open-source software for flow and transport in porous media built on modern software principles. The suite is released under the GNU General Public License (GPL). Our motivation is to provide a means to unite industry and public research on simulation of flow and transport in porous media. For academic users, we seek to provide a software infrastructure that facilitates testing of new ideas on models with industry-standard complexity, while at the same time giving the researcher control over discretization and solvers. Similarly, we aim to accelerate the technology transfer from academic institutions to professional companies by making new research results available as free software of professional standard. The OPM initiative is currently supported by six research groups in Norway and Germany and funded by existing grants from public research agencies as well as from Statoil Petroleum and Total E&P Norge. However, a full-scale development of the OPM initiative requires substantially more funding and involvement of more research groups and potential end users. In this talk, we will provide an overview of the current activities in the OPM initiative. Special emphasis will be given to the demonstration of the synergies achieved by combining the strengths of individual open-source software components. In particular, a new fully implicit solver developed within the DUNE-based simulator DuMux could be enhanced by the ability to read industry-standard Eclipse input files and to run on

  9. Software Estimation Demystifying the Black Art

    CERN Document Server

    McConnell, Steve

    2009-01-01

    Often referred to as the "black art" because of its complexity and uncertainty, software estimation is not as difficult or puzzling as people think. In fact, generating accurate estimates is straightforward-once you understand the art of creating them. In his highly anticipated book, acclaimed author Steve McConnell unravels the mystery to successful software estimation-distilling academic information and real-world experience into a practical guide for working software professionals. Instead of arcane treatises and rigid modeling techniques, this guide highlights a proven set of procedures,

  10. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.

  11. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and the scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.
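
    A minimal sketch of the script-driven control style described, assuming a hypothetical serial instrument and invented commands; the real system's interface library and devices are not shown in the abstract.

```python
import time

class SerialInterface:
    """Stand-in for a modular hardware-interface class; the real system's
    interface library, device names, and commands are assumptions here."""
    def __init__(self, port):
        self.port = port
    def write(self, command):
        print(f"[{self.port}] -> {command}")
    def query(self, command):
        self.write(command)
        return 4.2  # dummy reading in place of real hardware

cryostat = SerialInterface("COM3")
log = []
for setpoint in [4.2, 3.0, 1.5]:      # sweep temperature setpoints
    cryostat.write(f"SET TEMP {setpoint}")
    time.sleep(0.1)                   # wait for the control loop to settle
    log.append((setpoint, cryostat.query("READ TEMP")))
print(log)
```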

  12. Software dependability in the Tandem GUARDIAN system

    Science.gov (United States)

    Lee, Inhwan; Iyer, Ravishankar K.

    1995-01-01

    Based on extensive field failure data for Tandem's GUARDIAN operating system, this paper discusses the evaluation of the dependability of operational software. The software faults considered are major defects that result in processor failures and invoke backup processes to take over. The paper categorizes the underlying causes of software failures and evaluates the effectiveness of the process-pair technique in tolerating software faults. A model to describe the impact of software faults on the reliability of an overall system is proposed. The model is used to evaluate the significance of key factors that determine software dependability and to identify areas for improvement. An analysis of the data shows that about 77% of processor failures that are initially considered due to software are confirmed as software problems. The analysis shows that the use of process pairs to provide checkpointing and restart (originally intended for tolerating hardware faults) allows the system to tolerate about 75% of reported software faults that result in processor failures. The loose coupling between processors, which results in the backup execution (the processor state and the sequence of events) being different from the original execution, is a major reason for the measured software fault tolerance. Over two-thirds (72%) of measured software failures are recurrences of previously reported faults. Modeling based on the data shows that, in addition to reducing the number of software faults, software dependability can be enhanced by reducing the recurrence rate.

  13. Theoretical modeling and experimental study on fatigue initiation life of 16MnR notched components

    International Nuclear Information System (INIS)

    Wang Xiaogui; Gao Zengliang; Qiu Baoxiang; Jiang Yanrao

    2010-01-01

    In order to investigate the effects of notch geometry and loading conditions on the fatigue initiation life and fatigue fracture life of the 16MnR material, fatigue experiments were conducted on both smooth and notched rod specimens. The detailed elastic-plastic stress and strain responses were computed with the finite element software ABAQUS, incorporating a robust cyclic plasticity model via a user subroutine (UMAT). The obtained stresses and strains were applied to a multiaxial fatigue damage criterion to compute the fatigue damage induced by a loading cycle on the critical material plane. The fatigue initiation life was then obtained from the proposed theoretical model. The good agreement between the predicted results and the experimental data indicates that the fatigue initiation of notched components in a multiaxial stress state is related to all of the nonzero stress and strain quantities. (authors)
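
    The abstract does not name the multiaxial criterion used; as a hedged illustration, critical-plane damage parameters of this kind are often written in a Fatemi-Socie-like form evaluated on the plane of maximum shear strain range:

```latex
% Representative critical-plane damage parameter (illustrative only):
FP = \frac{\Delta\gamma_{\max}}{2}
     \left(1 + k\,\frac{\sigma_{n,\max}}{\sigma_y}\right)
% \Delta\gamma_{\max}: maximum shear strain range on the critical plane;
% \sigma_{n,\max}: peak normal stress on that plane; \sigma_y: yield
% strength; k: material constant fitted from uniaxial data.
```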

  14. Barrier experiment: Shock initiation under complex loading

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-12

    The barrier experiments are a variant of the gap test; a detonation wave in a donor HE impacts a barrier and drives a shock wave into an acceptor HE. The question we ask is: What is the trade-off between the barrier material and threshold barrier thickness to prevent the acceptor from detonating. This can be viewed from the perspective of shock initiation of the acceptor subject to a complex pressure drive condition. Here we consider key factors which affect whether or not the acceptor undergoes a shock-to-detonation transition. These include the following: shock impedance matches for the donor detonation wave into the barrier and then the barrier shock into the acceptor, the pressure gradient behind the donor detonation wave, and the curvature of detonation front in the donor. Numerical simulations are used to illustrate how these factors affect the reaction in the acceptor.

  15. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  16. Integrated Web-Based Immersive Exploration of the Coordinated Canyon Experiment Data using Open Source STOQS Software

    Science.gov (United States)

    McCann, M. P.; Gwiazda, R.; O'Reilly, T. C.; Maier, K. L.; Lundsten, E. M.; Parsons, D. R.; Paull, C. K.

    2017-12-01

    The Coordinated Canyon Experiment (CCE) in Monterey Submarine Canyon has produced a wealth of oceanographic measurements whose analysis will improve understanding of turbidity current processes. Exploration of this data set, consisting of over 60 parameters from 15 platforms, is facilitated by using the open source Spatial Temporal Oceanographic Query System (STOQS) software (https://github.com/stoqs/stoqs). The Monterey Bay Aquarium Research Institute (MBARI) originally developed STOQS to help manage and visualize upper water column oceanographic measurements, but the generality of its data model permits effective use for any kind of spatial/temporal measurement data. STOQS consists of a PostgreSQL database and server-side Python/Django software; the client-side is jQuery JavaScript supporting AJAX requests to update a single page web application. The User Interface (UI) is optimized to provide a quick overview of data in spatial and temporal dimensions, as well as in parameter, platform, and data value space. A user may zoom into any feature of interest and select it, initiating a filter operation that updates the UI with an overview of all the data in the new filtered selection. When details are desired, radio buttons and checkboxes are selected to generate a number of different types of visualizations. These include color-filled temporal section and line plots, parameter-parameter plots, 2D map plots, and interactive 3D spatial visualizations. The Extensible 3D (X3D) standard and X3DOM JavaScript library provide the technology for presenting animated 3D data directly within the web browser. Most of the oceanographic measurements from the CCE (e.g. mooring mounted ADCP and CTD data) are easily visualized using established methods. However, unified integration and multiparameter display of several concurrently deployed sensors across a network of platforms is a challenge we hope to solve. Moreover, STOQS also allows display of data from a new instrument - the
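
    As a hedged, self-contained sketch of the kind of spatial/temporal filter such a UI issues when the user narrows platform, parameter, and time (table and column names are invented for illustration, and sqlite stands in for PostgreSQL; this is not the actual STOQS schema):

```python
import sqlite3

# sqlite stands in for PostgreSQL to keep the sketch self-contained.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE measurement
               (platform TEXT, parameter TEXT, tstamp TEXT,
                depth REAL, datavalue REAL)""")
con.executemany("INSERT INTO measurement VALUES (?,?,?,?,?)",
    [("mooring_A", "turbidity", "2016-01-15T10:00", 250.0, 1.8),
     ("mooring_A", "turbidity", "2016-01-17T10:00", 250.0, 0.2),
     ("mooring_B", "salinity",  "2016-01-15T10:00",  10.0, 34.1)])

# The UI's filter selection becomes a parameterized query.
rows = con.execute("""SELECT depth, datavalue FROM measurement
                      WHERE platform = ? AND parameter = ?
                        AND tstamp BETWEEN ? AND ?""",
                   ("mooring_A", "turbidity",
                    "2016-01-15T00:00", "2016-01-16T00:00")).fetchall()
print(rows)  # -> [(250.0, 1.8)]
```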

  17. HEP Software Foundation Community White Paper Working Group - Detector Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Apostolakis, J; et al.

    2018-03-12

    A working group on detector simulation was formed as part of the high-energy physics (HEP) Software Foundation's initiative to prepare a Community White Paper that describes the main software challenges and opportunities to be faced in the HEP field over the next decade. The working group met over a period of several months in order to review the current status of the Full and Fast simulation applications of HEP experiments and the improvements that will need to be made in order to meet the goals of future HEP experimental programmes. The scope of the topics covered includes the main components of a HEP simulation application, such as MC truth handling, geometry modeling, particle propagation in materials and fields, physics modeling of the interactions of particles with matter, the treatment of pileup and other backgrounds, as well as signal processing and digitisation. The resulting work programme described in this document focuses on the need to improve both the software performance and the physics of detector simulation. The goals are to increase the accuracy of the physics models and expand their applicability to future physics programmes, while achieving large factors in computing performance gains consistent with projections on available computing resources.

  18. Experiences on dynamic simulation software in chemical engineering education

    DEFF Research Database (Denmark)

    Komulainen, Tiina M.; Enemark-rasmussen, Rasmus; Sin, Gürkan

    2012-01-01

    Commercial process simulators are attracting increasing interest in chemical engineering education. In this paper, the use of the commercial dynamic simulation software D-SPICE® and K-Spice® in three different chemical engineering courses is described and discussed. The courses cover the following topics

  19. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We conducted an in-depth analysis of computer-enabled music learning and the status of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures. Against this background, the introduction of computer music software into music learning is a new trial that can not only cultivate students' initiative in music learning, but also enhance their ability to learn music. Therefore, it is concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  20. Experimental test accelerator: description and results of initial experiments

    International Nuclear Information System (INIS)

    Fessenden, T.; Birx, D.; Briggs, R.

    1980-01-01

    The ETA is a high current (10,000 Amp) linear induction accelerator that produces short (30 ns) pulses of electrons at 5 MeV twice per second or in bursts of 5 pulses separated by as little as one millisecond. At this time the machine has operated at 65% of its design current and 90% of the design voltage. This report contains a description of the accelerator and its diagnostics; the results of the initial year of operation; a comparison of design codes with experiments on beam transport; and a discussion of some of the special problems and their status

  1. Discharge initiation experiments in the Tokapole II tokamak

    International Nuclear Information System (INIS)

    Shepard, D.A.

    1984-01-01

    Experiments in the Tokapole II tokamak demonstrate the benefits of high-density (n_e/n_0 ≥ 0.01) preionization by reducing four quantities at startup: the necessary toroidal loop voltage V_l (by 50%), volt-second consumption (40-50%), impurity radiation (25-50%), and runaway electron production (approx. 80-100%). A zero-dimensional code models the dependence of the loop-voltage reduction on preionization density and predicts a similar result for reactor-scale devices. The code shows that a low initial resistivity and a high resistivity time derivative contribute to the loop voltage reduction. Microwaves at the electron cyclotron resonance (ECR) frequency and plasma gun injection produce high-density preionization, which reduces the initial V_l, the volt-second consumption, and runaways. The ECR preionization also reduces impurity radiation by shortening the time from voltage application to current channel formation; this evidently reduces the total plasma-wall interaction at startup. The power balance of the ECR plasma in a toroidal-field-only case was studied using Langmuir probes and impurity doping. The vertical electric field and current, which result from curvature drift, were measured as approx. 10 V/cm and 50 A, respectively, and exceeded the values expected for the bulk electron temperature (approx. 10 eV).

  2. Earth Science Informatics Community Requirements for Improving Sustainable Science Software Practices: User Perspectives and Implications for Organizational Action

    Science.gov (United States)

    Downs, R. R.; Lenhardt, W. C.; Robinson, E.

    2014-12-01

    Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributers, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and the opportunities that science software presents for the sustainability of science. To gain insight on practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members were asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper will present an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.

  3. Software Assurance in Acquisition: Mitigating Risks to the Enterprise. A Reference Guide for Security-Enhanced Software Acquisition and Outsourcing

    Science.gov (United States)

    2009-02-01

    [Only fragments of this abstract survive extraction:] ... monitoring per ISO/IEC 12207:2008(E) and IEEE 1062-1998, and the PMBOK 3.0 process groups (initiating, planning, executing, monitoring and controlling, closing) ... follow-on ... software life cycles [ISO/IEC 15026]. Software assurance is a key element of national security and homeland security. It is critical because dramatic ... they are met. This may also include a plan for testing that SwA requirements are met. The [NDIA] and [ISO/IEC 15026] provide details on structure and ...

  4. Initial resident refractive surgical experience: outcomes of PRK and LASIK for myopia.

    Science.gov (United States)

    Wagoner, Michael D; Wickard, Joseph C; Wandling, George R; Milder, Lisa C; Rauen, Matthew P; Kitzmann, Anna S; Sutphin, John E; Goins, Kenneth M

    2011-03-01

    To evaluate and compare the outcome of initial resident surgical experience with photorefractive keratectomy (PRK) and LASIK. Retrospective review of all cases performed with the VISX Star S4 platform (Abbott Medical Optics) between July 1, 2003 and June 30, 2007. Inclusion criteria were spherical equivalent of -0.50 to -10.00 diopters (D), refractive astigmatic error of ≤3.00 D, intention to provide full distance correction, and minimum 3-month postoperative follow-up after initial ablation or retreatment (if performed). A total of 153 cases performed by 20 different residents met the inclusion criteria; 38 eyes underwent PRK and 115 eyes had LASIK. After initial treatment, mean Snellen uncorrected distance visual acuity (UDVA) after PRK was 20/17.3 and after LASIK was 20/19.5. Photorefractive keratectomy was associated with a significantly better approximation between preoperative corrected distance visual acuity (CDVA) and postoperative UDVA (ΔlogMAR 0.009 vs 0.091; P=.004) and a greater percentage of eyes that achieved UDVA of 20/20 or better (94.7% vs 78.3%; P=.02) or 20/30 or better (100% vs 87.8%; P=.02). There was a higher prevalence of retreatment in eyes that underwent LASIK (7.0% vs 0%; P=.20). One (0.9%) eye lost 2 lines of CDVA after LASIK. Supervised refractive surgery residents can achieve excellent visual outcomes in patients operated during their initial refractive experience. Photorefractive keratectomy was associated with better visual outcome than LASIK. Copyright 2011, SLACK Incorporated.

  5. The (mis)use of subjective process measures in software engineering

    Science.gov (United States)

    Valett, Jon D.; Condon, Steven E.

    1993-01-01

    A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example, effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed, e.g., high versus low experience; high, medium, or low experience; or a different, more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
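    As a toy illustration of turning member profiles into an ordinal team-experience rating (all weights, thresholds, and the `member_score`/`team_experience` names are invented for this sketch, not the SEL's actual scheme):

    ```python
    # Hypothetical sketch: derive an ordinal "team experience" rating from
    # individual profiles instead of simply summing years of experience.
    from statistics import median

    def member_score(years: float, similar_projects: int) -> float:
        """Weight domain-relevant work more heavily than raw tenure."""
        return min(years, 10) * 0.5 + min(similar_projects, 5) * 1.0

    def team_experience(members: list[tuple[float, int]]) -> str:
        """Collapse member scores into a coarse ordinal scale."""
        scores = [member_score(y, p) for y, p in members]
        m = median(scores)  # median resists one very senior outlier
        if m >= 7:
            return "high"
        if m >= 4:
            return "medium"
        return "low"

    # (years, similar_projects) per member
    print(team_experience([(12, 3), (4, 2), (6, 1)]))  # -> "medium"
    ```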

  6. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable, and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture ... organizations can have important implications for SPI outcomes. Furthermore, it provides insights into how software managers can practically assess subcultures to inform decisions about and help prepare plans for SPI initiatives.

  7. The evolution of CMS software performance studies

    CERN Document Server

    Kortelainen, Matti J

    2010-01-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.

  8. The evolution of CMS software performance studies

    Science.gov (United States)

    Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.

    2011-12-01

    CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on the cleanup of many issues coming from basic C++ errors, namely reducing dynamic memory churn, unnecessary copies/temporaries and tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.
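    The CMS optimizations were carried out in C++; purely as an analogous illustration of the two recurring themes above (avoiding per-element allocation churn and enabling vectorization), consider this numpy sketch:

    ```python
    # Illustration only: contrast an allocation-heavy per-element loop with
    # a vectorized version that reuses a preallocated output buffer.
    import numpy as np

    def et_slow(px, py):
        # churns memory: builds a Python list and boxes every float
        return [(x * x + y * y) ** 0.5 for x, y in zip(px, py)]

    def et_fast(px, py, out=None):
        # vectorized, and writes into a reusable buffer
        out = np.empty_like(px) if out is None else out
        np.sqrt(px * px + py * py, out=out)
        return out

    px, py = np.random.rand(1_000_000), np.random.rand(1_000_000)
    buf = np.empty_like(px)
    et_fast(px, py, buf)  # far fewer allocations than et_slow(px, py)
    ```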

  9. Changes in Transferable Knowledge Resulting from Study in a Graduate Software Engineering Curriculum

    Science.gov (United States)

    Bareiss, Ray; Sedano, Todd; Katz, Edward

    2012-01-01

    This paper presents the initial results of a study of the evolution of students' knowledge of software engineering from the beginning to the end of a master's degree curriculum in software engineering. Students were presented with a problem involving the initiation of a complex new project at the beginning of the program and again at the end of…

  10. Seeding the cloud: Financial bootstrapping in the computer software sector

    OpenAIRE

    Mac An Bhaird, Ciarán; Lynn, Theo

    2015-01-01

    This study investigates resourcing of computer software companies that have adopted cloud computing for the development and delivery of application software. Use of this innovative technology potentially impacts firm financing because the initial infrastructure investment requirement is much lower than for packaged software, lead time to market is shorter, and cloud computing supports instant scalability. We test these predictions by conducting in-depth interviews with founders of 18 independ...

  11. CernVM - a virtual software appliance for LHC applications

    International Nuclear Information System (INIS)

    Buncic, P; Sanchez, C Aguado; Blomer, J; Franco, L; Mato, P; Harutyunian, A; Yao, Y

    2010-01-01

    CernVM is a Virtual Software Appliance capable of running physics applications from the LHC experiments at CERN. It aims to provide a complete and portable environment for developing and running LHC data analysis on any end-user computer (laptop, desktop) as well as on the Grid, independently of operating system platform (Linux, Windows, MacOS). The experiment application software and its specific dependencies are built independently from CernVM and delivered to the appliance just in time by means of the CernVM File System (CVMFS), specifically designed for efficient software distribution. The procedures for building, installing and validating software releases remain under the control and responsibility of each user community. We provide a mechanism to publish pre-built and configured experiment software releases to a central distribution point, from where they find their way to the running CernVM instances via a hierarchy of proxy servers or content delivery networks. In this paper, we present the current state of the CernVM project, compare the performance of CVMFS to that of traditional network file systems like AFS, and discuss possible scenarios that could further improve its performance and scalability.
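    A minimal sketch of the caching pattern that makes this distribution model efficient: fetch once through the proxy hierarchy, then serve repeat requests from a local cache. The URL, cache path, and hashing-by-request-path here are simplifications invented for the sketch (CVMFS itself uses content-addressed storage):

    ```python
    # Hypothetical cache-through-proxy sketch, not the CVMFS implementation.
    import hashlib
    import os
    import urllib.request

    CACHE_DIR = "/tmp/cvmfs-demo-cache"            # assumed local cache
    PROXY_BASE = "http://proxy.example.org/sw"     # assumed distribution point

    def fetch(path: str) -> bytes:
        key = hashlib.sha1(path.encode()).hexdigest()
        cached = os.path.join(CACHE_DIR, key)
        if os.path.exists(cached):                 # cache hit: no network I/O
            with open(cached, "rb") as f:
                return f.read()
        with urllib.request.urlopen(f"{PROXY_BASE}/{path}") as resp:
            data = resp.read()                     # cache miss: pull via proxy
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(cached, "wb") as f:              # populate cache for next time
            f.write(data)
        return data
    ```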

  12. An x-ray detection system development for Tandem Mirror Experiment Upgrade (TMX-U): Hardware and software

    International Nuclear Information System (INIS)

    Jones, R.M.; Coutts, G.W.; Failor, B.H.

    1983-01-01

    This x-ray detection system measures the electron Bremsstrahlung spectrum from the Tandem Mirror Experiment-Upgrade (TMX-U). From this spectrum, we can calculate the electron temperature. The low-energy portion of the spectrum (0.5-40 keV) is measured by a liquid-nitrogen-cooled, lithium-drifted silicon detector. The higher-energy spectrometer uses an intrinsic germanium detector to accommodate the 100 to 200 keV spectra. The signal chain proceeds as follows. The preamplified detector signals are digitized by a high-speed A-to-D converter located in a Computer Automated Measurement and Control (CAMAC) crate. The data are then stored in a histogramming memory via a data router. The CAMAC crate interfaces with a local desktop computer or the main data acquisition computer that stores the data. The software sets up the modules, acquires the energy spectra (with sample times as short as 2 ms) and plots them. Up to 40 time-resolved spectra are available during one plasma cycle. The actual module configuration, CAMAC interfacing and software that runs the system are the subjects of this paper.
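    As a sketch of the core software task described above (accumulating digitized pulse heights into time-resolved energy spectra), with channel counts and window handling invented for illustration:

    ```python
    # Hypothetical offline version of the histogramming step.
    import numpy as np

    N_BINS, E_MAX_KEV = 512, 40.0        # assumed channels; low-energy range
    SAMPLE_MS = 2.0                      # shortest sample time quoted above

    def histogram_spectra(times_ms, energies_kev, n_windows=40):
        """One energy histogram per time window of a plasma shot.

        times_ms, energies_kev: numpy arrays, one entry per detected pulse.
        """
        spectra = np.zeros((n_windows, N_BINS), dtype=np.int64)
        win = (times_ms // SAMPLE_MS).astype(int).clip(0, n_windows - 1)
        bins = np.linspace(0.5, E_MAX_KEV, N_BINS + 1)
        for w in range(n_windows):
            spectra[w], _ = np.histogram(energies_kev[win == w], bins=bins)
        return spectra
    ```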

  13. ALFA: The new ALICE-FAIR software framework

    Science.gov (United States)

    Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.

    2015-12-01

    The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and of extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system which balances reliability and ease of development with performance, using multi-processing and multi-threading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication with and reliance on other processes. Such a design adds horizontal scaling (multiple processes) to the vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols. Potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
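    ALFA itself is a C++, message-queue-based framework; the following Python sketch only mirrors the design idea in miniature: independent worker processes that communicate solely through messages, so capacity can be scaled horizontally by adding processes:

    ```python
    # Toy message-passing pipeline; the doubling "work" is a stand-in for a
    # real processing stage such as reconstruction.
    import multiprocessing as mp

    def worker(inbox: "mp.Queue", outbox: "mp.Queue"):
        while (msg := inbox.get()) is not None:   # None is the shutdown signal
            outbox.put(msg * 2)

    if __name__ == "__main__":
        inbox, outbox = mp.Queue(), mp.Queue()
        procs = [mp.Process(target=worker, args=(inbox, outbox))
                 for _ in range(4)]               # horizontal scaling knob
        for p in procs:
            p.start()
        for i in range(100):
            inbox.put(i)
        for _ in procs:
            inbox.put(None)
        results = [outbox.get() for _ in range(100)]
        for p in procs:
            p.join()
    ```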

  14. Promoting linguistic complexity, greater message length and ease of engagement in email writing in people with aphasia: initial evidence from a study utilizing assistive writing software.

    Science.gov (United States)

    Thiel, Lindsey; Sage, Karen; Conroy, Paul

    2017-01-01

    Improving email writing in people with aphasia could enhance their ability to communicate, promote interaction and reduce isolation. Spelling therapies have been effective in improving single-word writing. However, there has been limited evidence on how to achieve changes to everyday writing tasks such as email writing in people with aphasia. One potential area that has been largely unexplored in the literature is the use of assistive writing technologies, despite some initial evidence that assistive writing software use can lead to qualitative and quantitative improvements in spontaneous writing. This within-participants case-series study aimed to investigate the effects of using assistive writing software to improve email writing in participants with dysgraphia related to aphasia. Eight participants worked through a hierarchy of writing tasks of increasing complexity within broad topic areas that incorporate the spheres of writing need of the participants: writing for domestic needs, writing for social needs and writing for business/administrative needs. Through completing these tasks, participants had the opportunity to use the various functions of the software, such as predictive writing, word banks and text-to-speech. Therapy also included training and practice in basic computer and email skills to encourage increased independence. Outcome measures included email skills, keyboard skills, email writing and written picture description tasks, and a perception of disability assessment. Four of the eight participants showed statistically significant improvements in spelling accuracy within emails when using the software. At a group level there was a significant increase in word length with the software, while four participants showed noteworthy changes in the range of word classes used. Enhanced independence in email use and improvements in participants' perceptions of their writing skills were also noted. This study provided some initial evidence
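    To make one of the named functions concrete, here is a toy sketch of word prediction from a personal word bank (the `WordBank` class and its ranking rule are invented for illustration, not the studied software):

    ```python
    # Hypothetical word-prediction sketch: suggest the most frequent words
    # from a user's word bank that start with the typed prefix.
    from collections import Counter

    class WordBank:
        def __init__(self, corpus: str):
            self.freq = Counter(corpus.lower().split())

        def predict(self, prefix: str, k: int = 3) -> list[str]:
            prefix = prefix.lower()
            hits = [(w, n) for w, n in self.freq.items()
                    if w.startswith(prefix)]
            return [w for w, _ in sorted(hits, key=lambda x: -x[1])[:k]]

    bank = WordBank("meet me for coffee tomorrow morning before the meeting")
    print(bank.predict("me"))  # e.g. ['meet', 'me', 'meeting']
    ```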

  15. Mapping modern software process engineering techniques onto an HEP development environment

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of facilitating doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R and D, describe lessons learned and mistakes made. We will demonstrate the benefits gained and the current status of the software processes established in CMS off-line software

  16. Mapping modern software process engineering techniques onto an HEP development environment

    Science.gov (United States)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of facilitating doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made. We will demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.

  17. Standardising Software Processes - An Obstacle for Innovation?

    DEFF Research Database (Denmark)

    Aaen, Ivan; Pries-Heje, Jan

    2004-01-01

    Over the last 10 years CMM has achieved widespread use as a model for improving software organisations. Often CMM is used to standardise software processes across projects. In this paper we discuss this standardisation of SPI in relation to innovation, organisational size and company growth. Our discussion is empirically based on years of work and experience with companies on SPI, and it is enhanced by vignette stories taken from that experience. As a result we find that standardisation focussing on process, metrics, and controls may jeopardize innovative capabilities...

  18. Design of the Jet Performance Software for the ATLAS Experiment at LHC

    CERN Document Server

    Doglioni, C; The ATLAS collaboration; Loch, P; Perez, K; Vitillo, RA

    2011-01-01

    This paper describes the design and implementation of the JetFramework, a software tool developed for the data analysis of the ATLAS experiment at CERN. JetFramework is based on Athena, an object-oriented framework for data processing. The JetFramework Athena package implements a configurable data-flow graph (DFG) to represent an analysis. Each node of the graph can perform some computation on one or more particle collections in input. A standard set of nodes to retrieve, filter, sort and plot collections is provided. Users can also implement their own computation units inheriting from a generic interface. The analysis graph can be declared and configured in an Athena options file. To provide the requested flexibility to configure nodes from a configuration file, a simple expression language permits the specification of selection and plotting criteria. Viewing an analysis as an explicit DFG permits end-users to avoid writing code for repetitive tasks and to reuse user-defined computation units in other analysis...

  19. Camac Software for TJ-I and TJ-IU

    International Nuclear Information System (INIS)

    Milligen, B. Ph. van.

    1994-01-01

    A user-friendly software package for control of CAMAC data acquisition modules for the TJ-I and TJ-IU experiments at the Asociacion CIEMAT para Fusion has been developed. The CAMAC control software operates in synchronisation with the pre-existing VME-based data acquisition system. The control software controls the setup of the CAMAC modules and manages the data flow from the taking of data to its storage. Data file management is performed largely automatically. Further, user software is provided for viewing and analysing the data

  20. Case Study Research in Software Engineering Guidelines and Examples

    CERN Document Server

    Runeson, Per; Rainer, Austen; Regnell, Bjorn

    2012-01-01

    Based on their own experiences of in-depth case studies of software projects in international corporations, in this book the authors present detailed practical guidelines on the preparation, conduct, design and reporting of case studies of software engineering.  This is the first software engineering specific book on the case study research method.

  1. Incorporating Code-Based Software in an Introductory Statistics Course

    Science.gov (United States)

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  2. Experimental analysis of specification language impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik; Seong, Poong Hyun

    1998-01-01

    When redundancy and diversity are applied in an NPP digital computer system, diversification of the system software may be a critical point for the entire system's dependability. As a means of enhancing software diversity, specification language diversity is suggested in this study. We set up a simple hypothesis for the impact of specification language on common errors, and an experiment based on an NPP protection system application was performed. The experimental results showed that this hypothesis could be justified and that specification language diversity is effective in overcoming the software common-mode failure problem

  3. Medical device software: defining key terms.

    Science.gov (United States)

    Pashkov, Vitalii; Gutorova, Nataliya; Harkusha, Andrii

    One of the areas of significant growth in medical devices has been the role of software - as an integral component of a medical device, as a standalone device, and more recently as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for its qualification or not as a medical device. It is therefore necessary to clarify some criteria for the qualification of standalone software as a medical device. Materials and methods: Ukrainian, European Union and United States legislation, guidelines developed by the European Commission and the Food and Drug Administration, recommendations of an international voluntary group, and scientific works. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. The legal regulation of software used for a medical purpose in Ukraine is limited to one definition. The European Union and the United States have developed and apply special guidelines that help developers, manufacturers and end users distinguish types of software based on medical-purpose criteria. Software is becoming more and more incorporated into medical devices. Developers and manufacturers may not have initially appreciated potential risks to patients and users; such a situation could have dangerous consequences for patients or users. It is necessary to develop and adopt legislation that defines the criteria for the qualification of medical device software and the application of classification criteria to such software, provides illustrative examples, and gives step-by-step recommendations to qualify software as a medical device.

  4. Energy efficiency initiatives: Indian experience

    Energy Technology Data Exchange (ETDEWEB)

    Dey, Dipankar [ICFAI Business School, Kolkata, (IBS-K) (India)

    2007-07-01

    India, with a population of over 1.10 billion, is one of the fastest growing economies of the world. As domestic sources of different conventional commercial energy are drying up, dependence on foreign energy sources is increasing. There exists a huge potential for saving energy in India. After the first 'oil shock' (1973), the government of India realized the need for conservation of energy, and a 'Petroleum Conservation Action Group' was formed in 1976. Since then many initiatives aiming at energy conservation and improved energy efficiency have been undertaken (the establishment of the Petroleum Conservation Research Association in 1978; the notification of the Eco-labelling scheme in 1991; the formation of the Bureau of Energy Efficiency in 2002). But no such initiative was successful. In this paper an attempt has been made to analyze the changing importance of the energy conservation/efficiency measures initiated in India between 1970 and 2005. The study analyzes the limitations and the reasons for failure of those initiatives. The probable reasons are: the fuel pricing mechanism (including subsidies), political factors, corruption and unethical practices, the influence of oil and related industry lobbies - both internal and external, the economic situation, and the prolonged protection of domestic industries. Further, as India is opening its economy, the study explores the opportunities that a globally competitive market would offer to improve the overall energy efficiency of the economy. The study suggests that the Bureau of Energy Efficiency (BEE) - the newly formed nodal agency for improving the energy efficiency of the economy - be made an autonomous institution with minimal political intervention. For proper implementation of different initiatives to improve energy efficiency, BEE should involve civil society organisations (NGOs) more, from the inception to the implementation stage of the programs. The paper also

  5. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...

  6. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off- line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504, and Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create of de facto software process standards within th...

  7. Initial cathode processing experiences and results for the treatment of spent fuel

    International Nuclear Information System (INIS)

    Westphal, B.R.; Laug, D.V.; Brunsvold, A.R.; Roach, P.D.

    1996-01-01

    As part of the spent fuel treatment demonstration at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, primarily consisting of a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a batch operation termed ''cathode processing.'' Cathode processing is performed in a retort furnace which enables the production of a stable uranium product that can be isotopically diluted and stored. To date, experiments have been performed with two distillation units; one for prototypical testing and the other for actual spent fuel treatment operations. The results and experiences from these initial experiments with both units will be discussed as well as problems encountered and their resolution

  8. Discharge initiation experiments in the Tokapole II tokamak

    International Nuclear Information System (INIS)

    Shepard, D.A.

    1984-06-01

    Experiments in the Tokapole II tokamak demonstrate the benefits of high density (n_e/n_0 ≥ 0.01) preionization by reducing four quantities at startup: necessary toroidal loop voltage (V_l) (50%), volt-second consumption (40 to 50%), impurity radiation (25 to 50%), and runaway electron production (approx. 80 to 100%). A zero-dimensional code models the loop voltage reduction dependence on preionization density and predicts a similar result for reactor-scale devices. The code shows low initial resistivity and a high resistivity time derivative contribute to loop voltage reduction. The power balance of the ECR plasma in a toroidal-field-only case was studied using Langmuir probes and impurity doping. The vertical electric field (E_v) and current (I_v), which result from curvature drift, were measured (E_v approx. 10 V/cm and I_v approx. 50 A) and exceeded expected values for the bulk electron temperature (approx. 10 eV). A series of experiments with external windings to simulate field errors perpendicular to the toroidal field was done. The results imply that an error field of 0.1% of the toroidal field is deleterious to ECR plasma density.

  9. High/Scope Preschool Key Experiences: Initiative and Social Relations. [with] Curriculum Videotape.

    Science.gov (United States)

    Graves, Michelle

    As preschoolers develop the ability to carry out their ideas and play alone and with others, they are developing the foundation for social competence. This booklet and a companion videotape help teachers and parents recognize and support nine High/Scope key experiences in initiative and social relations: (1) making and expressing choices, plans,…

  10. Augmented Reality Guidance for the Resection of Missing Colorectal Liver Metastases: An Initial Experience.

    Science.gov (United States)

    Ntourakis, Dimitrios; Memeo, Ricardo; Soler, Luc; Marescaux, Jacques; Mutter, Didier; Pessaux, Patrick

    2016-02-01

    Modern chemotherapy achieves the shrinking of colorectal cancer liver metastases (CRLM) to such an extent that they may disappear from radiological imaging. Disappearing CRLM rarely represent a complete pathological remission and carry an important risk of recurrence. Augmented reality (AR) consists of the fusion of real-time patient images with a computer-generated 3D virtual patient model created from pre-operative medical imaging. The aim of this prospective pilot study is to investigate the potential of AR navigation as a tool to help locate and surgically resect missing CRLM. A 3D virtual anatomical model was created from thoracoabdominal CT scans using custom software (VR RENDER®, IRCAD). The virtual model was superimposed on the operative field using an exoscope (VITOM®, Karl Storz, Tuttlingen, Germany). Virtual and real images were manually registered in real time using a video mixer, based on external anatomical landmarks, with an estimated accuracy of 5 mm. This modality was tested in three patients, with four missing CRLM that had sizes from 12 to 24 mm, undergoing laparotomy after receiving pre-operative oxaliplatin-based chemotherapy. AR display and fine registration were performed within 6 min. AR helped detect all four missing CRLM and guided their resection. In all cases the planned security margin of 1 cm was clear and resections were confirmed to be R0 by pathology. There was no postoperative major morbidity or mortality. No local recurrence occurred in the follow-up period of 6-22 months. This initial experience suggests that AR may be a helpful navigation tool for the resection of missing CRLM.
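    Once the virtual model and the live view are registered, the displayed AR image is essentially a blend of the two frames. A minimal sketch of that final overlay step (the registration itself, done here with a video mixer, is the hard part and is not shown):

    ```python
    # Toy alpha-blend of a rendered virtual frame onto a live camera frame.
    import numpy as np

    def overlay(camera: np.ndarray, rendered: np.ndarray,
                alpha: float = 0.4) -> np.ndarray:
        """Both frames are HxWx3 uint8 and assumed already registered."""
        blend = ((1 - alpha) * camera.astype(float)
                 + alpha * rendered.astype(float))
        return blend.astype(np.uint8)
    ```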

  11. Large-scale visualization projects for teaching software engineering.

    Science.gov (United States)

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  12. CAMAC Software for TJ-I and TJ-IU

    Energy Technology Data Exchange (ETDEWEB)

    Milligen, B Ph. van

    1993-07-01

    A user-friendly software package for control of CAMAC data acquisition modules for the TJ-I and TJ-IU experiments at the Asociacion CIEMAT para Fusion has been developed. The CAMAC control software operates in synchronisation with the pre-existing VME-based data acquisition system. The control software controls the setup of the CAMAC modules and manages the data flow from the taking of data to its storage. Data file management is performed largely automatically. Further, user software is provided for viewing and analysing the data. (Author) 9 refs.

  13. CAMAC Software for TJ-I and TJ-IU

    International Nuclear Information System (INIS)

    Milligen, B. Ph. van

    1994-01-01

    A user-friendly software package for control of CAMAC data acquisition modules for the TJ-I and TJ-IU experiments at the Asociacion CIEMAT para Fusion has been developed. The CAMAC control software operates in synchronisation with the pre-existing VME-based data acquisition system. The control software controls the setup of the CAMAC modules and manages the data flow from the taking of data to its storage. Data file management is performed largely automatically. Further, user software is provided for viewing and analysing the data. (Author) 9 refs

  14. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and fits software failure data well. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement in a software application. A numerical experiment investigates the fitting ability of the SRGMs with normal distribution on 16 sets of failure time data collected from real software projects
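    The paper's estimation uses an EM algorithm; as a lighter-weight sketch of the same model class, the NHPP mean value function m(t) = a·Φ((t − μ)/σ) can be fitted to cumulative failure counts by nonlinear least squares (the failure counts below are made up for illustration):

    ```python
    # Fit a normal-CDF mean value function to cumulative fault counts.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def mean_value(t, a, mu, sigma):
        """m(t) = a * Phi((t - mu) / sigma): expected cumulative faults."""
        return a * norm.cdf((t - mu) / sigma)

    t = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)  # test weeks
    n = np.array([3, 9, 21, 38, 52, 60, 64, 66], dtype=float)   # toy counts

    (a, mu, sigma), _ = curve_fit(mean_value, t, n,
                                  p0=[n[-1], t.mean(), 10.0])
    print(f"expected total faults a={a:.1f}, remaining={a - n[-1]:.1f}")
    ```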

  15. Use of operational data for the assessment of pre-existing software

    International Nuclear Information System (INIS)

    Helminen, Atte; Gran, Bjoern Axel; Kristiansen, Monica; Winther, Rune

    2004-01-01

    To build sufficient confidence in the reliability of the safety systems of nuclear power plants, all available sources of information should be used. One important data source is the operational experience collected for the system. Operational experience is particularly applicable to systems built on pre-existing software. Even though systems and devices involving pre-existing software are not considered for the functions of the highest safety levels of nuclear power plants, they will most probably be introduced into functions of lower safety levels and into non-safety-related applications. In the paper we briefly discuss the use of operational experience data for the reliability assessment of pre-existing software in general, and the role of pre-existing software in relation to safety applications. We then discuss the modelling of operational profiles, the application of expert judgement to operational profiles, and the need for a realistic test case. Finally, we discuss the application of operational experience data in Bayesian statistics. (Author)
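    A minimal sketch of the Bayesian step mentioned last: combining an expert-judgement prior on the probability of failure on demand (pfd) with observed failure-free demands under a representative operational profile. The prior parameters and demand counts are illustrative assumptions:

    ```python
    # Beta-binomial update of a pfd estimate from operational experience.
    from scipy.stats import beta

    a0, b0 = 1.0, 999.0        # assumed prior: mean pfd around 1e-3
    n, r = 5000, 0             # observed demands and failures

    post = beta(a0 + r, b0 + n - r)
    print(f"posterior mean pfd = {post.mean():.2e}")
    print(f"95% upper bound    = {post.ppf(0.95):.2e}")
    ```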

  16. 76 FR 75875 - Defense Federal Acquisition Regulation Supplement; Open Source Software Public Meeting

    Science.gov (United States)

    2011-12-05

    ... Regulation Supplement; Open Source Software Public Meeting AGENCY: Defense Acquisition Regulations System... initiate a dialogue with industry regarding the use of open source software in DoD contracts. DATES: Public... to the risks to the contractors and the Government associated with using open source software on DoD...

  17. Validating New Software for Semiautomated Liver Volumetry--Better than Manual Measurement?

    Science.gov (United States)

    Noschinski, L E; Maiwald, B; Voigt, P; Wiltberger, G; Kahn, T; Stumpp, P

    2015-09-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB-approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole-liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: the borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of the liver volume with manual assistance for the definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation with water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience. Both tested types of software allow exact volumetry of resected liver parts. Preoperative prediction can be performed more accurately with the semiautomated software. The semiautomated software is nearly four times faster than the
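    Whichever tool draws the liver contours, the volume itself is simple voxel arithmetic; a sketch with made-up voxel spacing:

    ```python
    # Volume from a boolean CT segmentation mask.
    import numpy as np

    def liver_volume_ml(mask: np.ndarray,
                        spacing_mm=(0.8, 0.8, 5.0)) -> float:
        """mask: HxWxD boolean array; spacing_mm: voxel size per axis."""
        voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
        return mask.sum() * voxel_mm3 / 1000.0   # mm^3 -> mL

    mask = np.zeros((512, 512, 60), dtype=bool)
    mask[200:300, 150:350, 10:40] = True         # toy segmented region
    print(f"{liver_volume_ml(mask):.0f} mL")
    ```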

  18. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    Science.gov (United States)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
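    For reference, the equilibrium distribution applied above has the standard renewal-theory form (a sketch, assuming the fault-detection time distribution F has finite mean μ):

    ```latex
    % Equilibrium distribution of a fault-detection time distribution F
    F_e(t) = \frac{1}{\mu}\int_0^t \bigl(1 - F(s)\bigr)\,ds,
    \qquad
    \mu = \int_0^\infty \bigl(1 - F(s)\bigr)\,ds
    ```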

  19. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  20. Radiobiology software for educational purpose

    International Nuclear Information System (INIS)

    Pandey, A.K.; Sharma, S.K.; Kumar, R.; Bal, C.S.; Nair, O.; Haresh, K.P.; Julka, P.K.

    2014-01-01

    To understand radionuclide therapy and the basis of radiation protection, it is essential to understand radiobiology. With limited time for classroom teaching and limited time and resources for radiobiology experiments, students do not acquire a firm grasp of the theoretical mathematical models or the experimental knowledge of target theory and linear-quadratic models that explain the nature of cell survival curves. We believe that this issue might be addressed by numerical simulation of cell survival curves using mathematical models. Existing classroom teaching can be reoriented to understand the subject using the concepts of modeling, simulation and virtual experiments. After completion of a lecture, students can practice with the simulation tool at a time of their convenience. In this study we have developed software that can help students acquire a firm theoretical and experimental grasp of radiobiology. The software was developed using FreeMat ver 4.0, an open-source package. Target theory, the linear-quadratic model, and cell killing based on a Poisson model have been included. The program displays a menu of choices and then branches according to the user's selection; it is executed by typing 'Radiobiology' on the command-line interface. Students can interactively investigate the effect of radiation dose on cells. They can practice drawing the cell survival curve from the input and output data, and they can compare their hand-drawn graphs with the graphs generated automatically by the program. The software is at an early stage of development and will evolve based on user feedback. We feel this simulation software will be quite useful for students entering the nuclear medicine, radiology and radiotherapy disciplines. (author)
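    A sketch of the two survival models named above, assuming their standard textbook forms (parameter values are illustrative):

    ```python
    # Cell survival fraction under the linear-quadratic and multi-target models.
    import numpy as np

    def survival_lq(dose, alpha=0.2, beta=0.02):
        """Linear-quadratic: S = exp(-(alpha*D + beta*D^2))."""
        return np.exp(-(alpha * dose + beta * dose ** 2))

    def survival_target(dose, d0=1.5, n=3.0):
        """Multi-target single-hit: S = 1 - (1 - exp(-D/D0))^n."""
        return 1.0 - (1.0 - np.exp(-dose / d0)) ** n

    for d in np.linspace(0, 10, 6):
        print(f"D={d:4.1f} Gy  LQ={survival_lq(d):.3f}  "
              f"target={survival_target(d):.3f}")
    ```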

  1. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and Scrum, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II; we also supported these methodologies with the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 standards. Our outcomes may, in general, be used by teams that need to build small satellites and will, in particular, be used when we build the on-board software applications for the SATEX-II.

  2. [Characterization of photochemical smog chamber and initial experiments].

    Science.gov (United States)

    Jia, Long; Xu, Yong-Fu; Shi, Yu-Zhen

    2011-02-01

    A self-made indoor environmental chamber facility for the study of atmospheric processes leading to the formation of ozone and secondary organic aerosols is introduced and characterized. The characterization experiments include measurements of wall effects for reactive species and the determination of chamber-dependent ·OH radical sources by CO-NOx irradiation experiments. Preliminary ethene-NOx and benzene-NOx experiments were conducted as well. The characterization experiments show that the wall effects for O3 and NO2 in the new reactor are not pronounced. Relative humidity has a large effect on the wall losses in the old reactor, especially for O3. In the old reactor, the rate constant for O3 wall losses is 1.0 x 10^-5 s^-1 (RH = 5%) and 4.0 x 10^-5 s^-1 (RH = 91%), whereas for NO2 it is 1.0 x 10^-6 s^-1 (RH = 5%) and 0.6 x 10^-6 s^-1 (RH = 75%). The value of k(NO2 --> HONO) determined by CO-NOx irradiation experiments is (4.2-5.2) x 10^-5 s^-1 and (2.3-2.5) x 10^-5 s^-1 at RH = 5% and RH = 75%-77%, respectively. The average ·OH concentration is estimated to be (2.1 ± 0.4) x 10^6 molecules/cm^3 using the reaction rate coefficient of CO with ·OH. The sensitivity of chamber-dependent auxiliary reactions to O3 formation is discussed. The results show that NO2 --> HONO has the greatest impact on O3 formation during the initial stage, that N2O5 + H2O --> 2HNO3 reduces the maximum O3 concentration, and that the wall losses of both O3 and NO2 have little impact on O3 formation. The results from the ethene-NOx and benzene-NOx experiments are in good agreement with MCM simulations, which indicates that the facility is reliable for studying the formation of secondary ozone and secondary organic aerosol pollution and can be used in further in-depth studies of chemical processes in the atmosphere.
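    The wall-loss rate constants quoted above describe first-order decay, C(t) = C0·exp(−kt); a quick sketch of what they imply over a typical experiment duration:

    ```python
    # Fraction of O3 remaining after first-order wall loss.
    import math

    def remaining_fraction(k_per_s: float, hours: float) -> float:
        return math.exp(-k_per_s * hours * 3600.0)

    # Old-reactor O3 wall-loss constants at the two humidities reported
    for rh, k in [("5%", 1.0e-5), ("91%", 4.0e-5)]:
        frac = remaining_fraction(k, 6.0)
        print(f"RH={rh}: {frac:.2f} of O3 left after 6 h")
    ```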

  3. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  4. Software engineering practices for control system reliability

    International Nuclear Information System (INIS)

    S. K. Schaffner; K. S White

    1999-01-01

    This paper will discuss software engineering practices used to improve control system reliability. The authors begin with a brief discussion of the Software Engineering Institute's Capability Maturity Model (CMM), which is a framework for evaluating and improving key practices used to enhance software development and maintenance capabilities. The software engineering processes developed and used by the Controls Group at the Thomas Jefferson National Accelerator Facility (Jefferson Lab), which uses the Experimental Physics and Industrial Control System (EPICS) for accelerator control, are then described. Examples are given of how these procedures have been used to minimize control system downtime and improve reliability. While the examples are primarily drawn from experience with EPICS, the practices are equally applicable to any control system. Specific issues addressed include resource allocation, developing reliable software lifecycle processes, and risk management

  5. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  6. Experience of Initial Symptoms of Breast Cancer and Triggers for Action in Ethiopia

    International Nuclear Information System (INIS)

    Dye, T.D.; Hobden, C.; Reeler, A.; Dye, T.D.; Bogale, S.; Tilahun, Y.; Deressa, T.

    2012-01-01

    Objective. This study assessed the initial experiences, symptoms, and actions of patients in Ethiopia ultimately determined to have breast cancer. Methods. 69 participants in a comprehensive breast cancer treatment program at the main national cancer hospital in Ethiopia were interviewed using mixed qualitative and quantitative approaches. Participants' narratives of their initial cancer experience were coded and analyzed for themes around their symptoms, time to seeking advice, triggers for action, and contextual factors. The assessment was approved by the Addis Ababa University Faculty of Medicine Institutional Review Board. Results. Nearly all women first noticed lumps, though few sought medical advice within the first year (average time to action: 1.5 years). Eventually, changes in their symptoms motivated most participants to seek advice. Most participants did not think the initial lump would be cancer, nor was a lump of any particular concern until symptoms changed. Conclusion. Given the frequency with which lumps are the first symptom noticed, raising awareness among women in Ethiopia that lumps should trigger medical consultation could contribute significantly to more rapid medical advice-seeking. Primary care sites should be trained and equipped to offer evaluation of lumps so that women can be referred appropriately for assessment if needed.

  7. Elements of strategic capability for software outsourcing enterprises based on the resource

    Science.gov (United States)

    Shi, Wengeng

    2011-10-01

    Software outsourcing enterprises are emerging high-tech enterprises, and both their rate of growth and their number have been remarkable. Beyond the preferential policies China grants to software outsourcing, a software outsourcing business must be able to upgrade its capabilities, and in general software companies have not had the related characteristics. Viewed from resource-based theory, we analyze how software outsourcing companies possess capabilities and resources that are rare, valuable and difficult to imitate, and we attempt to give an initial framework for theoretical analysis on this basis.

  8. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    Science.gov (United States)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground post-processing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  9. An Initial Load-Based Green Software Defined Network

    Directory of Open Access Journals (Sweden)

    Ying Hu

    2017-05-01

    Full Text Available Software defined network (SDN) is a new network architecture in which the control function is decoupled from the data forwarding plane, and it is attracting wide attention from both the research and industry sectors. However, SDN still faces the energy waste problem of traditional networks. At present, research on energy saving in SDN focuses mainly on static optimization of the network, assuming zero load: when new traffic arrives, the optimization changes the transmission paths of uncompleted flows that arrived earlier, possibly resulting in route oscillation and other deleterious effects. To avoid this, a dynamic energy saving optimization scheme is designed in which the paths of uncompleted flows are not changed when new traffic arrives. To find the optimal solution for energy saving, the problem is modeled as a mixed integer linear programming (MILP) problem. Because the problem's complexity makes the optimal solution impractical to compute, an improved heuristic routing algorithm, the improved constant weight greedy algorithm (ICWGA), is proposed to find a sub-optimal solution. Simulation results show that the energy saving capacity of ICWGA is close to that of the optimal solution, offering desirable improvement in the energy efficiency of the network.
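
    The abstract does not spell out ICWGA, but the constant-weight greedy idea can be sketched: route each new flow preferentially over links that are already powered on, so idle links can stay asleep and the paths of existing flows are left untouched. A minimal Python sketch under those assumptions (the cost model and all names are illustrative, not the paper's algorithm):

        import heapq

        def greedy_energy_path(graph, active_links, src, dst):
            """Shortest path where reusing a powered-on link is free and
            waking a sleeping link costs 1, so new flows pack onto the
            already-active subgraph whenever possible."""
            dist, prev, heap = {src: 0}, {}, [(0, src)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == dst:
                    break
                if d > dist.get(u, float("inf")):
                    continue
                for v in graph[u]:
                    cost = 0 if frozenset((u, v)) in active_links else 1
                    if d + cost < dist.get(v, float("inf")):
                        dist[v], prev[v] = d + cost, u
                        heapq.heappush(heap, (d + cost, v))
            if dst not in prev and dst != src:
                raise ValueError("no path from %s to %s" % (src, dst))
            path, node = [dst], dst
            while node != src:
                node = prev[node]
                path.append(node)
            return list(reversed(path))

    Sleeping links on the returned path are then powered on and added to active_links; links released by completed flows can sleep again once unused.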

  10. Smart roadside initiative : user manual.

    Science.gov (United States)

    2015-09-01

    This document provides the user instructions for the Smart Roadside Initiative (SRI) applications including mobile and web-based SRI applications. These applications include smartphone-enabled information exchange and notification, and software compo...

  11. Top 10 metrics for life science software good practices [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Haydee Artaza

    2016-08-01

    Full Text Available Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previous efforts in being a community-driven grassroots approach, in which experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus on understanding and prioritising good practices; we also assess their feasibility for implementation and publish them here.

  12. Progressive retry for software error recovery in distributed systems

    Science.gov (United States)

    Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.

    1993-01-01

    In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.
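
    The scheme can be caricatured in a few lines: checkpoint the state, and when a replay fails, roll back and replay the pending messages in a different order, increasing nondeterminism on each attempt. A toy single-process Python sketch (names invented; the paper's scheme also progressively widens the rollback scope across processes):

        import copy
        import random

        def progressive_retry(state, messages, handler, max_attempts=5):
            """Checkpoint, then retry message replay with reordering until
            the software error is bypassed or attempts are exhausted."""
            checkpoint = copy.deepcopy(state)
            order = list(messages)
            for attempt in range(max_attempts):
                trial = copy.deepcopy(checkpoint)   # roll back to the checkpoint
                try:
                    for msg in order:
                        handler(trial, msg)         # replay logged messages
                    return trial                    # retry succeeded
                except Exception:
                    random.shuffle(order)           # reorder to perturb the execution
            raise RuntimeError("software error not bypassed after retries")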

  13. Editorial Management serials online: construction process, publication and administration with free software solutions

    Directory of Open Access Journals (Sweden)

    Andrés Vuotto

    2013-10-01

    Full Text Available The first part of the paper raises the main points to consider in planning and building an online publication of a scientific nature, with emphasis on the editorial process and functions, document preservation, access management, indexing and visibility. The second part presents a proposed solution for each aspect previously described, highlighting the work of the information professional and how free software optimizes time, cost and results, drawing on a concrete experience with the Open Journal System platform in the journal portal of the Faculty of Humanities at the Universidad Nacional de Mar del Plata.

  14. Research initiatives for plug-and-play scientific computing

    International Nuclear Information System (INIS)

    McInnes, Lois Curfman; Dahlgren, Tamara; Nieplocha, Jarek; Bernholdt, David; Allan, Ben; Armstrong, Rob; Chavarria, Daniel; Elwasif, Wael; Gorton, Ian; Kenny, Joe; Krishan, Manoj; Malony, Allen; Norris, Boyana; Ray, Jaideep; Shende, Sameer

    2007-01-01

    This paper introduces three component technology initiatives within the SciDAC Center for Technology for Advanced Scientific Component Software (TASCS) that address ever-increasing productivity challenges in creating, managing, and applying simulation software to scientific discovery. By leveraging the Common Component Architecture (CCA), a new component standard for high-performance scientific computing, these initiatives tackle difficulties at different but related levels in the development of component-based scientific software: (1) deploying applications on massively parallel and heterogeneous architectures, (2) investigating new approaches to the runtime enforcement of behavioral semantics, and (3) developing tools to facilitate dynamic composition, substitution, and reconfiguration of component implementations and parameters, so that application scientists can explore tradeoffs among factors such as accuracy, reliability, and performance.

  15. Deployment of the CMS software on the WLCG Grid

    International Nuclear Information System (INIS)

    Behrenhoff, W; Wissing, C; Kim, B; Blyweert, S; D'Hondt, J; Maes, J; Maes, M; Mulders, P Van; Villella, I; Vanelderen, L

    2011-01-01

    The CMS Experiment is taking high energy collision data at CERN. The computing infrastructure used to analyse the data is distributed around the world in a tiered structure. In order to use the 7 Tier-1 sites, the 50 Tier-2 sites and a still growing number of about 30 Tier-3 sites, the CMS software has to be available at those sites. Except for a very few sites, the deployment and removal of CMS software is managed centrally. Since the deployment team has no local accounts at the remote sites, all installation jobs have to be sent as Grid jobs. Via a VOMS role the job has a high priority in the batch system and gains write privileges to the software area. Due to the lack of interactive access, the installation jobs must be very robust against possible failures, in order not to leave a broken software installation behind. The CMS software is packaged in RPMs that are installed in the software area independent of the host OS. The apt-get tool is used to resolve package dependencies. This paper reports on recent deployment experience and the achieved performance.

  16. Incubator Display Software Cost Reduction Toolset Software Requirements Specification

    Science.gov (United States)

    Moran, Susanne; Jeffords, Ralph

    2005-01-01

    The Incubator Display Software Requirements Specification was initially developed by Intrinsyx Technologies Corporation (Intrinsyx) under subcontract to Lockheed Martin, Contract Number NAS2-02090, for the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC) Space Station Biological Research Project (SSBRP). The Incubator Display is a User Payload Application (UPA) used to control an Incubator subrack payload for the SSBRP. The Incubator Display functions on-orbit as part of the subrack payload laptop, on the ground as part of the Communication and Data System (CDS) ground control system, and also as part of the crew training environment.

  17. [Initiation, promotion, initiation experiments with radon and cigarette smoke: Lung tumors in rats]. Progress report

    International Nuclear Information System (INIS)

    Moolgavkar, S.H.

    1994-01-01

    During the past several years, the authors have made considerable progress in modeling carcinogenesis in general, and radiation carcinogenesis in particular. They present an overview of their progress in developing stochastic carcinogenesis models and applying them to experimental and epidemiologic data sets. Traditionally, cancer models have been used for the analysis of incidence (or prevalence) data in epidemiology and time-to-tumor data in experimental studies. The relevant quantities for the analysis of these data are the hazard function and the probability of tumor. The derivation of these quantities is briefly described here. More recently, the authors began to use these models for the analysis of data on intermediate lesions on the pathway to cancer. Such data are available in experimental carcinogenesis studies, in particular in initiation and promotion studies on the mouse skin and the rat liver. If, however, quantitative information on intermediate lesions on the pathway to lung cancer were to become available at some future date, the methods that they have developed for the analysis of initiation-promotion experiments could easily be applied to the analysis of these lesions. The mathematical derivations here are couched in terms of a particular two-mutation model of carcinogenesis. Extension to models postulating more than two mutations is not always straightforward.
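
    For orientation, the hazard function of the two-mutation model is usually quoted in its standard small-probability approximation (a textbook form, not taken from this report): with X susceptible normal cells, first and second mutation rates \mu_1 and \mu_2, and division and death rates \alpha and \beta for intermediate cells,

        h(t) \approx \frac{\mu_1\,\mu_2\,X}{\alpha-\beta}\left(e^{(\alpha-\beta)t}-1\right),
        \qquad
        P(t) = 1-\exp\!\left(-\int_0^t h(s)\,ds\right),

    where P(t) is the probability of tumor by time t.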

  18. Software Development in the Water Sciences: a view from the divide (Invited)

    Science.gov (United States)

    Miles, B.; Band, L. E.

    2013-12-01

    While statistical methods are an important part of many earth scientists' training, these scientists often learn the bulk of their software development skills in an ad hoc, just-in-time manner. Yet to carry out contemporary research, scientists are spending more and more time developing software. Here I present perspectives - as an earth sciences graduate student with professional software engineering experience - on the challenges scientists face adopting software engineering practices, with an emphasis on areas of the science software development lifecycle that could benefit most from improved engineering. This work builds on experience gained as part of the NSF-funded Water Science Software Institute (WSSI) conceptualization award (NSF Award # 1216817). Throughout 2013, the WSSI team held a series of software scoping and development sprints with the goals of: (1) adding features to better model green infrastructure within the Regional Hydro-Ecological Simulation System (RHESSys); and (2) infusing test-driven agile software development practices into the processes employed by the RHESSys team. The goal of efforts such as the WSSI is to ensure that investments by current and future scientists in software engineering training will enable transformative science by improving both scientific reproducibility and researcher productivity. Experience with the WSSI indicates: (1) the potential for achieving this goal; and (2) while scientists are willing to adopt some software engineering practices, transformative science will require continued collaboration between domain scientists and cyberinfrastructure experts for the foreseeable future.

  19. Development of a computer-aided design software for dental splint in orthognathic surgery

    Science.gov (United States)

    Chen, Xiaojun; Li, Xing; Xu, Lu; Sun, Yi; Politis, Constantinus; Egger, Jan

    2016-12-01

    In orthognathic surgery, dental splints are important and necessary to help the surgeon reposition the maxilla or mandible. However, the traditional manual design of dental splints is difficult and time-consuming, and research on computer-aided design software for dental splints is rarely reported. Our purpose is to develop a novel special-purpose software package, EasySplint, to design dental splints conveniently and efficiently. The design can be divided into two steps: generation of the initial splint base and a Boolean operation between it and the maxilla-mandibular model. The initial splint base is formed by ruled surfaces reconstructed using manually picked points. Then, a method to accomplish the Boolean operation based on the distance fields of the two meshes is proposed. Interference elimination can be conducted on the basis of the marching cubes algorithm and the Boolean operation. The accuracy of the dental splint can be guaranteed since the original mesh is utilized to form the result surface. Using EasySplint, a dental splint can be designed in about 10 minutes and saved as a stereolithography (STL) file for 3D printing in clinical applications. Three phantom experiments were conducted and the efficiency of our method was demonstrated.
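
    The Boolean step on distance fields is simple to state: with both meshes sampled as signed distance fields on a common grid (negative inside), the splint base minus the jaw model is a pointwise maximum, and the zero level set of the result can be re-meshed. A minimal sketch assuming that sign convention (this is the generic distance-field Boolean, not EasySplint's code):

        import numpy as np

        def boolean_difference(sdf_splint, sdf_jaw):
            """Difference A \\ B of two shapes given as signed distance
            fields on the same grid: inside the result iff inside the
            splint and outside the jaw, i.e. max(d_splint, -d_jaw)."""
            return np.maximum(sdf_splint, -sdf_jaw)

    The resulting field can be passed to a marching cubes implementation (for example skimage.measure.marching_cubes) to extract the printable surface.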

  20. Proceedings of the Eighteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1993-01-01

    The workshop provided a forum for software practitioners from around the world to exchange information on the measurement, use, and evaluation of software methods, models, and tools. This year, approximately 450 people attended the workshop, which consisted of six sessions on the following topics: the Software Engineering Laboratory, measurement, technology assessment, advanced concepts, process, and software engineering issues in NASA. Three presentations were given in each of the topic areas. The content of those presentations and the research papers detailing the work reported are included in these proceedings. The workshop concluded with a tutorial session on how to start an Experience Factory.

  1. Mapping patients' experiences from initial symptoms to gout diagnosis: a qualitative exploration.

    Science.gov (United States)

    Liddle, Jennifer; Roddy, Edward; Mallen, Christian D; Hider, Samantha L; Prinjha, Suman; Ziebland, Sue; Richardson, Jane C

    2015-09-14

    To explore patients' experiences from initial symptoms to receiving a diagnosis of gout. Data from in-depth semistructured interviews were used to construct themes to describe key features of patients' experiences of gout diagnosis. A maximum variation sample of 43 UK patients with gout (29 men; 14 women; age range 32-87 years) were recruited from general practices, rheumatology clinics, gout support groups and through online advertising. Severe joint pain, combined with no obvious signs of physical trauma or knowledge of injury, caused confusion for patients attempting to interpret their symptoms. Reasons for delayed consultation included self-diagnosis and/or self-medication, reluctance to seek medical attention, and financial/work pressures. Factors potentially contributing to delayed diagnosis after consultation included reported misdiagnosis, attacks in joints other than the first metatarsophalangeal joint, and female gender. The limitations in using serum uric acid (SUA) levels for diagnostic purposes were not always communicated effectively to patients, and led to uncertainty and lack of confidence in the accuracy of the diagnosis. Resistance to the diagnosis occurred in response to patients' beliefs about the causes of gout and characteristics of the people likely to be affected. Diagnosis prompted actions, such as changes in diet, and evidence was found of self-monitoring of SUA levels. This study is the first to report data specifically about patients' pathways to initial consultation and subsequent experiences of gout diagnosis. A more targeted approach to information provision at diagnosis would improve patients' experiences.

  2. 4th International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Calvo-Manzano, Jose

    2016-01-01

    This book contains a selection of papers from The 2015 International Conference on Software Process Improvement (CIMPS’15), held between the 28th and 30th of October in Mazatlán, Sinaloa, México. The CIMPS’15 is a global forum for researchers and practitioners that present and discuss the most recent innovations, trends, results, experiences and concerns in the several perspectives of Software Engineering with clear relationship but not limited to software processes, Security in Information and Communication Technology and Big Data Field. The main topics covered are: Organizational Models, Standards and Methodologies, Knowledge Management, Software Systems, Applications and Tools, Information and Communication Technologies and Processes in non-software domains (Mining, automotive, aerospace, business, health care, manufacturing, etc.) with a demonstrated relationship to software process challenges.

  3. Feasibility study for the redesign of MDOT's pavement management systems software.

    Science.gov (United States)

    2011-04-01

    In August of 2006 the Mississippi Department of Transportation (MDOT) initiated State Study No. 191, entitled Feasibility Study for the Redesign of MDOT's Pavement Management System (PMS) Software. At the initiation of this study, the Dep...

  4. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers rely on their own work experience and on communication with software developers to describe the software under test and to write test cases by hand, which is time-consuming, inefficient and error-prone. With the high-reliability model-based testing (MBT) tools developed by our company, one-time modeling can automatically generate test case documents, which is efficient and accurate. For a UML model to describe a process accurately, path coverage must be achieved; however, existing path generation algorithms are either too simple, unable to combine branch paths with loop paths, or too cumbersome, generating so many paths that most are meaningless and superfluous for aerospace software testing. Drawing on our experience with aerospace projects, we developed a tailored path generation algorithm for UML graphic modeling of aerospace test software.
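
    The middle ground the abstract argues for, paths that cover every branch while passing through loops only a bounded number of times, can be sketched as a depth-first enumeration with a visit cap per node. A Python sketch of that idea (a generic bounded-loop enumeration, not the authors' algorithm):

        def generate_paths(graph, start, end, max_visits=2):
            """Enumerate start-to-end paths in a directed graph, letting
            each node appear at most max_visits times so loop bodies are
            exercised without producing unbounded, meaningless paths."""
            paths = []

            def dfs(node, path, visits):
                if visits.get(node, 0) >= max_visits:
                    return
                visits = {**visits, node: visits.get(node, 0) + 1}
                path = path + [node]
                if node == end:
                    paths.append(path)
                for successor in graph.get(node, ()):   # covers every branch
                    dfs(successor, path, visits)

            dfs(start, [], {})
            return paths

    With max_visits=2, a simple loop typically contributes both a path that skips it and a path that traverses it once, which is usually what a test-case document needs.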

  5. Class 1E software verification and validation: Past, present, and future

    Energy Technology Data Exchange (ETDEWEB)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  6. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skills...

  7. Initiation process of a thrust fault revealed by analog experiments

    Science.gov (United States)

    Yamada, Yasuhiro; Dotare, Tatsuya; Adam, Juergen; Hori, Takane; Sakaguchi, Hide

    2016-04-01

    We conducted 2D (cross-sectional) analog experiments with dry sand, using a high resolution digital image correlation (DIC) technique, to reveal the initiation process of a thrust fault in detail, and identified a number of "weak shear bands" and minor uplift prior to the thrust initiation. The observations suggest that the process can be divided into three stages. Stage 1: characterized by a series of abrupt and short-lived weak shear bands at the location where the thrust will later be generated. Before initiation of the fault, the area that will become the hanging wall starts to uplift. Stage 2: defined by the generation of the new thrust and its active displacement. The location of the new thrust seems to be constrained by its associated back-thrust, produced at the foot of the surface slope (by the previous thrust). The activity of the previous thrust drops to zero once the new thrust is generated, but the timing of these two events is not the same. Stage 3: characterized by constant displacement along the (new) thrust. Similar minor shear bands can be seen in the toe area of the Nankai accretionary prism, SW Japan, and we can correlate along-strike variations in seismic profiles with the model results that show the characteristic features of each thrust development stage.

  8. Software Defined Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Foster, Ian; Blaiszik, Ben; Chard, Kyle; Chard, Ryan

    2017-07-17

    Within and across thousands of science labs, researchers and students struggle to manage data produced in experiments, simulations, and analyses. Largely manual research data lifecycle management processes mean that much time is wasted, research results are often irreproducible, and data sharing and reuse remain rare. In response, we propose a new approach to data lifecycle management in which researchers are empowered to define the actions to be performed at individual storage systems when data are created or modified: actions such as analysis, transformation, copying, and publication. We term this approach software-defined cyberinfrastructure because users can implement powerful data management policies by deploying rules to local storage systems, much as software-defined networking allows users to configure networks by deploying rules to switches. We argue that this approach can enable a new class of responsive distributed storage infrastructure that will accelerate research innovation by allowing any researcher to associate data workflows with data sources, whether local or remote, for such purposes as data ingest, characterization, indexing, and sharing. We report on early experiments with this approach in the context of experimental science, in which a simple if-trigger-then-action (IFTA) notation is used to define rules.
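
    The if-trigger-then-action notation is easy to picture. A minimal Python sketch of such a rule engine (all names invented; the paper's system evaluates rules at the storage systems themselves):

        RULES = []

        def rule(trigger):
            """Register an if-trigger-then-action (IFTA) rule: when an
            event satisfies the trigger predicate, run the action."""
            def register(action):
                RULES.append((trigger, action))
                return action
            return register

        @rule(lambda e: e["type"] == "created" and e["path"].endswith(".h5"))
        def ingest_and_index(event):
            print("characterize, index and share", event["path"])

        def dispatch(event):
            for trigger, action in RULES:
                if trigger(event):
                    action(event)

        dispatch({"type": "created", "path": "/data/scan042.h5"})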

  9. Experience with Intel's many integrated core architecture in ATLAS software

    International Nuclear Information System (INIS)

    Fleischmann, S; Neumann, M; Kama, S; Lavrijsen, W; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  10. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    Science.gov (United States)

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based since it uses publications on the specific problem as a surrogate for stakeholder interests, in order to formulate risks and testing experiences. This complements agile software development models, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a software requirements list used to develop software systems for patient registries.

  11. Organizational Change Perspectives on Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Mathiassen, Lars; Balshøj, Hans Henrik

    Many software organizations have engaged in Software Process Improvement (SPI) and experienced the challenges related to managing such complex organizational change efforts. As a result, there is an increasing body of research investigating change management in SPI. To provide an overview of what......, and brain perspectives. Practitioners may use these articles as a guide to SPI insights relevant to their improvement initiatives. In contrast, the impact of culture, dominance, psychic prison, flux and transformation, and politics in SPI have only received scant attention. We argue that these perspectives...

  12. Quality assurance of the modernized Dukovany I and C safety system software

    International Nuclear Information System (INIS)

    Karpeta, C.

    2005-01-01

    The approach to quality assurance of the software that implements the instrumentation and control functions for safety category A as per IEC 61226, which has been adopted within the 'NPP Dukovany I and C Refurbishment' project, is described. A survey of the requirements for software quality assurance of the systems that initiate protection interventions in the event of anticipated operational occurrences or accident conditions is given. The software development process applied by the system designers and manufacturers, from the software requirements specification phase to the software testing phase, is outlined. Basic information on technical audits of the software development process is also provided. (orig.)

  13. Software Acquisition Management Practical Experience

    Science.gov (United States)

    2009-04-23

    [OCR of presentation slides; most of the text is unrecoverable. Legible fragments mention a complete mission software suite, the Air Force Air Mobility Command declaring Initial Operational Capability, requirements management as fundamental to a controlled and disciplined process, Warner Robins, GA 31088, and the FAA NAS Plan (TDWR) statement of work for software development management and engineering.]

  14. MODEL: A software suite for data acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Sendall, D M; Boissat, C; Bozzoli, W; Burkimsher, P; Jones, R; Matheys, J P; Mornacchi, G; Nguyen, T; Vyvre, P vande; Vascotto, A; Weaver, D [European Organization for Nuclear Research, Geneva (Switzerland). DD Div.

    1989-12-01

    MODEL is a new suite of modular data-acquisition software. It is aimed at the needs of LEP experiments, and is also general enough to be more widely used. It can accommodate a variety of user styles. It runs on a set of loosely coupled processors, and makes use of the remote procedure call technique. Implemented originally for the VAX family, some of its services have already been extended to other systems, including embedded microprocessors. The software modules available include facilities for data-flow management, a framework for monitoring programs, a window-oriented human interface, an error message utility, a process control utility and a run control scheme. It is already in use in a variety of experiments, and is still under development in the light of user experience. (orig.).
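
    The remote procedure call technique at the heart of MODEL remains the standard way to glue loosely coupled processes together. As a modern stand-in (Python's standard-library XML-RPC, with invented function names; MODEL itself predates this machinery), the pattern looks like:

        from xmlrpc.server import SimpleXMLRPCServer

        def start_run(number):
            """A run-control operation exposed to remote processes."""
            return "run %d started" % number

        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(start_run)
        # server.serve_forever()  # blocking; a client in another process would call:
        # import xmlrpc.client
        # xmlrpc.client.ServerProxy("http://localhost:8000").start_run(42)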

  15. Formal Methods: Practice and Experience

    DEFF Research Database (Denmark)

    Woodcock, Jim; Larsen, Peter Gorm; Bicarregui, Juan

    2009-01-01

    Based on this, we discuss the issues surrounding the industrial adoption of formal methods. Finally, we look to the future and describe the development of a Verified Software Repository, part of the worldwide Verified Software Initiative. We introduce the initial projects being used to populate the repository, and describe the challenges they address. © 2009 ACM. (146 refs.)

  16. Product derivation in software product families : a case study

    NARCIS (Netherlands)

    Deelstra, S; Sinnema, M; Bosch, J

    2005-01-01

    From our experience with several organizations that employ software product families, we have learned that, contrary to popular belief, deriving individual products from shared software assets is a time-consuming and expensive activity. In this paper we therefore present a study that investigated

  17. Object oriented reconstruction software for the Instrumented Flux Return of BABAR

    CERN Document Server

    Nardo, E D; Lista, L

    2001-01-01

    The BABAR experiment is the first High Energy Physics experiment to extensively use object oriented technology and the C++ programming language for online and offline software. Object orientation permits a high level of flexibility and maintainability of the code, which is a key point in a large project with many developers. These goals are reached through the introduction of reusable code elements, abstraction of code behaviours, and polymorphism. Software design, before code implementation, is the key task that determines the achievement of such a goal. We present our experience with the application of object oriented technology and design patterns to the reconstruction software of the Instrumented Flux Return detector of the BABAR experiment. The use of abstract interfaces improved the development of reconstruction code, permitted flexible modification of reconstruction strategies, and eventually reduced the maintenance load. The experience gained during the last years of development is presented....

  18. EDS operator and control software

    International Nuclear Information System (INIS)

    Ott, L.L.

    1985-04-01

    The Enrichment Diagnostic System (EDS) was developed at Lawrence Livermore National Laboratory (LLNL) to acquire, display and analyze large quantities of transient data for a real-time Advanced Vapor Laser Isotope Separation (AVLIS) experiment. Major topics discussed in this paper are the EDS operator interface (SHELL) program, the data acquisition and analysis scheduling software, and the graphics software. The workstation concept used in EDS, the software used to configure a user's workstation, and the ownership and management of a diagnostic are described. An EDS diagnostic is a combination of hardware and software designed to study specific aspects of the process. Overall system performance is discussed from the standpoint of scheduling techniques, evaluation tools, optimization techniques, and program-to-program communication methods. EDS is based on a data driven design which keeps the need to modify software to a minimum. This design requires a fast and reliable data base management system. A third party data base management product, Berkeley Software System Database, written explicitly for HP1000's, is used for all EDS data bases. All graphics is done with an in-house graphics product, Device Independent Graphics Library (DIGLIB). Examples of devices supported by DIGLIB are: Versatec printer/plotters, Raster Technologies Graphic Display Controllers, and HP terminals (HP264x and HP262x). The benefits derived by using HP hardware and software as well as obstacles imposed by the HP environment are presented in relation to EDS development and implementation

  19. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices; TOPICAL

    International Nuclear Information System (INIS)

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document

  20. Towards a comprehensive framework for reuse: A reuse-enabling software evolution environment

    Science.gov (United States)

    Basili, V. R.; Rombach, H. D.

    1988-01-01

    Reuse of products, processes and knowledge will be the key to enable the software industry to achieve the dramatic improvement in productivity and quality required to satisfy the anticipated growing demand. Although experience shows that certain kinds of reuse can be successful, general success has been elusive. A software life-cycle technology which allows broad and extensive reuse could provide the means to achieving the desired order-of-magnitude improvements. The scope of a comprehensive framework for understanding, planning, evaluating and motivating reuse practices and the necessary research activities is outlined. As a first step towards such a framework, a reuse-enabling software evolution environment model is introduced which provides a basis for the effective recording of experience, the generalization and tailoring of experience, the formalization of experience, and the (re-)use of experience.

  1. The BTeV Software Tutorial Suite

    International Nuclear Information System (INIS)

    Kutschke, Robert K.

    2004-01-01

    The BTeV Collaboration is starting to develop its C++ based offline software suite, an integral part of which is a series of tutorials. These tutorials are targeted at a diverse audience, including new graduate students, experienced physicists with little or no C++ experience, those with just enough C++ to be dangerous, and experts who need only an overview of the available tools. The tutorials must both teach C++ in general and the BTeV specific tools in particular. Finally, they must teach physicists how to find and use the detailed documentation. This report will review the status of the BTeV experiment, give an overview of the plans for and the state of the software and will then describe the plans for the tutorial suite

  2. Guidance and Control Software Project Data - Volume 3: Verification Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  3. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of a software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effect Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by the software FMEA analysis, being applied to the ATIP software code, which has been integrated and passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during various system tests
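
    The failure-mode template idea can be illustrated with a toy lookup from FBD block types to candidate failure modes, expanded into one FMEA worksheet row per mode. All entries below are invented for illustration; the paper's actual template is not reproduced in the abstract.

        FAILURE_MODE_TEMPLATE = {
            "AND":     ["output stuck high", "output stuck low"],
            "TIMER":   ["never expires", "expires early", "preset corrupted"],
            "COMPARE": ["wrong setpoint", "inverted result"],
        }

        def fmea_rows(blocks):
            """Expand (block_id, block_type) pairs into one FMEA row per
            applicable failure mode from the template."""
            for block_id, block_type in blocks:
                for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unspecified"]):
                    yield {"block": block_id, "type": block_type, "failure_mode": mode}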

  4. The NOvA software testing framework

    International Nuclear Information System (INIS)

    Tamsett, M; Group, C

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are greater than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform, visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total upwards of 14 individual streams are regularly tested amounting to over 70 individual software processes, producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software and thus ensure that data is available for physics analysis in a timely and robust manner. (paper)
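
    The wrap-monitor-report pattern described here is compact to sketch in Python (the command name and the result-record format are illustrative, not NOvA's code):

        import subprocess
        import time

        def run_tier(name, command, timeout=3600):
            """Launch one software tier as a subprocess, monitor it, and
            return a result record for a web front end to render."""
            start = time.time()
            proc = subprocess.run(command, capture_output=True, text=True,
                                  timeout=timeout)
            return {
                "tier": name,
                "passed": proc.returncode == 0,
                "seconds": round(time.time() - start, 1),
                "stderr_tail": proc.stderr[-2000:],
            }

        # e.g. results = [run_tier("calibration", ["nova_calib", "--validate"])]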

  5. Guidance and Control Software Project Data - Volume 2: Development Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  6. SPINAL CORD STIMULATION IN TREATMENT OF THE NEUROPATHIC PAIN SYNDROMES: INITIAL EXPERIENCE

    Directory of Open Access Journals (Sweden)

    D. A. Rzaev

    2010-01-01

    Full Text Available The article describes initial experience with spinal cord stimulation for chronic pain syndromes. Trial stimulation was performed in 62 patients; in 52 cases the trial was successful and a subcutaneous pulse generator was implanted. Maximal follow-up is 26 months. The level of pain was evaluated on the VAS. Permanent pain-relief results were achieved in 46 patients (74.2%). These results correspond to literature data.

  7. The experience of initiating injection drug use and its social context: a qualitative systematic review and thematic synthesis.

    Science.gov (United States)

    Guise, Andy; Horyniak, Danielle; Melo, Jason; McNeil, Ryan; Werb, Dan

    2017-12-01

    Understanding the experience of initiating injection drug use and its social contexts is crucial to inform efforts to prevent transitions into this mode of drug consumption and support harm reduction. We reviewed and synthesized existing qualitative scientific literature systematically to identify the socio-structural contexts for, and experiences of, the initiation of injection drug use. We searched six databases (Medline, Embase, PsychINFO, CINAHL, IBSS and SSCI) systematically, along with a manual search, including key journals and subject experts. Peer-reviewed studies were included if they qualitatively explored experiences of or socio-structural contexts for injection drug use initiation. A thematic synthesis approach was used to identify descriptive and analytical themes throughout studies. From 1731 initial results, 41 studies reporting data from 1996 participants were included. We developed eight descriptive themes and two analytical (higher-order) themes. The first analytical theme focused on injecting initiation resulting from a social process enabled and constrained by socio-structural factors: social networks and individual interactions, socialization into drug-using identities and choices enabled and constrained by social context all combine to produce processes of injection initiation. The second analytical theme addressed pathways that explore varying meanings attached to injection initiation and how they link to social context: seeking pleasure, responses to increasing tolerance to drugs, securing belonging and identity and coping with pain and trauma. Qualitative research shows that injection drug use initiation has varying and distinct meanings for individuals involved and is a dynamic process shaped by social and structural factors. Interventions should therefore respond to the socio-structural influences on injecting drug use initiation by seeking to modify the contexts for initiation, rather than solely prioritizing the reduction of individual

  8. Initial experiment of focusing wiggler of MM wave Free Electron Laser on LAX-1

    International Nuclear Information System (INIS)

    Sakamoto, Keishi; Maebara, Sunao; Watanabe, Akihiko; Kishimoto, Yasuaki; Nagashima, Takashi; Maeda, Hikosuke; Shiho, Makoto; Oda, Hisako; Kawasaki, Sunao.

    1991-03-01

    Initial results of a Free Electron Laser (FEL) experiment in the mm wave region are presented. The experiment is carried out using an induction linac system (LAX-1: Large current Accelerator Experiment) with E_b = 1 MeV and I_b = 1-3 kA. The FEL wiggler is composed of curved-surface magnet arrays (a focusing wiggler), which is found to be effective for transporting a low energy, high current beam through the wiggler. Superradiance in the mm wave region (30 GHz - 40 GHz) is observed. The growth rate of this radiation is 0.42 dB/cm. (author)

  9. Proceedings of the Fifteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1990-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by GSFC and created for the purpose of investigating the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. Fifteen papers were presented at the Fifteenth Annual Software Engineering Workshop in five sessions: (1) SEL at age fifteen; (2) process improvement; (3) measurement; (4) reuse; and (5) process assessment. The sessions were followed by two panel discussions: (1) experiences in implementing an effective measurement program; and (2) software engineering in the 1980's. A summary of the presentations and panel discussions is given.

  10. Lessons in modern digital field geology: Open source software, 3D techniques, and the new world of digital mapping

    Science.gov (United States)

    Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura

    2014-05-01

    Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly on the new, inexpensive tablet hardware. Moreover, 3D imaging offers potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely capture the essence of the field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve, these 3D models should be readily importable into the database.

  11. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1993-10-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  12. Class 1E software verification and validation: Past, present, and future

    International Nuclear Information System (INIS)

    Persons, W.L.; Lawrence, J.D.

    1994-01-01

    This paper discusses work in progress that addresses software verification and validation (V&V) as it takes place during the full software life cycle of safety-critical software. The paper begins with a brief overview of the task description and discussion of the historical evolution of software V&V. A new perspective is presented which shows the entire verification and validation process from the viewpoints of a software developer, product assurance engineer, independent V&V auditor, and government regulator. An account of the experience of the field test of the Verification Audit Plan and Report generated from the V&V Guidelines is presented along with sample checklists and lessons learned from the verification audit experience. Then, an approach to automating the V&V Guidelines is introduced. The paper concludes with a glossary and bibliography.

  13. Expected requirements in support tool for software process improvement in SMEs

    OpenAIRE

    Muñoz Mata, Mirna; Mejía Miranda, Jezreel; Amescua Seco, Antonio; Calvo-Manzano Villalón, José Antonio; Cuevas Agustín, Gonzalo; San Feliu Gilabert, Tomás

    2012-01-01

    Nowadays, being competitive is an important challenge for software development organizations. In order to achieve this, software process improvement has in recent years been an obvious and logical way forward. Unfortunately, even when many organizations are motivated to implement software process initiatives, not all know how best to do so, especially in Small and Medium Enterprises (SMEs), where due to their special features they have to be careful in how they manage their resources to assure their ma...

  14. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there was a growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  15. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  16. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  17. INFOS: spectrum fitting software for NMR analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Albert A., E-mail: alsi@nmr.phys.chem.ethz.ch [ETH Zürich, Physical Chemistry (Switzerland)

    2017-02-15

    Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.
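
    As an illustration of the Monte-Carlo error estimation described above (a sketch, not code from INFOS, which is written in MATLAB and derives its lineshapes from the acquisition parameters), one could refit noise-perturbed synthetic spectra and take the spread of the refitted parameters as the error estimate:

      # Minimal Python sketch of Monte-Carlo error estimation for fitted peak
      # parameters; the Lorentzian stands in for the FT-derived lineshapes.
      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian(f, amp, f0, width):
          return amp * (width / 2) ** 2 / ((f - f0) ** 2 + (width / 2) ** 2)

      def fit_with_mc_errors(freq, spectrum, p0, n_mc=100):
          popt, _ = curve_fit(lorentzian, freq, spectrum, p0=p0)
          noise = np.std(spectrum - lorentzian(freq, *popt))  # residual level
          samples = []
          for _ in range(n_mc):
              # Refit synthetic data: best-fit model plus matched noise.
              synth = lorentzian(freq, *popt) + np.random.normal(0, noise, freq.size)
              p_mc, _ = curve_fit(lorentzian, freq, synth, p0=popt)
              samples.append(p_mc)
          return popt, np.std(samples, axis=0)  # parameters and MC errors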

  18. The Open Source DataTurbine Initiative: Streaming Data Middleware for Environmental Observing Systems

    Science.gov (United States)

    Fountain T.; Tilak, S.; Shin, P.; Hubbard, P.; Freudinger, L.

    2009-01-01

    The Open Source DataTurbine Initiative is an international community of scientists and engineers sharing a common interest in real-time streaming data middleware and applications. The technology base of the OSDT Initiative is the DataTurbine open source middleware. Key applications of DataTurbine include coral reef monitoring, lake monitoring and limnology, biodiversity and animal tracking, structural health monitoring and earthquake engineering, airborne environmental monitoring, and environmental sustainability. DataTurbine software emerged as a commercial product in the 1990s from collaborations between NASA and private industry. In October 2007, a grant from the USA National Science Foundation (NSF) Office of Cyberinfrastructure allowed us to transition DataTurbine from a proprietary software product into an open source software initiative. This paper describes the DataTurbine software and highlights key applications in environmental monitoring.

  19. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  20. Social.Water--Open Source Citizen Science Software for CrowdHydrology

    Science.gov (United States)

    Fienen, M. N.; Lowry, C.

    2013-12-01

    CrowdHydrology is a crowd-sourced citizen science project in which passersby near streams are encouraged to read a gage and send an SMS (text) message with the water level to a number indicated on a sign. The project initially used free services such as Google Voice, Gmail, and Google Maps to acquire and present the data on the internet. Social.Water is open-source software, using Python and JavaScript, that automates the acquisition, categorization, and presentation of the data. Open-source objectives pervade both the project and the software as the code is hosted at Github, only free scripting codes are used, and any person or organization can install a gage and join the CrowdHydrology network. In the first year, 10 sites were deployed in upstate New York, USA. In the second year, expansion to 44 sites throughout the upper Midwest USA was achieved. Comparisons with official USGS and academic measurements have shown low error rates. Citizen participation varies greatly from site to site, so surveys or other social information is sought for insight into why some sites experience higher rates of participation than others.
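
    The acquisition step that Social.Water automates reduces to categorizing free-form text messages. A hypothetical Python sketch of that parsing (station IDs and message format here are made up for illustration, not the project's real conventions) might look like:

      # Extract a station identifier and a water level from an SMS message.
      import re

      STATIONS = {"NY1001", "WI1002"}  # illustrative gage IDs, not real ones

      def parse_message(text):
          """Return (station_id, stage) or None if the text is unusable."""
          m = re.search(r"([A-Z]{2}\d{4})\D*(\d+(?:\.\d+)?)", text.upper())
          if m and m.group(1) in STATIONS:
              return m.group(1), float(m.group(2))
          return None  # uncategorizable messages go to manual review

      print(parse_message("water at ny1001 is 4.25 ft"))  # ('NY1001', 4.25)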

  1. Software upgradation of PXI based data acquisition for Aditya experiments

    International Nuclear Information System (INIS)

    Panchal, Vipul K.; Chavda, Chhaya; Patel, Vijay; Patel, Narendra; Ghosh, Joydeep

    2015-01-01

    The Aditya Data Acquisition and Control System is designed to acquire data from diagnostics such as loop voltage, Rogowski and magnetic probes, and X-rays, and to control gas feed, gate valves, trigger pulse generation, etc. The CAMAC-based data acquisition system was upgraded with PXI-based multifunction modules. The system is interfaced to a PC over an optical link using a PCI-based controller module. Data are acquired through a LabVIEW graphical user interface (GUI) and stored on a server. The present GUI-based application lacks features such as module parameter configuration, analysis, and webcasting. A new LabVIEW application is therefore being developed that supports each module individually with programmable channel configuration: sampling rate, number of pre- and post-trigger samples, selection of active channels, etc. It will also provide access to the timer and counter functionality of the modules. The software is designed to scale to more modules, channels, and crates, with security provided through different levels of user privileges. (author)
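
    The programmable per-module configuration could be pictured as a simple data structure (a rough sketch with illustrative names and values; the actual application is written in LabVIEW):

      # Illustrative per-module acquisition settings for a PXI DAQ system.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class ModuleConfig:
          sampling_rate_hz: float    # per-channel sampling rate
          pre_trigger_samples: int   # samples retained before the trigger
          post_trigger_samples: int  # samples retained after the trigger
          active_channels: List[int] = field(default_factory=list)

      cfg = ModuleConfig(sampling_rate_hz=1e6, pre_trigger_samples=1024,
                         post_trigger_samples=65536, active_channels=[0, 1, 4])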

  2. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core analysis photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  3. Patients' Experiences of Performing Self-care of Stomas in the Initial Postoperative Period.

    Science.gov (United States)

    Lim, Siew Hoon; Chan, Sally Wai Chi; He, Hong-Gu

    2015-01-01

    With the loss of an important bodily function and the distortion in body image, a stoma patient experiences physical, psychological, and social changes. With limited current studies exploring experiences of patients in the management of their stoma, there is a need to explore their experiences, their needs, and factors that influence their self-management. The aim of this study was to investigate patients' experiences of performing self-care of stomas in the initial postoperative period. This study adopted a descriptive qualitative approach from the interpretive paradigm. Semistructured interviews were conducted with 12 patients 1 month postoperatively in a colorectal ward in a hospital in Singapore. Thematic analysis was applied to the interview data. Five themes were identified: process of acceptance and self-management of stoma, physical limitations, psychological reactions, social support, and need for timely and sufficient stoma preparation and education. This study highlights the importance of health professionals' role in helping patients adjust preoperatively and postoperatively and accept the presence of a stoma. Health professionals need to be aware of the physical, psychological, and social impact of stoma on patients in the initial 30-day postoperative period. Research findings informed the type and level of assistance and support to be offered to patients by nurses and the importance of encouraging patients to be involved in stoma care at an early stage, which will ultimately contribute to effective and independent self-management. Patients can be prepared preoperatively to reduce the psychological and social impact of stoma after creation of their stoma.

  4. Prostate brachytherapy in Ghana: our initial experience

    Directory of Open Access Journals (Sweden)

    James Edward Mensah

    2016-10-01

    Purpose: This study presents the experience of a brachytherapy team in Ghana with a focus on technology transfer and outcome. The team was initially proctored by experienced physicians from Europe and South Africa. Material and methods: A total of 90 consecutive patients underwent either brachytherapy alone or brachytherapy in combination with external beam radiotherapy for prostate carcinoma between July 2008 and February 2014 at Korle Bu Teaching Hospital, Accra, Ghana. Patients were classified as low-, intermediate-, and high-risk according to the National Comprehensive Cancer Network (NCCN) criteria. All low-risk and some intermediate-risk group patients were treated with seed implantation alone. Some intermediate- and all high-risk group patients received brachytherapy combined with external beam radiotherapy. Results: The median patient age was 64.0 years (range 46-78 years). The median follow-up was 58 months (range 18-74 months). Twelve patients experienced biochemical failure, including one patient who had evidence of metastatic disease and died of prostate cancer. Freedom from biochemical failure rates for low-, intermediate-, and high-risk cases were 95.4%, 90.9%, and 70.8%, respectively. Clinical parameters predictive of biochemical outcome included clinical stage, Gleason score, and risk group. Pre-treatment prostate specific antigen (PSA) was not a statistically significant predictor of biochemical failure. Sixty-nine patients (76.6%) experienced grade 1 urinary symptoms in the form of frequency, urgency, and poor stream. These symptoms were mostly self-limiting. Four patients needed catheterization for urinary retention (grade 2). One patient developed a rectourethral fistula (grade 3) following banding for hemorrhoids. Conclusions: Our results compare favorably with those reported by other institutions with more extensive experience. We believe therefore that interstitial permanent brachytherapy can be safely and effectively

  5. NASA Software Engineering Benchmarking Study

    Science.gov (United States)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    was its software assurance practices, which seemed to rate well in comparison to the other organizational groups and also seemed to include a larger scope of activities. An unexpected benefit of the software benchmarking study was the identification of many opportunities for collaboration in areas including metrics, training, sharing of CMMI experiences and resources such as instructors and CMMI Lead Appraisers, and even sharing of assets such as documented processes. A further unexpected benefit of the study was the feedback on NASA practices that was received from some of the organizations interviewed. From that feedback, other potential areas where NASA could improve were highlighted, such as accuracy of software cost estimation and budgetary practices. The detailed report contains discussion of the practices noted in each of the topic areas, as well as a summary of observations and recommendations from each of the topic areas. The resulting 24 recommendations from the topic areas were then consolidated to eliminate duplication and culled into a set of 14 suggested actionable recommendations. This final set of actionable recommendations, listed below, are items that can be implemented to improve NASA's software engineering practices and to help address many of the items that were listed in the NASA top software engineering issues. 1. Develop and implement standard contract language for software procurements. 2. Advance accurate and trusted software cost estimates for both procured and in-house software and improve the capture of actual cost data to facilitate further improvements. 3. Establish a consistent set of objectives and expectations, specifically types of metrics at the Agency level, so key trends and models can be identified and used to continuously improve software processes and each software development effort. 4. Maintain the CMMI Maturity Level requirement for critical NASA projects and use CMMI to measure organizations developing software for NASA. 5

  6. Eprints Institutional Repository Software: A Review

    Directory of Open Access Journals (Sweden)

    Mike R. Beazley

    2011-01-01

    Setting up an institutional repository (IR) can be a daunting task. There are many software packages out there, some commercial, some open source, all of which offer different features and functionality. This article will provide some thoughts about one of these software packages: Eprints. Eprints was one of the first IR software packages to appear and has been available for 10 years. It is under continual development by its creators at the University of Southampton and the current version is v3.2.3. Eprints is open-source, meaning that anyone can download and make use of the software for free and the software can be modified however the user likes. This presents clear advantages for institutions with smaller budgets and also for institutions that have programmers on staff. Eprints requires some additional software to run: Linux, Apache, MySQL, and Perl. This software is all open-source and already present on the servers of many institutions. There is now a version of Eprints that will run on Windows servers as well, which will make the adoption of Eprints even easier for some. In brief, Eprints is an excellent choice for any institution looking to get an IR up and running quickly and easily. Installation is straightforward as is the initial configuration. Once the IR is up and running, users may upload documents and provide the necessary metadata for the records by filling out a simple web form. Embargoes on published documents are handled elegantly by the software, and the software links to the SHERPA/RoMEO database so authors can easily verify their rights regarding IR submissions. Eprints has some drawbacks, which will be discussed later in the review, but on the whole it is easy to recommend to anyone looking to start an IR. However, it is less clear that an institution with an existing IR based on another software package should migrate to Eprints.

  7. Initial Scaling Studies and Conceptual Thermal Fluids Experiments for the Prismatic NGNP Point Design

    Energy Technology Data Exchange (ETDEWEB)

    D. M. McEligot; G. E. McCreery

    2004-09-01

    The objective of this report is to document the initial high temperature gas reactor scaling studies and conceptual experiment design for gas flow and heat transfer. The general approach of the project is to develop new benchmark experiments for assessment in parallel with CFD and coupled CFD/ATHENA/RELAP5-3D calculations for the same geometry. Two aspects of the complex flow in an NGNP are being addressed: (1) flow and thermal mixing in the lower plenum ("hot streaking" issue) and (2) turbulence and resulting temperature distributions in reactor cooling channels ("hot channel" issue). Current prismatic NGNP concepts are being examined to identify their proposed flow conditions and geometries over the range from normal operation to decay heat removal in a pressurized cooldown. Approximate analyses are being applied to determine key non-dimensional parameters and their magnitudes over this operating range. For normal operation, the flow in the coolant channels can be considered to be dominant forced convection with slight transverse property variation. The flow in the lower plenum can locally be considered to be a situation of multiple buoyant jets into a confined density-stratified crossflow -- with obstructions. Experiments are needed for the combined features of the lower plenum flows. Missing from the typical jet experiments are interactions with nearby circular posts and with vertical posts in the vicinity of vertical walls - with near stagnant surroundings at one extreme and significant crossflow at the other. Two heat transfer experiments are being considered. One addresses the "hot channel" problem, if necessary. The second experiment will treat heated jets entering a model plenum. Unheated MIR (Matched-Index-of-Refraction) experiments are first steps when the geometry is complicated. One does not want to use a computational technique which will not even handle constant properties properly. The MIR experiment will simulate flow features of the paths of jets

  8. Preserving experience through expert systems

    International Nuclear Information System (INIS)

    Jelinek, J.B.; Weidman, S.H.

    1989-01-01

    Expert systems technology, one of the branches in the field of computerized artificial intelligence, has existed for >30 yr but only recently has been made available on commercially standard hardware and software platforms. An expert system can be defined as any method of encoding knowledge by representing that knowledge as a collection of facts or objects. Decisions are made by the expert program by obtaining data about the problem or situation and correlating encoded facts (knowledge) to the data until a conclusion can be reached. Such conclusions can be relayed to the end user as expert advice. Realizing the potential of this technology, General Electric (GE) Nuclear Energy (GENE) has initiated a development program in expert systems applications; this technology offers the potential for packaging, distributing, and preserving nuclear experience in a software form. The paper discusses application fields, effective applications, and knowledge acquisition and knowledge verification

  9. Usability challenges in an Ethiopian software development organization

    DEFF Research Database (Denmark)

    Teka, Degif; Dittrich, Yvonne; Kifle, Mesfin

    2016-01-01

    Usability and user centered design (UCD) are central to software development. In developing countries, the gap between IT development and the local use situation is larger than in western countries. However, usability is neither well addressed in software practice nor at the policy making level...... in Ethiopia. Software practitioners focus on functional requirements, meeting deadlines and budget. The software development industry in Ethiopia is in its early stage. The article aims at understanding usability practices in an Ethiopian software development company. Developers, system analysts, product...... configuration, their experience, cultural knowledge and common sense regarding the users' situation guided the design. Prototypes and fast delivery of working versions helped in getting user feedback even if early user focus proved to be a challenge as communication between developers and users suffered from...

  10. Liver transplantations in Bulgaria--initial experience.

    Science.gov (United States)

    Vladov, N; Mihaylov, V; Takorov, I; Vasilevski, I; Lukanova, T; Odisseeva, E; Katzarov, K; Simonova, M; Tomova, D; Konakchieva, M; Petrov, N; Mladenov, N; Sergeev, S; Mutafchiiski, V

    2014-01-01

    The field of liver transplantation (LT) continues to evolve, and LT is a highly effective therapy for many patients with acute and chronic liver failure resulting from a variety of causes. Improvements in perioperative care, surgical technique, and immunosuppression in recent years have transformed it into a safe and routine procedure with steadily improving results. The aim of this paper is to present the initial experience of the transplant team at the Military Medical Academy - Sofia, Bulgaria. Between April 2007 and August 2014 the team performed 38 liver transplants in 37 patients (one retransplantation). Patients were followed up prospectively and retrospectively. In 36 (95%) patients a graft from a cadaveric donor was used, and in two cases a right liver graft from a living donor. The mean MELD score of the transplanted patients was 17 (9-40). The preferred surgical technique was "piggyback" with preservation of the inferior vena cava in 33 (86%) of the cases, and the classical technique was used in 3 (8%) patients. The overall complication rate was 48%. The early mortality rate was 13% (5 patients). The overall 1- and 5-year survival rates are 81% and 77%, respectively. Setting up a new LT program is a complex process that requires the effort and effective collaboration of a wide range of specialists (hepatologists, surgeons, anesthesiologists, psychologists, therapists, coordinators, etc.) and institutions. Good results are a function of proper selection of donors and recipients. Living donation is an alternative given the shortage of cadaveric donors.

  11. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  12. More about software requirements thorny issues and practical advice

    CERN Document Server

    Wiegers, Karl E

    2006-01-01

    No matter how much instruction you've had on managing software requirements, there's no substitute for experience. Too often, lessons about requirements engineering processes lack the no-nonsense guidance that supports real-world solutions. Complementing the best practices presented in his book, Software Requirements, Second Edition, requirements engineering authority Karl Wiegers tackles even more of the real issues head-on in this book. With straightforward, professional advice and practical solutions based on actual project experiences, this book answers many of the tough questions rais

  13. Development of a flight software testing methodology

    Science.gov (United States)

    Mccluskey, E. J.; Andrews, D. M.

    1985-01-01

    The research to develop a testing methodology for flight software is described. An experiment was conducted in using assertions to dynamically test digital flight control software. The experiment showed that 87% of typical errors introduced into the program would be detected by assertions. Detailed analysis of the test data showed that the number of assertions needed to detect those errors could be reduced to a minimal set. The analysis also revealed that the most effective assertions tested program parameters that provided greater indirect (collateral) testing of other parameters. In addition, a prototype watchdog task system was built to evaluate the effectiveness of executing assertions in parallel by using the multitasking features of Ada.
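
    A minimal sketch of the assertion technique (hypothetical names and limits; the study itself instrumented digital flight control software and ran assertions in a parallel Ada watchdog task) is shown below:

      # Executable assertions that check program parameters at run time; a
      # check on one parameter also collaterally tests the code producing it.
      def assert_in_range(name, value, lo, hi, log):
          if not (lo <= value <= hi):
              log.append(f"ASSERTION FAILED: {name}={value} not in [{lo}, {hi}]")

      def pitch_control_step(sensor_pitch_deg, log):
          assert_in_range("sensor_pitch_deg", sensor_pitch_deg, -90.0, 90.0, log)
          command = 0.1 * sensor_pitch_deg  # simplified control law
          assert_in_range("command", command, -9.0, 9.0, log)
          return command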

  14. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    relationship between objects in 2D and 3D formats, etc. Further, the article explains that STA development is open source and is based on state-of-the-art astrodynamics routines grouped into modules. The modules are programmed in C++. The different STA modules are designed, developed, tested, and verified by the participating universities; software integration and overall validation are performed by ESA. Students are chosen to work on STA modules as part of their Master's or PhD thesis programs. As their experience grows, the students learn how to write documentation for a space project using European Cooperation for Space Standardization (ECSS) standards, how to test and verify the software modules they write, and how to interact with ESA and each other in this process. Finally, the article concludes with the benefits of the STA initiative. The STA project creates a strong link among applied mathematics, space engineering, and informatics by grounding the academic community in requirements coming from the real needs and missions of space agencies and industry.

  15. artdaq: DAQ software development made simple

    Science.gov (United States)

    Biery, Kurt; Flumerfelt, Eric; Freeman, John; Ketchum, Wesley; Lukhanin, Gennadiy; Rechenmacher, Ron

    2017-10-01

    For a few years now, the artdaq data acquisition software toolkit has provided numerous experiments with ready-to-use components which allow for rapid development and deployment of DAQ systems. Developed within the Fermilab Scientific Computing Division, artdaq provides data transfer, event building, run control, and event analysis functionality. This latter feature includes built-in support for the art event analysis framework, allowing experiments to run art modules for real-time filtering, compression, disk writing and online monitoring. As art, also developed at Fermilab, is used for offline analysis as well, a major advantage of artdaq is that it allows developers to easily switch between developing online and offline software. artdaq continues to be improved. Support for an alternate mode of running whereby data from some subdetector components are only streamed if requested has been added; this option will reduce unnecessary DAQ throughput. Real-time reporting of DAQ metrics has been implemented, along with the flexibility to choose the format through which experiments receive the reports; these formats include the Ganglia, Graphite and syslog software packages, along with flat ASCII files. Additionally, work has been performed investigating more flexible modes of online monitoring, including the capability to run multiple online monitoring processes on different hosts, each running its own set of art modules. Finally, a web-based GUI interface through which users can configure details of their DAQ system has been implemented, increasing the ease of use of the system. Already successfully deployed on the LArIAT, DarkSide-50, DUNE 35ton and Mu2e experiments, artdaq will be employed for SBND and is a strong candidate for use on ICARUS and protoDUNE. With each experiment come new ideas for how artdaq can be made more flexible and powerful. The above improvements will be described, along with potential ideas for the future.
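
    The pluggable metric reporting can be pictured as a manager fanning each metric out to the configured back ends. A minimal sketch of the idea (Python pseudocode of the concept, not the actual artdaq C++ plugin interface) follows:

      # Fan each reported DAQ metric out to all configured reporting plugins.
      import time

      class FileMetricPlugin:
          """Write metrics to a flat ASCII file, one of the supported formats."""
          def __init__(self, path):
              self.path = path
          def send(self, name, value, unit):
              with open(self.path, "a") as f:
                  f.write(f"{time.time():.0f} {name} {value} {unit}\n")

      class MetricManager:
          def __init__(self, plugins):
              self.plugins = plugins
          def report(self, name, value, unit=""):
              for p in self.plugins:
                  p.send(name, value, unit)

      metrics = MetricManager([FileMetricPlugin("daq_metrics.txt")])
      metrics.report("event_rate", 1250.0, "Hz")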

  16. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and particularly no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly-named variables contribute more to high quality software than limiting code sizes. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  17. PyMUS: Python-Based Simulation Software for Virtual Experiments on Motor Unit System.

    Science.gov (United States)

    Kim, Hojeong; Kim, Minjung

    2018-01-01

    We constructed a physiologically plausible, computationally efficient model of a motor unit and developed simulation software that allows for integrative investigations of the input-output processing in the motor unit system. The model motor unit was first built by coupling the motoneuron model and muscle unit model to a simplified axon model. To build the motoneuron model, we used a recently reported two-compartment modeling approach that accurately captures the key cell-type-related electrical properties under both passive conditions (somatic input resistance, membrane time constant, and signal attenuation properties between the soma and the dendrites) and active conditions (rheobase current and afterhyperpolarization duration at the soma and plateau behavior at the dendrites). To construct the muscle unit, we used a recently developed muscle modeling approach that reflects the experimentally identified dependencies of muscle activation dynamics on isometric, isokinetic and dynamic variation in muscle length over a full range of stimulation frequencies. Then, we designed the simulation software based on the object-oriented programming paradigm and developed the software using the open-source Python language to be fully operational through graphical user interfaces. Using the developed software, separate simulations can be performed for a single motoneuron, muscle unit and motor unit under a wide range of experimental input protocols, and a hierarchical analysis can be performed from a single channel to the entire system behavior. Our model motor unit and simulation software may represent efficient tools not only for researchers studying the neural control of force production from a cellular perspective but also for instructors and students in motor physiology classroom settings.
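
    The motoneuron-axon-muscle coupling can be illustrated with a deliberately simplified sketch (a leaky integrate-and-fire cell and first-order activation dynamics stand in for the two-compartment and muscle models; names and parameters are illustrative, not the actual PyMUS API):

      # Simplified input-output chain of a model motor unit.
      class Motoneuron:
          def __init__(self, tau_ms=6.0, r_mohm=1.0, v_thresh_mv=10.0):
              self.v, self.tau, self.r, self.v_th = 0.0, tau_ms, r_mohm, v_thresh_mv
          def step(self, i_na, dt):
              # Leaky integrate-and-fire stand-in for the soma/dendrite model.
              self.v += dt / self.tau * (-self.v + self.r * i_na)
              if self.v >= self.v_th:
                  self.v = 0.0
                  return True  # spike, relayed down the (idealized) axon
              return False

      class MuscleUnit:
          def __init__(self, tau_act_ms=40.0, f_max=1.0):
              self.a, self.tau, self.f_max = 0.0, tau_act_ms, f_max
          def step(self, spike, dt):
              # First-order activation dynamics driven by the spike train.
              self.a += dt / self.tau * (-self.a + (10.0 if spike else 0.0))
              return self.f_max * min(self.a, 1.0)

      mn, mu = Motoneuron(), MuscleUnit()
      force = [mu.step(mn.step(15.0, 0.1), 0.1) for _ in range(5000)]  # 500 ms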

  18. Software engineering for the EBR-II data acquisition system conversion

    International Nuclear Information System (INIS)

    Schorzman, W.

    1988-01-01

    The purpose of this paper is to outline how EBR-II engineering approached the data acquisition system (DAS) software conversion project under the constraints of operational transparency and six weeks for final implementation and testing. Software engineering is a relatively new discipline that provides a structured philosophy for software conversion. The software life cycle is structured into six basic steps: 1) initiation, 2) requirements definition, 3) design, 4) programming, 5) testing, and 6) operations. These steps are loosely defined and can be altered to fit specific software applications. DAS software comes from three sources: 1) custom software, 2) system software, and 3) in-house application software. A data flow structure is used to describe the DAS software. The categories are: 1) software used to bring signals into the central processor, 2) software that transforms the analog data to engineering units and then logs the data in the data store, and 3) software used to transport and display the data. The focus of this paper is to describe how the conversion team used a structured engineering approach and utilized the resources available to produce a quality system on time. Although successful, the conversion process presented some pitfalls and stumbling blocks. Working through these obstacles enhanced our understanding and surfaced in the form of LESSONS LEARNED, which are shared in this paper

  19. CLARAty: Challenges and Steps Toward Reusable Robotic Software

    Directory of Open Access Journals (Sweden)

    Richard Madison

    2008-11-01

    We present in detail some of the challenges in developing reusable robotic software. We base that on our experience in developing the CLARAty robotics software, which is a generic object-oriented framework used for the integration of new algorithms in the areas of motion control, vision, manipulation, locomotion, navigation, localization, planning and execution. CLARAty was adapted to a number of heterogeneous robots with different mechanisms and hardware control architectures. In this paper, we also describe how we addressed some of these challenges in the development of the CLARAty software.

  20. CLARAty: Challenges and Steps toward Reusable Robotic Software

    Directory of Open Access Journals (Sweden)

    Issa A.D. Nesnas

    2006-03-01

    We present in detail some of the challenges in developing reusable robotic software. We base that on our experience in developing the CLARAty robotics software, which is a generic object-oriented framework used for the integration of new algorithms in the areas of motion control, vision, manipulation, locomotion, navigation, localization, planning and execution. CLARAty was adapted to a number of heterogeneous robots with different mechanisms and hardware control architectures. In this paper, we also describe how we addressed some of these challenges in the development of the CLARAty software.

  1. NASA Controller Acceptability Study 1(CAS-1) Experiment Description and Initial Observations

    Science.gov (United States)

    Chamberlain, James P.; Consiglio, Maria C.; Comstock, James R., Jr.; Ghatas, Rania W.; Munoz, Cesar

    2015-01-01

    This paper describes the Controller Acceptability Study 1 (CAS-1) experiment that was conducted by NASA Langley Research Center personnel from January through March 2014 and presents partial CAS-1 results. CAS-1 employed 14 air traffic controller volunteers as research subjects to assess the viability of simulated future unmanned aircraft systems (UAS) operating alongside manned aircraft in moderate-density, moderate-complexity Class E airspace. These simulated UAS were equipped with a prototype pilot-in-the-loop (PITL) Detect and Avoid (DAA) system, specifically the Self-Separation (SS) function of such a system based on Stratway+ software to replace the see-and-avoid capabilities of manned aircraft pilots. A quantitative CAS-1 objective was to determine horizontal miss distance (HMD) values for SS encounters that were most acceptable to air traffic controllers, specifically HMD values that were assessed as neither unsafely small nor disruptively large. HMD values between 0.5 and 3.0 nautical miles (nmi) were assessed for a wide array of encounter geometries between UAS and manned aircraft. The paper includes brief introductory material about DAA systems and their SS functions, followed by descriptions of the CAS-1 simulation environment, prototype PITL SS capability, and experiment design, and concludes with presentation and discussion of partial CAS-1 data and results.

  2. Becoming Predictably Adaptable in Software Development

    Directory of Open Access Journals (Sweden)

    Michael Vakoc

    2017-10-01

    It’s difficult to state exact timelines in software development, and it is even more difficult to say when features that users want will be delivered. We propose changes to current software development methodologies that enable companies to be predictably adaptable and deliver both on time and what the customer asked for. We do so through research of current literature, interviews, and personal experience working at an international company that builds products for millions of customers and is facing exactly the challenges described above.

  3. EnTagRec : an enhanced tag recommendation system for software information sites

    NARCIS (Netherlands)

    Wang, S.; Lo, D.; Vasilescu, B.N.; Serebrenik, A.

    2014-01-01

    Software engineers share experiences with modern technologies by means of software information sites, such as STACK OVERFLOW. These sites allow developers to label posted content, referred to as software objects, with short descriptions, known as tags. However, tags assigned to objects tend to be

  4. Thermophysical Property Estimation by Transient Experiments: The Effect of a Biased Initial Temperature Distribution

    Directory of Open Access Journals (Sweden)

    Federico Scarpa

    2015-01-01

    The identification of thermophysical properties of materials in dynamic experiments can be conveniently performed by the inverse solution of the associated heat conduction problem (IHCP). The inverse technique demands knowledge of the initial temperature distribution within the material. As only a limited number of temperature sensors (or no sensor at all) are arranged inside the test specimen, the knowledge of the initial temperature distribution is affected by some uncertainty. This uncertainty, together with other possible sources of bias in the experimental procedure, will propagate in the estimation process, and the accuracy of the reconstructed thermophysical property values could deteriorate. In this work the effect on the estimated thermophysical properties due to errors in the initial temperature distribution is investigated along with a practical method to quantify this effect. Furthermore, a technique for compensating this kind of bias is proposed. The method consists of including the initial temperature distribution among the unknown functions to be estimated. In this way the effect of the initial bias is removed and the accuracy of the identified thermophysical property values is highly improved.
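
    The proposed compensation can be sketched as an augmented least-squares problem in which the unknown vector contains both the thermophysical property and a low-order parameterization of the initial temperature profile (the forward model below is a placeholder; a real IHCP would use a finite-difference or finite-element conduction solver):

      # Estimate thermal diffusivity together with the initial profile.
      import numpy as np
      from scipy.optimize import least_squares

      def simulate(alpha, T0_coeffs, x, t_obs):
          # Toy forward model: polynomial initial profile, decaying in time.
          T0 = np.polyval(T0_coeffs, x)
          return np.array([T0 * np.exp(-alpha * np.pi**2 * t) for t in t_obs])

      def residuals(p, x, t_obs, T_meas):
          alpha, T0_coeffs = p[0], p[1:]
          return (simulate(alpha, T0_coeffs, x, t_obs) - T_meas).ravel()

      # With measured temperatures T_meas at positions x and times t_obs:
      # est = least_squares(residuals, p0, args=(x, t_obs, T_meas))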

  5. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  6. A Practical Introduction to HardwareSoftware Codesign

    CERN Document Server

    Schaumont, Patrick R

    2013-01-01

    This textbook provides an introduction to embedded systems design, with emphasis on integration of custom hardware components with software. The key problem addressed in the book is the following: how can an embedded systems designer strike a balance between flexibility and efficiency? The book describes how combining hardware design with software design leads to a solution to this important computer engineering problem. The book covers four topics in hardware/software codesign: fundamentals, the design space of custom architectures, the hardware/software interface and application examples. The book comes with an associated design environment that helps the reader to perform experiments in hardware/software codesign. Each chapter also includes exercises and further reading suggestions. Improvements in this second edition include labs and examples using modern FPGA environments from Xilinx and Altera, which make the material applicable to a greater number of courses where these tools are already in use.  Mo...

  7. Strategies employed for LHC software performance studies

    CERN Document Server

    Nowak, A

    2010-01-01

    The objective of this work is to collect and assess the software performance related strategies employed by the major players in the LHC software arena: the four main experiments (ALICE, ATLAS, CMS and LHCb) and the two main software frameworks (Geant4 and ROOT). As the software used differs between the parties, so do the directions and methods in optimization, and their intensity. The common feeling shared by nearly all interviewed parties is that performance is not one of their top priorities and that maintaining it at a constant level is a satisfactory solution, given the resources at hand. In principle, despite some organized efforts, a less structured approach seems to be the dominant one, and opportunistic optimization prevails. Four out of six surveyed groups are investigating memory management related effects, deemed to be the primary cause of their performance issues. The most commonly used tools include Valgrind and homegrown software. All questioned groups expressed the desire for advanced tools, s...

  8. Portability and the National Energy Software Center

    International Nuclear Information System (INIS)

    Butler, M.K.

    1978-01-01

    The software portability problem is examined from the viewpoint of experience gained in the operation of a software exchange and information center. First, the factors contributing to the program interchange to date are identified; then major problem areas remaining are noted. The import of the development of programming language and documentation standards is noted, and the program packaging procedures and dissemination practices employed by the Center to facilitate successful software transport are described. Organization, or installation, dependencies of the computing environment, often hidden from the program author, and data interchange complexities are seen as today's primary issues, with dedicated processors and network communications offering an alternative solution

  9. Prototype radiographic system for emergency and intensive care units: Initial experience

    International Nuclear Information System (INIS)

    Mirvis, S.

    1986-01-01

    A prototype radiographic system has been developed for use in bedside examinations in multibed trauma or intensive care units and emergency rooms. The system features a single-phase, high-frequency 30-kW ceiling-mounted generator with an x-ray tube extending from a long counterbalanced arm. All movements are servo-assisted for ease of operation. Based on initial experience, the unit allows easier access to the patient around resuscitation and monitoring equipment, occupies less floor space, and yields better quality images than do standard mobile radiographic units

  10. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    This article is devoted to the problem of software verification. Verification methods are designed to check software for compliance with its stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. The methods vary both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification and pays particular attention to symbolic execution. The review of static analysis discusses the deductive method and model-checking methods, and the pros and cons of each method are emphasized. The article also considers the classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained on different execution paths or when working with multiple object values. Dependences connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), sizes of heap areas, string lengths, and the number of initialized array elements in the code being verified by static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, together with the kinds of tools that support them. Based on this work, a conclusion is drawn describing the most relevant problems of the analysis techniques, methods for their solution and
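
    A tiny example of the kind of dependence such analyses track (illustrative only): the number of initialized array elements must never exceed the array's length. A static analysis tries to prove the invariant on all paths, whereas a dynamic check only tests the paths actually executed:

      # Dependence between n_init (initialized elements) and len(buf).
      def fill(buf, values):
          n_init = 0
          for v in values:
              if n_init >= len(buf):  # guard encodes the dependence
                  break
              buf[n_init] = v
              n_init += 1
              assert n_init <= len(buf)  # dynamic counterpart of the invariant
          return n_init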

  11. Data acquisition and processing software for linear PSD based neutron diffractometers

    International Nuclear Information System (INIS)

    Pande, S.S.; Borkar, S.P.; Ghodgaonkar, M.D.

    2003-01-01

    As part of the data acquisition system for various single- and multi-PSD diffractometers, software was developed to acquire the data and support the requirements of diffraction experiments. The software consists of a front-end Windows 98 application on a PC and a transputer program on the MPSD card. The front-end application provides the entire user interface required for data acquisition, control, presentation, and system setup. Data are acquired and the diffraction spectra generated in the transputer program, which also implements all the required hardware control. The two programs communicate using a device driver named VTRANSPD. The software plays a vital role in customizing and integrating the data acquisition system for various diffractometer setups. The experiments are also effectively automated by the software, which has helped make the best use of available beam time. These and other features of the data acquisition and processing software are presented here. The software is in use with the data acquisition system at several single- and multi-PSD diffractometers. (author)

  12. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality in the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes a software testing concept to be applied to the SMART MMIS software, in terms of testing organization, documentation, procedure, and methods. The testing methods are classified into static analysis of the source code and dynamic testing. Dynamic testing methods are discussed from two aspects: white-box and black-box testing. Applying the software testing concept introduced in this paper to the SMART MMIS software will produce high-quality software. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied

  13. Use of Iodine-based contrast media in digital full-field mammography - initial experience

    International Nuclear Information System (INIS)

    Diekmann, F.; Diekmann, S.; Taupitz, M.; Bick, U.; Winzer, K.-J.; Huettner, C.; Muller, S.; Jeunehomme, F.; Hamm, B.

    2003-01-01

    Aim: To investigate the use of iodine-based contrast media in digital full-field mammography. Methods: After performing initial phantom studies, seven patients underwent digital mammography (Senographe 2000D, GE Medical Systems, Milwaukee, USA) using a specially filtered beam before as well as 60, 120, and 180 seconds after injection of 80 ml of iodine contrast medium (Ultravist 370, Schering AG, Germany). The precontrast mammograms were then subtracted from the postcontrast mammograms and the resulting images compared with a contrast-enhanced dynamic MRI study, performed on all women. Results: Contrast medium accumulation within the tumors was visualized with good quality in all cases. The conditions under which successful contrast-enhanced digital mammography can be performed were determined in phantom studies. Conclusions: Contrast-enhanced digital mammography has potential for improving the visualization of breast tumors in mammography using special beam filtering, adjusted X-ray parameters, proper timing, and suitable subtraction software. (orig.)
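
    The subtraction step itself is conceptually simple; a hedged numpy sketch (registration between the pre- and post-contrast exposures is assumed and omitted here) is:

      # Suppress static tissue so iodine enhancement stands out.
      import numpy as np

      def subtract(pre, post):
          diff = post.astype(np.float64) - pre.astype(np.float64)
          return np.clip(diff, 0, None)  # keep positive uptake only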

  14. Computing for an SSC experiment

    International Nuclear Information System (INIS)

    Gaines, I.

    1993-01-01

    The hardware and software problems for SSC experiments are similar to those faced by present-day experiments but larger in scale. In particular, the Solenoidal Detector Collaboration (SDC) anticipates the need for close to 10**6 MIPS of off-line computing and will produce several Petabytes (10**15 bytes) of data per year. Software contributions will be made by large numbers of highly geographically dispersed physicists. Hardware and software architectures to meet these needs have been designed. Providing the requisite amount of computing power and providing tools to allow cooperative software development using extensions of existing techniques look achievable. The major challenges will be to provide efficient methods of accessing and manipulating the enormous quantities of data that will be produced at the SSC, and to enforce the use of software engineering tools that will ensure the "correctness" of experiment-critical software

  15. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and with frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in a software only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  16. Validating new software for semiautomated liver volumetry. Better than manual measurement?

    Energy Technology Data Exchange (ETDEWEB)

    Noschinski, L.E.; Maiwald, B.; Voigt, P.; Kahn, T.; Stumpp, P. [University Hospital Leipzig (Germany). Dept. of Diagnostic and Interventional Radiology; Wiltberger, G. [University Hospital Leipzig (Germany). Dept. of Visceral, Transplantation, Thoracic and Vascular Surgery

    2015-09-15

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of liver volume with manual assistance for definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation to water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33 % vs. 57 %, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience.
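
    Whether the segmentation is manual or semiautomated, CT volumetry ultimately reduces to counting segmented voxels. A minimal sketch (a hypothetical helper, not the tested software) is:

      # Convert a 3D segmentation mask into a volume in millilitres.
      import numpy as np

      def volume_ml(mask, spacing_mm):
          # mask: boolean array; spacing_mm: (x, y, z) voxel edge lengths.
          voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
          return mask.sum() * voxel_mm3 / 1000.0  # 1 ml = 1000 mm^3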

  17. Validating new software for semiautomated liver volumetry. Better than manual measurement?

    International Nuclear Information System (INIS)

    Noschinski, L.E.; Maiwald, B.; Voigt, P.; Kahn, T.; Stumpp, P.; Wiltberger, G.

    2015-01-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB-approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the part of the liver to be resected. Two different types of software were used: 1) manual method: the borders of the liver had to be defined slice by slice by the user; 2) semiautomated software: automatic identification of the liver volume with manual assistance for the definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation with water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results, and is less dependent on the user's experience.

  18. UFMulti: A new parallel processing software system for HEP

    Science.gov (United States)

    Avery, Paul; White, Andrew

    1989-12-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.
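
    UFMulti itself is infrastructure for VAX/VMS-era machines; the following is only a sketch, in modern Python, of the master/worker event-farming pattern such a system implements across networked nodes. Here the "workstations" are local processes and process_event is a hypothetical stand-in for a real simulation or analysis kernel.

      from multiprocessing import Pool

      def process_event(event_id):
          # Hypothetical stand-in for physics/detector simulation or
          # DST analysis of a single event.
          energy = (event_id * 37 % 101) / 10.0  # fake "reconstructed energy"
          return event_id, energy

      if __name__ == "__main__":
          event_ids = range(1000)
          with Pool(processes=4) as pool:  # one worker per "node"
              results = pool.map(process_event, event_ids)
          total = sum(energy for _, energy in results)
          print(f"processed {len(results)} events, summed energy = {total:.1f}")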

  19. UFMULTI: A new parallel processing software system for HEP

    International Nuclear Information System (INIS)

    Avery, P.; White, A.

    1989-01-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future. (orig.)

  20. Computer software to assess weld thickness loss in offshore pipelines: PEDS

    Energy Technology Data Exchange (ETDEWEB)

    Germano, Andre Luiz Silva; Correa, Samanda Cristine Arruda [Centro Universitario Estadual da Zona Oeste (CCMAT/UEZO), Rio de Janeiro, RJ (Brazil)], e-mail: scorrea@nuclear.ufrj.br; Souza, Edmilson Monteiro de; Silva, Ademir Xavier da; Lopes, Ricardo Tadeu [Programa de Engenharia Nuclear, COPPE, Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil)], e-mails: emonteiro@nuclear.ufrj.br, ademir@nuclear.ufrj.br, ricardo@lin.ufrj.br

    2010-07-01

    The purpose of this work is to present an initial overview of a software package named PEDS for assessing weld thickness loss in offshore pipelines through digital radiography. The software calculates the thickness loss using a data bank obtained from computational modeling based on the Monte Carlo code MCNPX. In order to give users more flexibility, the software was written in Java, which allows it to run on Linux, Mac OS X and Windows. Furthermore, tools are provided for image display, for selecting and analyzing specific areas of the image (mean value, selection area), and for generating profile plots. Applications of this software in the offshore area are presented. (author)
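
    As a minimal sketch of the two operations described above (PEDS itself is written in Java; this illustration is Python), the following extracts a gray-value profile across a weld in a digital radiograph and converts gray values to thickness through a hypothetical calibration of the kind an MCNPX-generated data bank would provide.

      import numpy as np

      def line_profile(image, row):
          """Gray-value profile along one row of the radiograph."""
          return image[row, :].astype(float)

      def thickness_from_gray(gray, i0, mu):
          """Invert the Beer-Lambert law I = I0 * exp(-mu * t) for thickness t.

          i0 and mu are hypothetical calibration constants; in a real tool
          they would come from Monte Carlo-simulated reference exposures.
          """
          return -np.log(np.clip(gray, 1.0, None) / i0) / mu

      # Synthetic 100x100 radiograph: a brighter band where the wall is thinner.
      image = np.full((100, 100), 120.0)
      image[:, 40:60] = 160.0  # simulated wall-thickness loss at the weld

      profile = line_profile(image, row=50)
      thickness = thickness_from_gray(profile, i0=255.0, mu=0.5)  # thickness in cm
      loss = thickness.max() - thickness.min()
      print(f"estimated thickness loss across the weld: {loss:.2f} cm")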