WorldWideScience

Sample records for software capabilities offered

  1. An Interoperability Framework and Capability Profiling for Manufacturing Software

    Science.gov (United States)

    Matsuda, M.; Arai, E.; Nakano, N.; Wakai, H.; Takeda, H.; Takata, M.; Sasaki, H.

    ISO/TC184/SC5/WG4 is working on ISO16100: Manufacturing software capability profiling for interoperability. This paper reports on a manufacturing software interoperability framework and a capability profiling methodology which were proposed and developed through this international standardization activity. Within the context of manufacturing application, a manufacturing software unit is considered to be capable of performing a specific set of functions defined by a manufacturing software system architecture. A manufacturing software interoperability framework consists of a set of elements and rules for describing the capability of software units to support the requirements of a manufacturing application. The capability profiling methodology makes use of the domain-specific attributes and methods associated with each specific software unit to describe capability profiles in terms of unit name, manufacturing functions, and other needed class properties. In this methodology, manufacturing software requirements are expressed in terms of software unit capability profiles.
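
    As an illustration of the profiling idea only (not the normative ISO 16100 template), a capability profile and a requirement match could be sketched as a small data structure; the unit name, function names, and properties below are hypothetical:

        # Minimal sketch of a capability profile; field and unit names are
        # hypothetical, and the ISO 16100 templates are far richer.
        from dataclasses import dataclass, field
        from typing import Dict, List

        @dataclass
        class CapabilityProfile:
            unit_name: str                      # name of the software unit
            manufacturing_functions: List[str]  # functions the unit can perform
            class_properties: Dict[str, str] = field(default_factory=dict)

            def satisfies(self, required: List[str]) -> bool:
                """True if this unit's profile covers the required functions."""
                return set(required).issubset(self.manufacturing_functions)

        scheduler = CapabilityProfile(
            unit_name="ShopFloorScheduler",
            manufacturing_functions=["job_sequencing", "resource_allocation"],
            class_properties={"vendor": "ExampleCo", "version": "1.2"},
        )
        print(scheduler.satisfies(["job_sequencing"]))  # True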

  2. Dynamic Capabilities and Project Management in Small Software Companies

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob; Nielsen, Peter Axel; Persson, John Stouby

    2017-01-01

    A small software company depends on its capability to adapt to rapid technological and other changes in its environment—its dynamic capabilities. In this paper, we argue that to evolve and maintain its dynamic capabilities a small software company must pay attention to the interaction between … dynamic capabilities at different levels of the company — particularly between the project management and the company levels. We present a case study of a small software company and show how successful dynamic capabilities at the company level can affect project management in small software companies…

  3. Improving the Agency's Software Acquisition Capability

    Science.gov (United States)

    Hankinson, Allen

    2003-01-01

    External development of software has often led to unsatisfactory results and great frustration for the assurance community. Contracts frequently omit critical assurance processes or the right to oversee software development activities. At a time when NASA depends more and more on software to implement critical system functions, a combination of three factors exacerbates this problem: 1) the ever-increasing trend to acquire rather than develop software in-house, 2) the trend toward performance-based contracts, and 3) acquisition vehicles that only state software requirements while leaving development standards and assurance methodologies up to the contractor. We propose to identify specific methods and tools that NASA projects can use to mitigate the adverse effects of the three problems. Two broad classes of methods/tools will be explored. The first will be those that provide NASA projects with insight and oversight into contractors' activities. The second will be those that help projects objectively assess, and thus improve, their software acquisition capability. Of particular interest is the Software Engineering Institute's (SEI) Software Acquisition Capability Maturity Model (SA-CMM).

  4. Combining Capability Assessment and Value Engineering: a New Two-dimensional Method for Software Process Improvement

    Directory of Open Access Journals (Sweden)

    Pasi Ojala

    2008-02-01

    During the last decades software process improvement (SPI) has been recognized as a usable possibility to increase the quality of software development. Implemented SPI investments have often indicated increased process capabilities as well. Recently more attention has been focused on the costs of SPI as well as on the cost-effectiveness and productivity of software development, although the roots of economic-driven software engineering originate from the very early days of software engineering research. This research combines Value Engineering and capability assessment into a usable new method in order to better respond to the challenges that cost-effectiveness and productivity have brought to software companies. This is done in part by defining the concepts of value, worth and cost and in part by defining the Value Engineering process and the different enhancements it offers to software assessment. The practical industrial cases show that the proposed two-dimensional method works in practice and is useful to the assessed companies.

  5. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating systems and hardware dependencies. One past approach to preserve computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserve computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This future-forward dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.
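
    As a purely illustrative sketch of the metadata side of this approach (all field names, identifiers, and file names below are hypothetical), a preservation record might bundle the software, its technology stack, and the virtualization artifacts that freeze that stack:

        # Hypothetical preservation-metadata record for an archived software system.
        preservation_record = {
            "software": {
                "name": "orbit-analysis-tool",           # made-up package name
                "version": "2.3.1",
                "source_archive": "orbit-analysis-tool-2.3.1.tar.gz",
            },
            "technology_stack": {                         # what is needed to rebuild or rerun it
                "operating_system": "Debian 7 (x86_64)",
                "runtime": ["python 2.7", "numpy 1.6"],
            },
            "preserved_images": {                         # artifacts that freeze the stack
                "virtual_machine": "orbit-tool-vm.qcow2",
                "container": "orbit-tool-2.3.1.tar",
            },
            "linked_datasets": ["doi:10.0000/example-dataset"],  # placeholder identifier
        }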

  6. Software-as-a-Service Offer Differentiation by Business Unit

    Directory of Open Access Journals (Sweden)

    Islam Balbaa

    2011-01-01

    This article summarizes the author's recent research into the fit between software-as-a-service (SaaS) tools and the requirements of particular business units. First, an overview of SaaS is provided, including a summary of its benefits to users and software vendors. Next, the approach used to gather and analyze data about the SaaS solutions offered on the Force.com AppExchange is outlined. Finally, the article describes the managerial implications of this research.

  7. Distinctive Innovation Capabilities of Argentine Software Companies with High Innovation Results and Impacts

    OpenAIRE

    María Isabel Camio; María del Carmen Romero; María Belén Álvarez; Alfredo José Rébori

    2018-01-01

    The software sector is of growing importance and, due to its degree of dynamism, the identification of capabilities for innovation is vital. This study identifies capabilities variables that distinguish Argentine software companies with high innovation results and high innovation impacts from those with lesser results and impacts. It is applied to a sample of 103 companies, a measurement model and the component variables of an innovation degree index for software companies (INIs) formulated i...

  8. Distinctive Innovation Capabilities of Argentine Software Companies with High Innovation Results and Impacts

    Directory of Open Access Journals (Sweden)

    María Isabel Camio

    2018-04-01

    The software sector is of growing importance and, due to its degree of dynamism, the identification of capabilities for innovation is vital. This study identifies capabilities variables that distinguish Argentine software companies with high innovation results and high innovation impacts from those with lesser results and impacts. It is applied to a sample of 103 companies, using a measurement model and the component variables of an innovation degree index for software companies (INIs) formulated in previous studies. A Principal Component Analysis and a biplot are conducted. In the analysis of results and impacts, the first two components explain 100% of the variability, which shows the high correlation between variables. From the biplots, it appears that companies with high results have higher degrees in the variables of motivation, strategy, leadership and internal determinants; and those with high impacts present higher degrees of structure, strategy, leadership, free software and innovation activities. The findings add elements to the theory of capabilities for innovation in the software sector and allow us to consider the relative importance of different capabilities variables in the generation of innovation results and impacts.
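
    A minimal sketch of the analysis pattern described (standardize, run a two-component PCA, and draw a biplot of company scores and variable loadings); the data here are synthetic and the variable list is only a subset of the study's, so nothing below reproduces the actual results:

        import numpy as np
        import matplotlib.pyplot as plt
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        variables = ["motivation", "strategy", "leadership", "structure", "free_software"]
        X = rng.normal(size=(103, len(variables)))      # stand-in for the 103 companies

        Z = StandardScaler().fit_transform(X)
        pca = PCA(n_components=2).fit(Z)
        scores = pca.transform(Z)                       # company coordinates
        loadings = pca.components_.T                    # variable directions

        fig, ax = plt.subplots()
        ax.scatter(scores[:, 0], scores[:, 1], s=10, alpha=0.5)
        for (dx, dy), name in zip(loadings, variables):  # biplot arrows for each variable
            ax.arrow(0, 0, dx, dy, color="red", head_width=0.03)
            ax.annotate(name, (dx, dy))
        ax.set_xlabel(f"PC1 ({pca.explained_variance_ratio_[0]:.0%})")
        ax.set_ylabel(f"PC2 ({pca.explained_variance_ratio_[1]:.0%})")
        plt.show()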

  9. SSE software test management STM capability: Using STM in the Ground Systems Development Environment (GSDE)

    Science.gov (United States)

    Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo

    1992-01-01

    This report is one of a series discussing configuration management (CM) topics for Space Station ground systems software development. It provides a description of the Software Support Environment (SSE)-developed Software Test Management (STM) capability, and discusses the possible use of this capability for management of developed software during testing performed on target platforms. This is intended to supplement the formal documentation of STM provided by the SSE Project. How STM can be used to integrate contractor CM and formal CM for software before delivery to operations is described. STM provides a level of control that is flexible enough to support integration and debugging, but sufficiently rigorous to ensure the integrity of the testing process.

  10. The Capabilities of the Offshore Middlemen

    DEFF Research Database (Denmark)

    Mahnke, Volker; Wareham, Jonathan

    preliminary theoretical justification for the emergence of offshore intermediaries; describe how and why they develop intermediation capabilities; and offer initial evidence substantiating their function and processes in intermediating transnational offshoring relationships in software development...

  11. New Modelling Capabilities in Commercial Software for High-Gain Antennas

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Lumholt, Michael; Meincke, Peter

    2012-01-01

    characterization of the reflectarray element, an initial phase-only synthesis, followed by a full optimization procedure taking into account the near-field from the feed and the finite extent of the array. Another interesting new modelling capability is made available through the DIATOOL software, which is a new … type of EM software tool aimed at extending the ways engineers can use antenna measurements in the antenna design process. The tool allows reconstruction of currents and near fields on a 3D surface conformal to the antenna, by using the measured antenna field as input. The currents on the antenna … surface can provide valuable information about the antenna performance, or undesired contributions, e.g. currents on a cable, can be artificially removed. Finally, the CHAMP software will be extended to cover reflector shaping and more complex materials, which combined with a much faster execution speed …

  12. Core Logistics Capability Policy Applied to USAF Combat Aircraft Avionics Software: A Systems Engineering Analysis

    Science.gov (United States)

    2010-06-01

    cannot make a distinction between software maintenance and development” (Sharma, 2004). ISO/IEC 12207 Software Lifecycle Processes offers a guide to … synopsis of ISO/IEC 12207, Raghu Singh of the Federal Aviation Administration states “Whenever a software product needs modifications, the development … Corporation. Singh, R. (1998). International Standard ISO/IEC 12207 Software Life Cycle Processes. Washington: Federal Aviation Administration. The Joint

  13. The capabilities and applications of the saphire 5.0 safety assessment software

    International Nuclear Information System (INIS)

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J.

    1994-01-01

    The System Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. The programs in this suite include: Models and Results Data Base (MAR-D) software, Integrated Reliability and Risk Analysis System (IRRAS) software, System Analysis and Risk Assessment (SARA) software, and Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. Each of these programs performs a specific function in taking a PRA from the conceptual stage all the way to publication. This paper provides an overview of the features and capabilities provided in version 5.0 of this software system. Some major new features include the ability to store unlimited cut sets, the ability to perform location transformations, the ability to perform seismic analysis, the ability to perform automated rule-based recovery analysis and end state cut set partitioning, the ability to perform end state analysis, a new alphanumeric fault tree editor, and a new alphanumeric event tree editor. Many enhancements and improvements to the user interface, as well as a significant reduction in the time required to perform an analysis, are included in version 5.0. These new features and capabilities provide a powerful set of PC-based PRA analysis tools.
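
    To make the cut set idea concrete, here is a toy sketch of generating and minimizing cut sets for a small AND/OR fault tree; it is illustrative only and does not reflect SAPHIRE's actual algorithms or data model (the gate and event names are invented):

        from itertools import product

        def cut_sets(gate, tree):
            """Return cut sets (frozensets of basic events) for a gate."""
            if gate not in tree:                  # basic event
                return [frozenset([gate])]
            kind, inputs = tree[gate]
            child_sets = [cut_sets(g, tree) for g in inputs]
            if kind == "OR":                      # union of the children's cut sets
                return [cs for sets in child_sets for cs in sets]
            # AND: combine one cut set from each child
            return [frozenset().union(*combo) for combo in product(*child_sets)]

        def minimize(sets):
            """Keep only minimal cut sets (drop supersets of other cut sets)."""
            return [s for s in sets if not any(o < s for o in sets)]

        # TOP fails if the pump fails, or if both valve and operator fail (hypothetical events).
        tree = {"TOP": ("OR", ["PUMP", "G1"]), "G1": ("AND", ["VALVE", "OPERATOR"])}
        print(minimize(set(cut_sets("TOP", tree))))   # two minimal cut sets: {PUMP}, {VALVE, OPERATOR}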

  14. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means it is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
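
    A minimal sketch of the weighing-and-totalizing step (weighted aggregation of per-dimension scores for competing designs); the dimension names, scores, and weights below are hypothetical, not those of the method's actual goal tree:

        def aggregate(scores: dict, weights: dict) -> float:
            """Weighted sum of per-dimension scores, normalized by total weight."""
            total = sum(weights.values())
            return sum(scores[d] * weights[d] for d in scores) / total

        weights  = {"communication": 0.5, "interfaces": 0.3, "software": 0.2}  # enterprise priorities
        design_a = {"communication": 0.8, "interfaces": 0.6, "software": 0.7}
        design_b = {"communication": 0.5, "interfaces": 0.9, "software": 0.6}

        print(aggregate(design_a, weights), aggregate(design_b, weights))  # compare alternatives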

  15. Proposing a Qualitative Approach for Corporate Competitive Capability Modeling in High-Tech Business (Case study: Software Industry)

    Directory of Open Access Journals (Sweden)

    Mahmoud Saremi Saremi

    2010-09-01

    The evolution of the global business trend for ICT-based products in recent decades shows the intensive activity of pioneer developing countries to gain a powerful competitive position in the global software industry. In this research, with regard to the importance of the competition issue for top managers of Iranian software companies, a conceptual model has been developed for the Corporate Competitive Capability concept. First, after describing the research problem, we present a comparative review of recent theories of the firm and competition that have been applied by different researchers in the high-tech and knowledge-intensive organization field. Afterwards, with a detailed review of the literature and previous research papers, an initial research framework and applied research method are proposed. The main and final section of the paper is assigned to describing the results of the research in the different steps of the qualitative modeling process. The agreed concepts related to corporate competitive capability, the elicited and analyzed expert cause maps, the elicited collective causal maps, and the final proposed model for the software industry are the modeling results of this paper.

  16. Archival standards in archival open access software and offering appropriate software for internal archival centers

    Directory of Open Access Journals (Sweden)

    Abdolreza Izadi

    2016-12-01

    The purpose of this study is to examine descriptive metadata standards in archival open source software, to determine the most appropriate descriptive metadata standard(s), and to identify which software best supports these standards. The approach of the present study is mixed: library research, the Delphi method, and a descriptive survey are used. The data gathering instrument in the library study is the fiche, in the Delphi method the questionnaire, and in the descriptive survey the checklist. The statistical population contains 5 archival open source software packages. The findings suggest that 5 metadata standards, consisting of EAD, ISAD, EAC-CPF, ISAAR and ISDF, were diagnosed by the Delphi panel members as the most appropriate descriptive metadata standards to use for archival software. Moreover, ICA-AtoM and Archivists' Toolkit, in terms of their support for these standards, were diagnosed as the most appropriate archival software.

  17. Between Innovation and Governance: The Case of Research-based Software Development in a Large Petroleum Company

    OpenAIRE

    Seifvand, Atiyeh

    2012-01-01

    Software innovations can offer organizations competitive advantages. Research and development entities within the petroleum industry therefore seek to utilize IT capabilities to produce innovative software. Many factors may influence the success or failure of developing and implementing research-based software innovations in organizations. Among these issues, the relation between software innovation and IT governance remains largely unexplored in the research literature. This study explores t...

  18. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    Science.gov (United States)

    Seo, Jiwon; Chen, Yu-Hsuan; De Lorenzo, David S.; Lo, Sherman; Enge, Per; Akos, Dennis; Lee, Jiyun

    2011-01-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities. PMID:22164116
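
    For readers unfamiliar with adaptive beamsteering, the following is a minimal, generic sketch of computing minimum-variance (MVDR-style) weights for a four-element array from sample snapshots; it is not the paper's GPU implementation, and the signal model and loading factor are assumptions:

        import numpy as np

        def mvdr_weights(snapshots: np.ndarray, steering: np.ndarray) -> np.ndarray:
            """snapshots: (elements, samples) complex baseband; steering: (elements,) vector."""
            R = snapshots @ snapshots.conj().T / snapshots.shape[1]                 # sample covariance
            R += 1e-3 * (np.trace(R).real / R.shape[0]) * np.eye(R.shape[0])        # diagonal loading
            w = np.linalg.solve(R, steering)                                        # R^-1 a
            return w / (steering.conj() @ w)                                        # unit gain toward the satellite

        rng = np.random.default_rng(1)
        x = rng.normal(size=(4, 1000)) + 1j * rng.normal(size=(4, 1000))  # stand-in snapshots
        a = np.ones(4, dtype=complex)                                     # boresight steering vector
        w = mvdr_weights(x, a)
        y = w.conj() @ x                                                  # beamformed output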

  19. A Real-Time Capable Software-Defined Receiver Using GPU for Adaptive Anti-Jam GPS Sensors

    Directory of Open Access Journals (Sweden)

    Dennis Akos

    2011-09-01

    Due to their weak received signal power, Global Positioning System (GPS) signals are vulnerable to radio frequency interference. Adaptive beam and null steering of the gain pattern of a GPS antenna array can significantly increase the resistance of GPS sensors to signal interference and jamming. Since adaptive array processing requires intensive computational power, beamsteering GPS receivers were usually implemented using hardware such as field-programmable gate arrays (FPGAs). However, a software implementation using general-purpose processors is much more desirable because of its flexibility and cost effectiveness. This paper presents a GPS software-defined radio (SDR) with adaptive beamsteering capability for anti-jam applications. The GPS SDR design is based on an optimized desktop parallel processing architecture using a quad-core Central Processing Unit (CPU) coupled with a new generation Graphics Processing Unit (GPU) having massively parallel processors. This GPS SDR demonstrates sufficient computational capability to support a four-element antenna array and future GPS L5 signal processing in real time. After providing the details of our design and optimization schemes for future GPU-based GPS SDR developments, the jamming resistance of our GPS SDR under synthetic wideband jamming is presented. Since the GPS SDR uses commercial-off-the-shelf hardware and processors, it can be easily adopted in civil GPS applications requiring anti-jam capabilities.

  20. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautical and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (Very High Speed Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant to NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc. and the interconnections among

  1. Building IT capability in health-care organizations.

    Science.gov (United States)

    Khatri, Naresh

    2006-05-01

    While computer technology has revolutionized industries such as banking and airlines, it has done little for health care so far. Most of the health-care organizations continue the early-computer-era practice of buying the latest technology without knowing how it might effectively be employed in achieving business goals. By investing merely in information technology (IT) rather than in IT capabilities they acquire IT components--primarily hardware, software, and vendor-provided services--which they do not understand and, as a result, are not capable of fully utilizing for achieving organizational objectives. In the absence of internal IT capabilities, health-care organizations have relied heavily on the fragmented IT vendor market in which vendors do not offer an open architecture, and are unwilling to offer electronic interfaces that would make their 'closed' systems compatible with those of other vendors. They are hamstrung as a result because they have implemented so many different technologies and databases that information stays in silos. Health systems can meet this challenge by developing internal IT capabilities that would allow them to seamlessly integrate clinical and business IT systems and develop innovative uses of IT. This paper develops a comprehensive conception of IT capability grounded in the resource-based theory of the firm as a remedy to the woes of IT investments in health care.

  2. SCaN Testbed Software Development and Lessons Learned

    Science.gov (United States)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but required a new approach to effectively control them and flow data. This requires extensive software to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicates the effort. The effort also includes ground software, which is a key element for both the command of the payload, and displaying data created by the payload. The verification of

  3. CASL Dakota Capabilities Summary

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simmons, Chris [Univ. of Texas, Austin, TX (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  4. People Capability Maturity Model. SM.

    Science.gov (United States)

    1995-09-01

    Keyword-in-context snippets from the People Capability Maturity Model report (CMU/SEI-95-MM-02, Carnegie Mellon University Software Engineering Institute, 1995): the assessment can be "tailored so it consumes less time and resources than a traditional software process assessment"; benefits include "improved reputation or customer loyalty"; Coaching appears as a Level 5 (Optimizing) practice.

  5. The Expanded Capabilities Of The Cementitious Barriers Partnership Software Toolbox Version 2.0 - 14331

    Energy Technology Data Exchange (ETDEWEB)

    Burns, Heather; Flach, Greg; Smith, Frank; Langton, Christine; Brown, Kevin; Kosson, David; Samson, Eric; Mallick, Pramod

    2014-01-10

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. The CBP Software Toolbox – “Version 1.0” was released early in FY2013 and was used to support DOE-EM performance assessments in evaluating various degradation mechanisms that included sulfate attack, carbonation and constituent leaching. The sulfate attack analysis predicted the extent and damage that sulfate ingress will have on concrete vaults over extended time (i.e., > 1000 years) and the carbonation analysis provided concrete degradation predictions from rebar corrosion. The new release “Version 2.0” includes upgraded carbonation software and a new software module to evaluate degradation due to chloride attack. Also included in the newer version is a dual regime module allowing evaluation of contaminant release in two regimes – both fractured and un-fractured. The integrated software package has also been upgraded with new plotting capabilities and many other features that increase the “user-friendliness” of the package. Experimental work has been generated to provide data to calibrate the models to improve the credibility of the analysis and reduce the uncertainty. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to or longer than 100 years for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox produces, and will continue to produce, tangible benefits to the working DOE

  6. Software for Engineering Simulations of a Spacecraft

    Science.gov (United States)

    Shireman, Kirk; McSwain, Gene; McCormick, Bernell; Fardelos, Panayiotis

    2005-01-01

    Spacecraft Engineering Simulation II (SES II) is a C-language computer program for simulating diverse aspects of operation of a spacecraft characterized by either three or six degrees of freedom. A functional model in SES can include a trajectory flight plan; a submodel of a flight computer running navigational and flight-control software; and submodels of the environment, the dynamics of the spacecraft, and sensor inputs and outputs. SES II features a modular, object-oriented programming style. SES II supports event-based simulations, which, in turn, create an easily adaptable simulation environment in which many different types of trajectories can be simulated by use of the same software. The simulation output consists largely of flight data. SES II can be used to perform optimization and Monte Carlo dispersion simulations. It can also be used to perform simulations for multiple spacecraft. In addition to its generic simulation capabilities, SES offers special capabilities for space-shuttle simulations: for this purpose, it incorporates submodels of the space-shuttle dynamics and a C-language version of the guidance, navigation, and control components of the space-shuttle flight software.

  7. PROGRAMS WITH DATA MINING CAPABILITIES

    Directory of Open Access Journals (Sweden)

    Ciobanu Dumitru

    2012-03-01

    The fact that the Internet has become a commodity in the world has created a framework for a new economy. Traditional businesses migrate to this new environment that offers many features and options at relatively low prices. However, competitiveness is fierce and a successful Internet business is tied to rigorous use of all available information. The information is often hidden in data, and for its retrieval it is necessary to use software capable of applying data mining algorithms and techniques. In this paper we want to review some of the programs with data mining capabilities currently available in this area. We also propose some classifications of this software to assist those who wish to use such software.

  8. Improving Chemistry Education by Offering Salient Technology Training to Preservice Teachers: A Graduate-Level Course on Using Software to Teach Chemistry

    Science.gov (United States)

    Tofan, Daniel C.

    2009-01-01

    This paper describes an upper-level undergraduate and graduate-level course on computers in chemical education that was developed and offered for the first time in Fall 2007. The course provides future chemistry teachers with exposure to current software tools that can improve productivity in teaching, curriculum development, and education…

  9. A software radio platform based on ARM and FPGA

    Directory of Open Access Journals (Sweden)

    Yang Xin.

    2016-01-01

    The rapid rise in computational performance offered by computer systems has greatly increased the number of practical software radio applications. The scheme presented in this paper is a software radio platform based on ARM and FPGA. The FPGA works as the coprocessor together with the ARM, which serves as the core processor. The ARM is used for digital signal processing and real-time data transmission, and the FPGA is used for synchronous timing control and serial-parallel conversion. An SPI driver for real-time data transmission between the ARM and FPGA under an ARM-Linux system is provided. By adopting a modular design, the software radio platform is capable of implementing wireless communication functions and satisfies the requirements of a real-time signal processing platform for high security and broad applicability.
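
    As an illustration of what user-space data exchange over SPI can look like on an ARM-Linux board (using the py-spidev library; the bus/device numbers, clock rate, and framing below are assumptions, not the platform's actual kernel driver):

        import spidev

        spi = spidev.SpiDev()
        spi.open(0, 0)                   # bus 0, chip-select 0 (assumed wiring)
        spi.max_speed_hz = 10_000_000
        spi.mode = 0

        def exchange(frame: bytes) -> list:
            """Send a frame to the FPGA and read back the same number of bytes."""
            return spi.xfer2(list(frame))

        rx = exchange(bytes(64))         # 64-byte dummy frame; real framing is device-specific
        spi.close()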

  10. A new curimeter with advanced software capabilities

    International Nuclear Information System (INIS)

    Arista Romeo, Eduardo J.; Toledo Acosta, Rene B.; Dotres Lleras, Armando

    2001-01-01

    A new curimeter, model CD-N102, developed at CEADEN is described. Emphasis is made on the description of the hardware and basic low-level software of the device. Attention is paid to metrological and quality assurance aspects of the problem, as this device is destined to complete the Nuclear Medicine Modules at different hospitals in the country. The characteristics obtained are mentioned. A full block schema is presented and key blocks are discussed in more detail. The utilization of software resources and methods for achieving the aforementioned characteristics without further increasing the complexity of the analog part of the device is also discussed, along with the trade-offs between the different characteristics involved and the final decisions taken at the design stage.

  11. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard a software product should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good quality software. Quality is divided into three main layers which are Software Quality Assurance (SQA), Software Qu...

  12. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

    We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of the emerging trajectory data. It offers data management capability and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  13. Agility in a small software firm

    DEFF Research Database (Denmark)

    Schmidt, Thomas; Mathiassen, Lars

    2009-01-01

    Small software firms are vulnerable to environmental uncertainty. While agile methods and other technologies offer suggestions to this challenge, we know little about how these firms combine project and firm level capabilities to effectively respond to changes. On this backdrop, we examine a small … Danish software firm, TeachTech Inc., through the lens of Haeckel's sense-and-respond approach. Our analysis suggests that: the firm has appropriate sense-and-respond cycles, but improving process modularity and human resource flexibility could increase its ability to respond faster and more effectively …; the firm focuses on specific business goals, but these are not clearly explicated and expressed as empowering governing values enabling a quick and coordinated response; complex and demanding challenges are related to dynamically reassigning commitments and the supporting mechanisms are insufficient …

  14. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  15. WLS software for the Los Alamos geophysical instrumentation truck

    International Nuclear Information System (INIS)

    Ideker, C.D.; LaDelfe, C.M.

    1985-01-01

    Los Alamos National Laboratory's capabilities for special downhole geophysical well logging have increased steadily over the past few years. Software was developed originally for each individual tool as it became operational. With little or no standardization for tool software modules, software development became redundant, time consuming, and cost ineffective. With long-term use and the rapid evolution of well logging capacity in mind, Los Alamos and EG&G personnel decided to purchase a software system. The system was designed to offer: wide-range use and programming flexibility; standardized subroutines for tool module development; user-friendly operation which would reduce training time; operator error checking and alarm activation; maximum growth capacity for new tools as they are added to the inventory; and the ability to incorporate changes made to the computer operating system and hardware. The end result is a sophisticated and flexible software tool for transferring downhole geophysical measurement data to computer disk files. This paper outlines the need, design, development, and implementation of the WLS software for geophysical data acquisition. A demonstration and working examples are included in the presentation.

  16. Software engineering capability for Ada (GRASP/Ada Tool)

    Science.gov (United States)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional pretty-printed Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.

  17. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  18. Variability in Multi-Tenant Enterprise Software

    OpenAIRE

    Kabbedijk, J.

    2014-01-01

    Enterprise software applications have changed significantly over the last decades. Increasingly, software is deployed in a central location and accessed through the internet, instead of being installed at end-user sites. Having software in a central location enables multi-tenancy, where multiple customers transparently share a system’s resources. Currently, multi-tenancy is a popular way to offer the functionality of a software product through the internet to numerous customers, offering many ad...

  19. A Demonstration of the System Assessment Capability (SAC) Rev. 1 Software for the Hanford Remediation Assessment Project

    International Nuclear Information System (INIS)

    Eslinger, Paul W.; Kincaid, Charles T.; Nichols, William E.; Wurstner, Signe K.

    2006-01-01

    The System Assessment Capability (SAC) is a suite of interrelated computer codes that provides the capability to conduct large-scale environmental assessments on the Hanford Site. Developed by Pacific Northwest National Laboratory for the Department of Energy, SAC models the fate and transport of radioactive and chemical contaminants, starting with the inventory of those contaminants in waste sites, simulating transport through the environment, and continuing on through impacts to the environment and humans. Separate modules in the SAC address inventory, release from waste forms, water flow and mass transport in the vadose zone, water flow and mass transport in the groundwater, water flow and mass transport in the Columbia River, air transport, and human and ecological impacts. The SAC supports deterministic analyses as well as stochastic analyses using a Monte Carlo approach, enabling SAC users to examine the effect of uncertainties in a number of key parameters. The initial assessment performed with the SAC software identified a number of areas where both the software and the analysis approach could be improved. Since that time the following six major software upgrades have been made: (1) An air pathway model was added to support all-pathway analyses. (2) Models for releases from glass waste forms, buried graphite reactor cores, and buried naval reactor compartments were added. (3) An air-water dual-phase model was added to more accurately track the movement of volatile contaminants in the vadose zone. (4) The ability to run analyses was extended from 1,000 years to 10,000 years or longer after site closure. (5) The vadose zone flow and transport model was upgraded to support two-dimensional or three-dimensional analyses. (6) The ecological model and human risk models were upgraded so the concentrations of contaminants in food products consumed by humans are produced by the ecological model. This report documents the functions in the SAC software and provides a

  20. A Demonstration of the System Assessment Capability (SAC) Rev. 1 Software for the Hanford Remediation Assessment Project

    Energy Technology Data Exchange (ETDEWEB)

    Eslinger, Paul W.; Kincaid, Charles T.; Nichols, William E.; Wurstner, Signe K.

    2006-11-06

    The System Assessment Capability (SAC) is a suite of interrelated computer codes that provides the capability to conduct large-scale environmental assessments on the Hanford Site. Developed by Pacific Northwest National Laboratory for the Department of Energy, SAC models the fate and transport of radioactive and chemical contaminants, starting with the inventory of those contaminants in waste sites, simulating transport through the environment, and continuing on through impacts to the environment and humans. Separate modules in the SAC address inventory, release from waste forms, water flow and mass transport in the vadose zone, water flow and mass transport in the groundwater, water flow and mass transport in the Columbia River, air transport, and human and ecological impacts. The SAC supports deterministic analyses as well as stochastic analyses using a Monte Carlo approach, enabling SAC users to examine the effect of uncertainties in a number of key parameters. The initial assessment performed with the SAC software identified a number of areas where both the software and the analysis approach could be improved. Since that time the following six major software upgrades have been made: (1) An air pathway model was added to support all-pathway analyses. (2) Models for releases from glass waste forms, buried graphite reactor cores, and buried naval reactor compartments were added. (3) An air-water dual-phase model was added to more accurately track the movement of volatile contaminants in the vadose zone. (4) The ability to run analyses was extended from 1,000 years to 10,000 years or longer after site closure. (5) The vadose zone flow and transport model was upgraded to support two-dimensional or three-dimensional analyses. (6) The ecological model and human risk models were upgraded so the concentrations of contaminants in food products consumed by humans are produced by the ecological model. This report documents the functions in the SAC software and provides a
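
    The stochastic mode can be pictured with a generic Monte Carlo sketch that propagates uncertain inputs through a toy transport model; the parameters, distributions, and half-life below are invented for illustration and have nothing to do with SAC's actual models:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1000                                                      # number of realizations

        release_rate   = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # hypothetical source term (Ci/yr)
        travel_time    = rng.uniform(50.0, 500.0, size=n)             # hypothetical transit time (yr)
        decay_constant = np.log(2) / 30.0                             # assumed 30-year half-life

        # Toy model: activity arriving at a receptor after radioactive decay in transit.
        arriving = release_rate * np.exp(-decay_constant * travel_time)

        print("median:", np.median(arriving), "95th percentile:", np.percentile(arriving, 95))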

  1. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA s Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  2. Technical Requirements Analysis and Control Systems (TRACS) Initial Operating Capability (IOC) documentation

    Science.gov (United States)

    Hammond, Dana P.

    1991-01-01

    The Technical Requirements Analysis and Control Systems (TRACS) software package is described. TRACS offers supplemental tools for the analysis, control, and interchange of project requirements. This package provides the fundamental capability to analyze and control requirements, serves as a focal point for project requirements, and integrates a system that supports efficient and consistent operations. TRACS uses relational database technology (ORACLE) in a stand-alone or in a distributed environment that can be used to coordinate the activities required to support a project through its entire life cycle. TRACS uses a set of keyword- and mouse-driven screens (HyperCard) which imposes adherence through a controlled user interface. The user interface provides an interactive capability to interrogate the database and to display or print project requirement information. TRACS has a limited report capability, but can be extended with PostScript conventions.

  3. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  4. ALICES: an advanced object-oriented software workshop for simulators

    International Nuclear Information System (INIS)

    Sayet, R.L.; Rouault, G.; Pieroux, D.; Houte, U. Van

    1999-01-01

    Reducing simulator development costs while improving model quality, user-friendliness and teaching capabilities has been a major target for many years in the simulation industry. It has led to the development of specific software tools which have been improved progressively following the new features and capabilities offered by the software industry. Unlike most of these software tools, ALICES (which is a French acronym for 'Interactive Software Workshop for the Design of Simulators') is not an upgrade of a previous generation of tools, like putting a graphical front-end on a classical code generator, but a really new development. Its design specification is based on previous experience with different tools as well as on new capabilities of software technology, mainly in Object Oriented Design. This allowed us to make a real technological 'jump' in the simulation industry, beyond the constraints of some traditional approaches. The main objectives behind the development of ALICES were the following: (1) Minimizing the simulator development time and costs: a simulator development consists mainly in developing software. One way to reduce costs is to facilitate reuse of existing software by developing standard components, and by defining interface standards. (2) Ensuring that the produced simulator can be maintained and updated at a minimal cost: a simulator must evolve along with the simulated process, and it is then necessary to update the simulator periodically. The cost of adequate maintenance is highly dependent on the quality of the software workshop. (3) Covering the whole simulator development process: from the data package to the acceptance tests and for maintenance and upgrade activities; with the whole development team, even if it is dispatched at different working sites; respecting the Quality Assurance rules and procedures (CORYS T.E.S.S. and TRACTEBEL are ISO-9001 certified). The development of ALICES was also done to comply with the following two main

  5. Empirical Studies on the Use of Social Software in Global Software Development - a Systematic Mapping Study

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2013-01-01

    of empirical studies on the usage of SoSo are available in related fields, there exists no comprehensive overview of what has been investigated to date across them. Objective: The aim of this review is to map empirical studies on the usage of SoSo in Software Engineering projects and in distributed teams … for collaborative work, fostering awareness, knowledge management and coordination among team members. Contrary to the evident high importance of the social aspects offered by SoSo, socialization is not the most important usage reported. Conclusions: This review reports how SoSo is used in GSD and how it is capable … of supporting GSD teams. Four emerging themes in global software engineering were identified: the appropriation and development of usage structures; understanding how an ecology of communication channels and tools are used by teams; the role played by SoSo either as a subtext or as an explicit goal; and finally …

  6. Elements of strategic capability for software outsourcing enterprises based on the resource

    Science.gov (United States)

    Shi, Wengeng

    2011-10-01

    Software outsourcing enterprises are an emerging kind of high-tech enterprise, and both the speed of their rise and their number have been remarkable. Beyond the preferential policies China gives to software outsourcing, the software outsourcing business has its own capability to upgrade, a characteristic that software companies in general have not had. Viewed from the resource-based theory of the firm, software outsourcing companies possess abilities and resources that are rare, valuable and difficult to imitate, and on this basis we try to give an initial framework for theoretical analysis.

  7. A software product certification model

    NARCIS (Netherlands)

    Heck, P.M.; Klabbers, M.D.; van Eekelen, Marko

    2010-01-01

    Certification of software artifacts offers organizations more certainty and confidence about software. Certification of software helps software sales, acquisition, and can be used to certify legislative compliance or to achieve acceptable deliverables in outsourcing. In this article, we present a

  8. Perm State University HPC-hardware and software services: capabilities for aircraft engine aeroacoustics problems solving

    Science.gov (United States)

    Demenev, A. G.

    2018-02-01

    The present work is devoted to analyzing high-performance computing (HPC) infrastructure capabilities for solving aircraft engine aeroacoustics problems at Perm State University. We explore here the ability to develop new computational aeroacoustics methods/solvers for computer-aided engineering (CAE) systems to handle complicated industrial problems of engine noise prediction. Leading aircraft engine engineering companies, including “UEC-Aviadvigatel” JSC (our industrial partner in Perm, Russia), require such methods/solvers to optimize the geometry of an aircraft engine for fan noise reduction. We analyzed Perm State University HPC hardware resources and software services with a view to using them efficiently. The results demonstrate that the Perm State University HPC infrastructure is mature enough to face industrial-scale problems of developing a CAE system with HPC methods and CFD solvers.

  9. Using tracking software for writing instruction

    Directory of Open Access Journals (Sweden)

    Sane M. Yagi

    2011-08-01

    Full Text Available Writing is a complex skill that is hard to teach. Although the written product is what is usually evaluated in the context of language teaching, the process of giving thought to linguistic form is fascinating. For almost forty years, language teachers have found it more effective to help learners with the writing process than with the written product; it is in the process that the sources of writing problems can be found. Despite the controversy that post-process approaches have raised with respect to process writing, information technology has lately offered tools that can shed new light on how writing takes place. Software that can record keyboard, mouse, and screen activities is capable of unraveling mysteries of the writing process. Technology has given teachers and learners the option of examining the writing process as it unfolds, enabling them to diagnose strategy as well as wording problems, thus empowering teachers to guide learners individually in how to think about each of their trouble spots in the context of a specific piece of writing. With these advances in information technology, metacognitive awareness and strategy training begin to acquire new dimensions of meaning. Technology lays open aspects of the writing process, offering unprecedented insight into creative text production as well. This paper attempts to explain how tracking software can influence writing instruction. It briefly examines the process and post-process approaches to assess their viability, explains the concept of tracking software, proposes the methodology needed for the adoption of this technology, and then discusses the pedagogical implications of these issues.

  10. Implementasi CMMI dalam Sebuah Organisasi Pengembang Software untuk Mencapai Return on Investment (ROI yang Diinginkan

    Directory of Open Access Journals (Sweden)

    Ikrar Adinata Arin

    2012-06-01

    Full Text Available The main mechanism for achieving a level of maturity in a software development organization is to remain focused, structured and consistent in carrying out the work procedures of the quality standard being applied. This article offers readers an approach to, and a discussion of, using the CMMI (Capability Maturity Model Integration) concept, which has a positive impact on the business development of a software development organization. The goals of CMMI are to achieve the best product quality, increase productivity, reduce operational costs and the software development period, and increase customer satisfaction. Nevertheless, the organization's leadership must also be able to take the key decisions consistently with the estimated time to the desired return on investment (ROI).

  11. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. The method is based on the software reliability growth model (SRGM), where the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was found that these models are capable of reasonably estimating the remaining number of software defects, which directly affect the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
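
    The record does not reproduce its Bayesian formulation, so the snippet below only illustrates the general shape of an NHPP-based SRGM, using the classic Goel-Okumoto mean value function as an assumed example; the parameter values are purely illustrative.

```python
import math

def goel_okumoto_mean(t, a, b):
    """Expected cumulative number of defects detected by test time t
    under an NHPP SRGM with mean value function m(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

def remaining_defects(t, a, b):
    """Expected number of defects still latent after testing up to time t."""
    return a - goel_okumoto_mean(t, a, b)

def reliability(t, x, a, b):
    """Probability of no failure in the interval (t, t + x] given testing up to t."""
    return math.exp(-(goel_okumoto_mean(t + x, a, b) - goel_okumoto_mean(t, a, b)))

# Illustrative parameters: a = expected total defects, b = per-defect detection rate
print(remaining_defects(t=100.0, a=25.0, b=0.03))    # defects expected to remain
print(reliability(t=100.0, x=10.0, a=25.0, b=0.03))  # reliability over the next 10 time units
```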

  12. Impact of Internet of Things on Software Business Model and Software Industry

    OpenAIRE

    Murari, Bhanu Teja

    2016-01-01

    Context: Internet of Things (IoT) technology is expanding rapidly and is changing the business environment for software organizations. There is a need to understand which business model factors a software company should focus on to obtain benefits from the potential that IoT offers. This thesis also focuses on finding the impact of IoT on the software business model and the software industry, especially on software development. Objectives: In this thesis, we do research on IoT software b...

  13. PIV Data Validation Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
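
    The first capability listed above is the removal of spurious vectors. The record does not state which outlier criterion the package applies; the sketch below uses the widely known normalized median test purely as an assumed example of such a filter.

```python
import numpy as np

def normalized_median_test(u, eps=0.1, threshold=2.0):
    """Flag spurious vectors in a 2-D field of velocity components by comparing
    each value with the median of its 3x3 neighbourhood (normalized median test).
    This is a common PIV validation criterion, not necessarily the one used by
    the package described above."""
    rows, cols = u.shape
    flags = np.zeros_like(u, dtype=bool)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            neigh = np.delete(u[i - 1:i + 2, j - 1:j + 2].ravel(), 4)  # 8 neighbours
            med = np.median(neigh)
            fluct = np.median(np.abs(neigh - med))
            flags[i, j] = abs(u[i, j] - med) / (fluct + eps) > threshold
    return flags

u = np.random.randn(16, 16)              # synthetic velocity component field
u[8, 8] = 25.0                           # inject one obviously spurious vector
print(normalized_median_test(u)[8, 8])   # expected: True
```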

  14. MAGIC user's group software

    International Nuclear Information System (INIS)

    Warren, G.; Ludeking, L.; McDonald, J.; Nguyen, K.; Goplen, B.

    1990-01-01

    The MAGIC User's Group has been established to facilitate the use of electromagnetic particle-in-cell software by universities, government agencies, and industrial firms. The software consists of a series of independent executables that are capable of inter-communication. MAGIC, SOS, and μSOS are used to perform electromagnetic simulations, while POSTER provides post-processing capabilities. Each is described in the paper. Use of the codes for Klystrode simulation is discussed.

  15. A Comparison of Two Commercial Volumetry Software Programs in the Analysis of Pulmonary Ground-Glass Nodules: Segmentation Capability and Measurement Accuracy

    Science.gov (United States)

    Kim, Hyungjin; Lee, Sang Min; Lee, Hyun-Ju; Goo, Jin Mo

    2013-01-01

    Objective To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. Materials and Methods In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. Results The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. Conclusion LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs. PMID:23901328

  16. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    International Nuclear Information System (INIS)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo

    2013-01-01

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  17. A comparison of two commercial volumetry software programs in the analysis of pulmonary ground-glass nodules: Segmentation capability and measurement accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyung Jin; Park, Chang Min; Lee, Sang Min; Lee, Hyun Joo; Goo, Jin Mo [Dept. of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul National University Medical Research Center, Seoul (Korea, Republic of)

    2013-08-15

    To compare the segmentation capability of the 2 currently available commercial volumetry software programs with specific segmentation algorithms for pulmonary ground-glass nodules (GGNs) and to assess their measurement accuracy. In this study, 55 patients with 66 GGNs underwent unenhanced low-dose CT. GGN segmentation was performed by using 2 volumetry software programs (LungCARE, Siemens Healthcare; LungVCAR, GE Healthcare). Successful nodule segmentation was assessed visually and morphologic features of GGNs were evaluated to determine factors affecting segmentation by both types of software. In addition, the measurement accuracy of the software programs was investigated by using an anthropomorphic chest phantom containing simulated GGNs. The successful nodule segmentation rate was significantly higher in LungCARE (90.9%) than in LungVCAR (72.7%) (p = 0.012). Vascular attachment was a negatively influencing morphologic feature of nodule segmentation for both software programs. As for measurement accuracy, mean relative volume measurement errors in nodules ≥ 10 mm were 14.89% with LungCARE and 19.96% with LungVCAR. The mean relative attenuation measurement errors in nodules ≥ 10 mm were 3.03% with LungCARE and 5.12% with LungVCAR. LungCARE shows significantly higher segmentation success rates than LungVCAR. Measurement accuracy of volume and attenuation of GGNs is acceptable in GGNs ≥ 10 mm by both software programs.

  18. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  19. Portability scenarios for intelligent robotic control agent software

    Science.gov (United States)

    Straub, Jeremy

    2014-06-01

    Portability scenarios are critical in ensuring that a piece of AI control software will run effectively across the collection of craft that it is required to control. This paper presents scenarios for control software that is designed to control multiple craft with heterogeneous movement and functional characteristics. For each prospective target-craft type, its capabilities, mission function, location, communications capabilities and power profile are presented and its performance characteristics are reviewed. This work will inform future decision making about software capabilities, hardware control capabilities and processing requirements.

  20. DDP-516 Computer Graphics System Capabilities

    Science.gov (United States)

    1972-06-01

    This report describes the capabilities of the DDP-516 Computer Graphics System. One objective of this report is to acquaint DOT management and project planners with the system's current capabilities, applications, hardware and software. The Appendix i...

  1. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  2. Software Assurance Curriculum Project Volume 1: Master of Software Assurance Reference Curriculum

    Science.gov (United States)

    2010-08-01

    developed products. The above definition was derived from these references: [IEEE-CS 2008] ISO/IEC 12207, IEEE Std 12207-2008, Systems and Software...Systems [CNSS 2009]. Software quality: capability of a software product to satisfy stated and implied needs when used under specified conditions [ISO...Curriculum; ISO International Organization for Standardization; IT information technology; KA knowledge area; KU knowledge unit; MBA Master of

  3. THE TECHNIQUE OF ANALYSIS OF SOFTWARE OF ON-BOARD COMPUTERS OF AIR VESSEL TO ABSENCE OF UNDECLARED CAPABILITIES BY SIGNATURE-HEURISTIC WAY

    Directory of Open Access Journals (Sweden)

    Viktor Ivanovich Petrov

    2017-01-01

    Full Text Available The article considers issues of data safety in the onboard computers of civil aviation aircraft. In information security, undeclared capabilities are capabilities of technical equipment or software that are not mentioned in the documentation. Requirements on documentation and test content are imposed during software certification. Documentation requirements cover the composition and content of the controlled documents (the specification, the description and the program code, including the source code). Test requirements include: static analysis of program code (including checking the correspondence of the sources to their load modules); and dynamic analysis of source code (including monitoring of execution routes). Currently, there are no comprehensive measures for checking onboard computer software. There are no rules and regulations that allow controlling the software of foreign-produced aircraft, and actually obtaining the software is difficult. Consequently, the author suggests developing the basics of aviation rules and regulations that allow the programs of CA aircraft onboard computers to be analyzed. If the software source code is not available, two approaches to code analysis are used: structural static and dynamic analysis of the source code; and signature-heuristic analysis of potentially dangerous operations. Static analysis determines the behavior of the program by reading the program code (without running the program), represented in assembler language as a disassembly listing. Program tracing is performed by dynamic analysis. The ability of the analysis to detect undeclared capabilities in aircraft software using an interactive disassembler is considered in this article.
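
    As a purely illustrative companion to the signature-heuristic approach mentioned above, the sketch below scans a disassembly listing for instruction patterns flagged as potentially dangerous; the signature list is hypothetical and stands in for whatever rule base a certification authority would actually prescribe.

```python
import re

# Hypothetical signatures of potentially dangerous operations in a disassembly
# listing; a real rule base would come from certification requirements, not
# from this illustrative dictionary.
SIGNATURES = {
    "direct port I/O":       re.compile(r"^\s*(in|out)\b"),
    "software interrupt":    re.compile(r"^\s*int\s+0x[0-9a-f]+"),
    "write to code segment": re.compile(r"^\s*mov\s+\[cs:"),
}

def scan_listing(lines):
    """Return (line number, rule name, instruction) for every suspicious line."""
    hits = []
    for n, line in enumerate(lines, start=1):
        for name, pattern in SIGNATURES.items():
            if pattern.search(line.lower()):
                hits.append((n, name, line.strip()))
    return hits

listing = ["mov ax, 0x10", "int 0x21", "out 0x3f8, al"]
for hit in scan_listing(listing):
    print(hit)
```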

  4. FMEF/experimental capabilities

    International Nuclear Information System (INIS)

    Burgess, C.A.; Dronen, V.R.

    1981-01-01

    The Fuels and Materials Examination Facility (FMEF), under construction at the Hanford site north of Richland, Washington, will be one of the most modern facilities offering irradiated fuels and materials examination capabilities and fuel fabrication development technologies. Scheduled for completion in 1984, the FMEF will provide examination capability for fuel assemblies, fuel pins and test pins irradiated in the FFTF. Various functions of the FMEF are described, with emphasis on experimental data-gathering capabilities in the facility's Nondestructive and Destructive examination cell complex

  5. Software engineering frameworks for the cloud computing paradigm

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents the latest research on Software Engineering Frameworks for the Cloud Computing Paradigm, drawn from an international selection of researchers and practitioners. The book offers both a discussion of relevant software engineering approaches and practical guidance on enterprise-wide software deployment in the cloud environment, together with real-world case studies. Features: presents the state of the art in software engineering approaches for developing cloud-suitable applications; discusses the impact of the cloud computing paradigm on software engineering; offers guidance an

  6. GENII Version 2 Software Design Document

    Energy Technology Data Exchange (ETDEWEB)

    Napier, Bruce A.; Strenge, Dennis L.; Ramsdell, James V.; Eslinger, Paul W.; Fosmire, Christian J.

    2004-03-08

    This document describes the architectural design for the GENII-V2 software package. It defines details of the overall structure of the software, the major software components, their data file interfaces, and the specific mathematical models to be used. The design represents a translation of the requirements into a description of the software structure, software components, interfaces, and necessary data. The design focuses on the major components and data communication links that are key to the implementation of the software within the operating framework. The purpose of the GENII-V2 software package is to provide the capability to perform dose and risk assessments of environmental releases of radionuclides. The software also has the capability of calculating environmental accumulation and radiation doses from surface water, groundwater, and soil (buried waste) media when an input concentration of radionuclides in these media is provided. This report gives a detailed description of the capabilities of the software product with exact specifications of the mathematical models that form the basis for the software implementation and testing efforts. It also presents a detailed description of the overall structure of the software package, details of the main components (implemented in the current phase of work), details of the data communication files, and the content of the basic output reports. The GENII system includes the capabilities for calculating radiation doses following chronic and acute releases. Radionuclide transport via air, water, or biological activity may be considered. Air transport options include both puff and plume models, each allowing use of an effective stack height or calculation of plume rise from buoyant or momentum effects (or both). Building wake effects can be included in acute atmospheric release scenarios. The code provides risk estimates for health effects to individuals or populations; these can be obtained using the code by applying
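
    The exact atmospheric transport models are specified in the design document itself and are not reproduced in this record; as a generic illustration of the kind of plume calculation involved, the sketch below evaluates the classic ground-level, centreline Gaussian plume equation with an effective stack height. All numbers are assumed for illustration and are not GENII parameters.

```python
import math

def plume_centerline_concentration(Q, u, H, sigma_y, sigma_z):
    """Ground-level, centreline concentration from a continuous point release
    using the classic Gaussian plume equation:
        C = Q / (pi * u * sigma_y * sigma_z) * exp(-H^2 / (2 * sigma_z^2))
    Q release rate, u wind speed, H effective stack height,
    sigma_y / sigma_z lateral and vertical dispersion coefficients."""
    return (Q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-H**2 / (2.0 * sigma_z**2))

# Illustrative values: 1 g/s release, 3 m/s wind, 30 m effective stack height,
# dispersion coefficients assumed for roughly 1 km downwind in neutral stability.
c = plume_centerline_concentration(Q=1e-3, u=3.0, H=30.0, sigma_y=70.0, sigma_z=32.0)
print(f"{c:.2e} kg/m^3")
```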

  7. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures of varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  8. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  9. Software ecosystems analyzing and managing business networks in the software industry

    CERN Document Server

    Jansen, S; Cusumano, MA

    2013-01-01

    This book describes the state-of-the-art of software ecosystems. It constitutes a fundamental step towards an empirically based, nuanced understanding of the implications for management, governance, and control of software ecosystems. This is the first book of its kind dedicated to this emerging field and offers guidelines on how to analyze software ecosystems; methods for managing and growing; methods on transitioning from a closed software organization to an open one; and instruments for dealing with open source, licensing issues, product management and app stores. It is unique in bringing t

  10. Software Engineering Reviews and Audits

    CERN Document Server

    Summers, Boyd L

    2011-01-01

    Accurate software engineering reviews and audits have become essential to the success of software companies and military and aerospace programs. These reviews and audits define the framework and specific requirements for verifying software development efforts. Authored by an industry professional with three decades of experience, Software Engineering Reviews and Audits offers authoritative guidance for conducting and performing software first article inspections, and functional and physical configuration software audits. It prepares readers to answer common questions for conducting and perform

  11. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    Science.gov (United States)

    2011-03-01

    capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to...model; PROSPER Infosys production support methodology; Q&P quality and productivity; R&D research and development; SaaS software as a service; ...Software Development Life Cycle (SDLC); Table 10: Scientific Estimation Coverage by Service Line

  12. Evolving Capabilities for Virtual Globes

    Science.gov (United States)

    Glennon, A.

    2006-12-01

    Though thin-client spatial visualization software like Google Earth and NASA World Wind enjoy widespread popularity, a common criticism is their general lack of analytical functionality. This concern, however, is rapidly being addressed; standard and advanced geographic information system (GIS) capabilities are being developed for virtual globes--though not centralized into a single implementation or software package. The innovation is mostly originating from the user community. Three such capabilities relevant to the earth science, education, and emergency management communities are modeling dynamic spatial phenomena, real-time data collection and visualization, and multi-input collaborative databases. Modeling dynamic spatial phenomena has been facilitated through joining virtual globe geometry definitions--like KML--to relational databases. Real-time data collection uses short scripts to transform user-contributed data into a format usable by virtual globe software. Similarly, collaborative data collection for virtual globes has become possible by dynamically referencing online, multi-person spreadsheets. Examples of these functions include mapping flows within a karst watershed, real-time disaster assessment and visualization, and a collaborative geyser eruption spatial decision support system. Virtual globe applications will continue to evolve further analytical capabilities, more temporal data handling, and from nano to intergalactic scales. This progression opens education and research avenues in all scientific disciplines.

  13. Dynamic capabilities, Marketing Capability and Organizational Performance

    Directory of Open Access Journals (Sweden)

    Adriana Roseli Wünsch Takahashi

    2017-01-01

    Full Text Available The goal of this study is to investigate the influence of dynamic capabilities on organizational performance and the role of marketing capability as a mediator in this relationship, in the context of private HEIs in Brazil. As the research method we carried out a survey of 316 HEIs, and data analysis was operationalized with the technique of structural equation modeling. The results indicate that dynamic capabilities influence organizational performance only when mediated by marketing capability. Marketing capability plays an important role in the survival, growth and renewal of educational service offerings for private-sector HEIs, and consequently in organizational performance. It is also demonstrated that the mediated relationship is stronger for HEIs with up to 3,000 students, and that other organizational profile variables, such as the number of courses, legal constitution, type of institution and type of education, do not significantly alter the results.

  14. SuperMAG: Present and Future Capabilities

    Science.gov (United States)

    Hsieh, S. W.; Gjerloev, J. W.; Barnes, R. J.

    2009-12-01

    SuperMAG is a global collaboration that provides ground magnetic field perturbations from a long list of stations in the same coordinate system, identical time resolution and with a common baseline removal approach. This unique high quality dataset provides a continuous and nearly global monitoring of the ground magnetic field perturbation. Currently, only archived data are available on the website and hence it targets basic research without any operational capabilities. The existing SuperMAG software can be easily adapted to ingest real-time or near real-time data and provide a now-casting capability. The SuperDARN program has a long history of providing near real-time maps of the northern hemisphere electrostatic potential and as both SuperMAG and SuperDARN share common software it is relatively easy to adapt these maps for global magnetic perturbations. Magnetometer measurements would be assimilated by the SuperMAG server using a variety of techniques, either by downloading data at regular intervals from remote servers or by real-time streaming connections. The existing SuperMAG analysis software would then process these measurements to provide the final calibrated data set using the SuperMAG coordinate system. The existing plotting software would then be used to produce regularly updated global plots. The talk will focus on current SuperMAG capabilities illustrating the potential for now-casting and eventually forecasting.

  15. Client Mobile Software Design Principles for Mobile Learning Systems

    Directory of Open Access Journals (Sweden)

    Qing Tan

    2009-01-01

    Full Text Available In a client-server mobile learning system, client mobile software must run on the mobile phone to acquire, package, and send the student's interaction data via the mobile communications network to the connected mobile application server. The server receives and processes the client data in order to offer appropriate content and learning activities. To develop mobile learning systems, a number of very important issues must be addressed. Mobile phones have scarce computing resources; they are heterogeneous devices running various mobile operating systems; they have limited user/device interaction capabilities and high data communication costs; and they must provide for device mobility and portability. In this paper we propose five principles for designing client mobile learning software. A location-based adaptive mobile learning system is presented as a proof of concept to demonstrate the applicability of these design principles.

  16. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.

    Science.gov (United States)

    Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E

    2018-01-01

    The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the evolution of the treatment. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and the goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of the Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established
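
    The software reports joint angles computed from the 3-D joint positions delivered by the Kinect sensor. As a hedged illustration only (the paper's exact processing pipeline is not described in this record), the sketch below computes the angle at one joint from three tracked points using plain vector geometry.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b formed by 3-D points a-b-c, e.g. the elbow
    angle from the shoulder, elbow and wrist positions tracked by a Kinect.
    Generic vector geometry; not necessarily the exact method of the paper."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))   # clamp for numerical safety
    return math.degrees(math.acos(cos_angle))

# Illustrative coordinates (metres) in the sensor frame for a right-angle pose
shoulder, elbow, wrist = (0.0, 0.4, 2.0), (0.0, 0.1, 2.0), (0.3, 0.1, 2.0)
print(joint_angle(shoulder, elbow, wrist))   # expected: 90.0
```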

  17. Software engineering processes principles and applications

    CERN Document Server

    Wang, Yingxu

    2000-01-01

    Fundamentals of the Software Engineering Process; Introduction; A Unified Framework of the Software Engineering Process; Process Algebra; Process-Based Software Engineering; Software Engineering Process System Modeling; The CMM Model; The ISO 9001 Model; The BOOTSTRAP Model; The ISO/IEC 15504 (SPICE) Model; The Software Engineering Process Reference Model: SEPRM; Software Engineering Process System Analysis; Benchmarking the SEPRM Processes; Comparative Analysis of Current Process Models; Transformation of Capability Levels Between Current Process Models; Software Engineering Process Establishment; Software Process Establish

  18. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways. Here, a range of review conditions and software solutions are discussed: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.

  19. Self-assembled software and method of overriding software execution

    Science.gov (United States)

    Bouchard, Ann M.; Osbourn, Gordon C.

    2013-01-08

    A computer-implemented software self-assembled system and method for providing an external override and monitoring capability to dynamically self-assembling software containing machines that self-assemble execution sequences and data structures. The method provides an external override machine that can be introduced into a system of self-assembling machines while the machines are executing such that the functionality of the executing software can be changed or paused without stopping the code execution and modifying the existing code. Additionally, a monitoring machine can be introduced without stopping code execution that can monitor specified code execution functions by designated machines and communicate the status to an output device.

  20. Strategies for Using Plagiarism Software in the Screening of Incoming Journal Manuscripts

    DEFF Research Database (Denmark)

    Lykkesfeldt, Jens

    2016-01-01

    In recent years, several online tools have appeared capable of identifying potential plagiarism in science. While such tools may help to maintain or even increase the originality and ethical quality of the scientific literature, no apparent consensus exists among editors on the degree of plagiarism...... or self-plagiarism necessary to reject or retract manuscripts. In this study, two entire volumes of published original papers and reviews from Basic & Clinical Pharmacology & Toxicology were retrospectively scanned for similarity in anonymized form using iThenticate software to explore measures...... to predictively identify true plagiarism and self-plagiarism and to potentially provide guidelines for future screening of incoming manuscripts. Several filters were applied, all of which appeared to lower the noise from irrelevant hits. The main conclusions were that plagiarism software offers a unique...
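
    The similarity scanning described above relies on text-matching algorithms that the record does not detail. As a purely illustrative, hedged sketch of the idea (and not of iThenticate's proprietary matching), the snippet below scores a manuscript against a source text by the overlap of word n-grams.

```python
def ngram_set(text, n=3):
    """Set of word n-grams (shingles) occurring in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(manuscript, source, n=3):
    """Fraction of the manuscript's n-grams that also occur in the source.
    A generic shingle-overlap score for illustration only; commercial tools
    use far more elaborate matching and filtering."""
    m, s = ngram_set(manuscript, n), ngram_set(source, n)
    return len(m & s) / len(m) if m else 0.0

manuscript = ("the dose response relationship was assessed using "
              "a randomised crossover design in guinea pigs")
earlier_paper = ("a randomised crossover design in guinea pigs was used "
                 "to assess the dose response relationship")
print(round(similarity(manuscript, earlier_paper), 2))
```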

  1. Software to Go--And It Goes!

    Science.gov (United States)

    Abrams, Mary; Kurlychek, Ken

    1989-01-01

    This article describes the Software Evaluation Clearinghouse for Educators of the Hearing Impaired at Gallaudet University (Washington, DC). Software compatible with Apple and IBM hardware is collected, rated by clearinghouse members, and described in a printed catalog. Tips on starting a software lending library are offered. (PB)

  2. Organisational Capability--What Does It Mean?

    Science.gov (United States)

    National Centre for Vocational Education Research (NCVER), 2006

    2006-01-01

    Organisational capability is rapidly becoming recognized as the key to organizational success. However, the lack of research on it has been well documented in the literature, and organizational capability remains an elusive concept. Yet an understanding of organizational capability can offer insights into how RTOs might work most effectively,…

  3. Large-scale visualization projects for teaching software engineering.

    Science.gov (United States)

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  4. Service-oriented architecture for the ARGOS instrument control software

    Science.gov (United States)

    Borelli, J.; Barl, L.; Gässler, W.; Kulas, M.; Rabien, Sebastian

    2012-09-01

    The Advanced Rayleigh Guided ground layer Adaptive Optics System, ARGOS, equips the Large Binocular Telescope (LBT) with a constellation of six Rayleigh laser guide stars. By correcting atmospheric turbulence near the ground, the system is designed to increase the image quality of the multi-object spectrograph LUCIFER by approximately a factor of 3 over a field 4 arc minutes in diameter. The control software has the critical task of orchestrating several devices, instruments, and high-level services, including the already existing adaptive optics system and the telescope control software. All these components are widely distributed over the telescope, adding more complexity to the system design. The approach used by the ARGOS engineers is to write loosely coupled and distributed services under the control of different ownership systems, providing a uniform mechanism to offer, discover, interact with, and use these distributed capabilities. The control system includes several finite state machines, vibration and flexure compensation loops, and safety mechanisms such as interlocks and aircraft and satellite avoidance systems.

  5. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few sources offer a comprehensive evaluation and comparison of their ability to capture the true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than Clements' method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
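
    For concreteness, the sketch below computes the conventional (normal-theory) Cp and Cpk together with a simplified percentile-based index in the spirit of Clements' approach, where the 6-sigma spread is replaced by the 0.135th-99.865th percentile range. It is a minimal illustration with made-up data, not the evaluation procedure of the paper.

```python
import statistics

def cp_cpk(data, lsl, usl):
    """Conventional process capability indices assuming normality."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

def percentile_cp(data, lsl, usl):
    """Percentile-based surrogate of Cp: the natural process spread is taken as
    the 0.135th-99.865th percentile range instead of 6 sigma, so a skewed sample
    is not forced through the normal assumption. (Clements' full method also
    uses Pearson curves and a median correction; this is a simplified sketch.)"""
    s = sorted(data)
    def pct(p):
        idx = p * (len(s) - 1)
        lo = int(idx)
        hi = min(lo + 1, len(s) - 1)
        return s[lo] + (idx - lo) * (s[hi] - s[lo])
    return (usl - lsl) / (pct(0.99865) - pct(0.00135))

data = [10.1, 9.8, 10.4, 10.0, 9.9, 10.2, 10.6, 9.7, 10.3, 10.0]   # made-up measurements
print(cp_cpk(data, lsl=9.0, usl=11.0))
print(percentile_cp(data, lsl=9.0, usl=11.0))
```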

  6. Software engineering practices for control system reliability

    International Nuclear Information System (INIS)

    S. K. Schaffner; K. S White

    1999-01-01

    This paper will discuss software engineering practices used to improve control system reliability. The authors begin with a brief discussion of the Software Engineering Institute's Capability Maturity Model (CMM), which is a framework for evaluating and improving key practices used to enhance software development and maintenance capabilities. The software engineering processes developed and used by the Controls Group at the Thomas Jefferson National Accelerator Facility (Jefferson Lab), using the Experimental Physics and Industrial Control System (EPICS) for accelerator control, are described. Examples are given of how their procedures have been used to minimize control system downtime and improve reliability. While their examples are primarily drawn from their experience with EPICS, these practices are equally applicable to any control system. Specific issues addressed include resource allocation, developing reliable software lifecycle processes, and risk management.

  7. Orientsol. Surfaces with different inclinations: software to calculate solar radiation; Orientsol. Superficies con distinta inclinacion. Software para el calculo de la radiacion solar

    Energy Technology Data Exchange (ETDEWEB)

    Rus, C.; Almonacid, F.; Hontoria, L.; Perez, P. J.; Munoz, F. J.

    2009-07-01

    The Universidad de Jaen, aware of the importance of using energy sources that respect the environment, offers in its Technical Industrial Engineering degree, in the specialties of Mechanics, Electricity and Industrial Electronics, the optional subjects Solar Electricity and Photovoltaic Facilities. These subjects are intended to give students the capability to design, calculate and analyse the different applications. A fundamental aspect of solar facilities is knowing the radiation incident on the plant that is to be analysed or sized. The Orientsol software tool, with a didactic aim, facilitates both the teaching of and learning about the solar radiation received on inclined surfaces. (Author) 8 refs.
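
    The geometric core of transposing solar radiation to an inclined surface is the angle of incidence between the sun and the tilted plane. The sketch below evaluates the beam component only, using the standard incidence-angle relation; it is an assumed textbook illustration, not Orientsol's implementation, and it omits the diffuse and ground-reflected components that a full tool would also handle.

```python
import math

def beam_on_tilted_surface(dni, zenith_deg, sun_azimuth_deg, tilt_deg, surface_azimuth_deg):
    """Beam irradiance (W/m^2) on a tilted plane from the direct normal irradiance,
    using cos(theta) = cos(z)cos(beta) + sin(z)sin(beta)cos(gamma_s - gamma),
    where z is the solar zenith angle, beta the surface tilt, and gamma_s, gamma
    the solar and surface azimuths."""
    z = math.radians(zenith_deg)
    b = math.radians(tilt_deg)
    dgamma = math.radians(sun_azimuth_deg - surface_azimuth_deg)
    cos_theta = math.cos(z) * math.cos(b) + math.sin(z) * math.sin(b) * math.cos(dgamma)
    return dni * max(0.0, cos_theta)   # no beam contribution when the sun is behind the plane

# South-facing surface tilted 30 degrees, sun due south at 40 degrees zenith, DNI = 800 W/m^2
print(beam_on_tilted_surface(dni=800.0, zenith_deg=40.0, sun_azimuth_deg=180.0,
                             tilt_deg=30.0, surface_azimuth_deg=180.0))
```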

  8. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  9. Offering Global Collaboration Services beyond CERN and HEP

    Science.gov (United States)

    Fernandes, J.; Ferreira, P.; Baron, T.

    2015-12-01

    The CERN IT department has built over the years a performant and integrated ecosystem of collaboration tools, from videoconference and webcast services to event management software. These services have been designed and evolved in very close collaboration with the various communities surrounding the laboratory and have been massively adopted by CERN users. To cope with this very heavy usage, global infrastructures have been deployed which take full advantage of CERN's international and global nature. While these services and tools are instrumental in enabling the worldwide collaboration that generates major HEP breakthroughs, they would certainly also benefit other sectors of science in which globalization has already taken place. Some of these services are driven by commercial software (Vidyo or Wowza, for example); others have been developed internally and have already been made available to the world as Open Source Software, in line with CERN's spirit and mission. Indico, for example, is now installed in 100+ institutes worldwide. But providing the software is often not enough, and institutes, collaborations and project teams do not always possess the expertise or the human or material resources needed to set up and maintain such services. Regional and national institutions have to answer needs which are increasingly global and often conflict with their operational capabilities or organizational mandate, and so are looking at existing worldwide service offers such as CERN's. We believe that the accumulated experience obtained through the operation of a large-scale worldwide collaboration service, combined with CERN's global network and its recently deployed Agile Infrastructure, would allow the Organization to set up and operate collaborative services, such as Indico and Vidyo, at a much larger scale and on behalf of worldwide research and education institutions, and thus answer these pressing demands while optimizing resources at a global level. Such services would

  10. Software for Managing Personal Files.

    Science.gov (United States)

    Lundeen, Gerald

    1989-01-01

    Discusses the special characteristics of personal file management software and compares four microcomputer software packages: Notebook II with Bibliography and Convert, Pro-Cite with Biblio-Links, askSam, and Reference Manager. Each package is evaluated in terms of the user interface, file maintenance, retrieval capabilities, output, and…

  11. Improving offering strategies for wind farms enhanced with storage capability

    DEFF Research Database (Denmark)

    Ding, Huajie; Hu, Zechun; Song, Yonghua

    2015-01-01

    Due to their flexible charging and discharging capability, energy storage systems (ESS) are regarded as a promising complement to wind farms (WF) participating in electricity markets. This paper proposes a reserve-based real-time operation strategy for an ESS to perform arbitrage and to alleviate the deviation of wind power from day-ahead contracts. Taking into account the operation strategy as well as two-price balancing market rules, a day-ahead bidding strategy for the WF-ESS system is put forward and formulated. A modified gradient descent algorithm is described to solve the formulation. In the case studies, the computational efficiency of the algorithm is validated first. Moreover, a number of scenarios with and without the temporal dependence of the wind power forecast error are designed and employed to compare the proposed strategy with other common ones in terms of profit.
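
    The two-price balancing rules referred to above penalize deviations from the day-ahead contract asymmetrically. The snippet below settles a single trading period under a simplified, worst-case reading of those rules (the deviation is assumed to aggravate the system imbalance, so a surplus is sold at the down-regulation price and a deficit is bought back at the up-regulation price); the prices and volumes are made up, and this is not the paper's full optimization model.

```python
def two_price_settlement(e_bid, e_actual, p_spot, p_up, p_down):
    """Revenue of a WF-ESS portfolio for one period under simplified two-price
    balancing rules: contracted energy is paid at the day-ahead (spot) price,
    a surplus is sold at the down-regulation price (<= spot) and a deficit is
    bought back at the up-regulation price (>= spot), so deviations can only
    reduce revenue relative to a perfect forecast."""
    deviation = e_actual - e_bid
    revenue = e_bid * p_spot
    if deviation >= 0:
        revenue += deviation * p_down          # surplus sold cheaply
    else:
        revenue += deviation * p_up            # deviation < 0: deficit bought dearly
    return revenue

# 100 MWh contracted, 90 MWh delivered; spot 50, up-regulation 60, down-regulation 40 EUR/MWh
print(two_price_settlement(100.0, 90.0, 50.0, 60.0, 40.0))   # expected: 4400.0 EUR
```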

  12. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where a requirement of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.

  13. Business engineering. Generic Software Architecture in an Object Oriented View

    Directory of Open Access Journals (Sweden)

    Mihaela MURESAN

    2006-01-01

    Full Text Available The generic software architecture offers a solution for the development and implementation of the information system. A generic software/non-software model could be developed by integrating the enterprise blueprint concept (Zachman) and the object-oriented paradigm (Coad's archetype concept). The standardization of the generic software architecture for various specific software components could be a direction of crucial importance, offering a guarantee of the quality of the model and increasing the efficiency of the design, development and implementation of the software. This approach is also useful for the implementation of ERP systems designed to fit the user's particular requirements.

  14. Software takes command

    CERN Document Server

    Manovich, Lev

    2013-01-01

    Software has replaced a diverse array of physical, mechanical, and electronic technologies used before the 21st century to create, store, distribute and interact with cultural artifacts. It has become our interface to the world, to others, to our memory and our imagination - a universal language through which the world speaks, and a universal engine on which the world runs. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. Offering the first theoretical and historical account of software for media authoring and its effects on the prac

  15. Improvements to the APBS biomolecular solvation software suite: Improvements to the APBS Software Suite

    Energy Technology Data Exchange (ETDEWEB)

    Jurrus, Elizabeth [Pacific Northwest National Laboratory, Richland Washington; Engel, Dave [Pacific Northwest National Laboratory, Richland Washington; Star, Keith [Pacific Northwest National Laboratory, Richland Washington; Monson, Kyle [Pacific Northwest National Laboratory, Richland Washington; Brandi, Juan [Pacific Northwest National Laboratory, Richland Washington; Felberg, Lisa E. [University of California, Berkeley California; Brookes, David H. [University of California, Berkeley California; Wilson, Leighton [University of Michigan, Ann Arbor Michigan; Chen, Jiahui [Southern Methodist University, Dallas Texas; Liles, Karina [Pacific Northwest National Laboratory, Richland Washington; Chun, Minju [Pacific Northwest National Laboratory, Richland Washington; Li, Peter [Pacific Northwest National Laboratory, Richland Washington; Gohara, David W. [St. Louis University, St. Louis Missouri; Dolinsky, Todd [FoodLogiQ, Durham North Carolina; Konecny, Robert [University of California San Diego, San Diego California; Koes, David R. [University of Pittsburgh, Pittsburgh Pennsylvania; Nielsen, Jens Erik [Protein Engineering, Novozymes A/S, Copenhagen Denmark; Head-Gordon, Teresa [University of California, Berkeley California; Geng, Weihua [Southern Methodist University, Dallas Texas; Krasny, Robert [University of Michigan, Ann Arbor Michigan; Wei, Guo-Wei [Michigan State University, East Lansing Michigan; Holst, Michael J. [University of California San Diego, San Diego California; McCammon, J. Andrew [University of California San Diego, San Diego California; Baker, Nathan A. [Pacific Northwest National Laboratory, Richland Washington; Brown University, Providence Rhode Island

    2017-10-24

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages and has had an impact on the study of a broad range of chemical, biological, and biomedical applications. APBS addresses three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advances in computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this manuscript, we discuss the models and capabilities that have recently been implemented within the APBS software package, including: a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics.

  16. Smartphone-based biosensing platform evolution: implementation of electrochemical analysis capabilities

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Svendsen, Winnie Edith

    2016-01-01

    Lab-on-Chip technologies offer great opportunities for the democratization of in-vitro medical diagnostics to the consumer-market. Despite the limitations set by the strict instrumentation and control requirements of certain families of these devices, new solutions are emerging. Smartphones now...... routinely demonstrate their potential as an interface of choice for operating complex, instrumented Lab-on-Chips. The sporadic nature of home-based in-vitro medical diagnostics testing calls for the development of systems capable of evolving with new applications or new technologies for Lab-on-Chip devices....... We present in this work how we evolved the first generation of a smartphone/Lab-on-Chip platform designed for evolvability. We demonstrate how reengineering efforts can be confined to the mobile-software layer and illustrate some of the benefits of building evolvable systems. We implement...

  17. Personality Assessment: A Competency-Capability Perspective.

    Science.gov (United States)

    Kaslow, Nadine J; Finklea, J Tyler; Chan, Ginny

    2018-01-01

    This article begins by reviewing the proficiency of personality assessment in the context of the competencies movement, which has dominated health service psychology in recent years. It examines the value of including a capability framework for advancing this proficiency and enhancing the quality of personality assessments, including Therapeutic Assessment (Finn & Tonsager, 1997 ), that include a personality assessment component. This hybrid competency-capability framework is used to set the stage for the conduct of personality assessments in a variety of contexts and for the optimal training of personality assessment. Future directions are offered in terms of ways psychologists can strengthen their social contract with the public and offer a broader array of personality assessments in more diverse contexts and by individuals who are both competent and capable.

  18. UTM TCL2 Software Requirements

    Science.gov (United States)

    Smith, Irene S.; Rios, Joseph L.; McGuirk, Patrick O.; Mulfinger, Daniel G.; Venkatesan, Priya; Smith, David R.; Baskaran, Vijayakumar; Wang, Leo

    2017-01-01

    The Unmanned Aircraft Systems (UAS) Traffic Management (UTM) Technical Capability Level (TCL) 2 software implements the UTM TCL 2 software requirements described herein. These software requirements are linked to the higher-level UTM TCL 2 System Requirements. Each successive TCL implements additional UTM functionality, enabling additional use cases. TCL 2 demonstrated how to enable expanded multiple operations by implementing automation for beyond-visual-line-of-sight flight, tracking of operations, and operations over sparsely populated areas.

  19. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core analyses, photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  20. Computer-aided software development

    International Nuclear Information System (INIS)

    Teichroew, D.; Hershey, E.A. III; Yamamoto, Y.

    1978-01-01

    In recent years, as the hardware cost/capability ratio has continued to decrease and as much of the routine data processing has been computerized, the emphasis in software development has shifted from just getting systems operational to maintaining existing systems, reducing duplication through integration, selectively adding new applications, building systems that are more usable, maintainable, portable and reliable, and improving the productivity of software developers. This paper examines a number of trends that are changing the methods by which software is being produced and used. (Auth.)

  1. Software theory a cultural and philosophical study

    CERN Document Server

    Frabetti, Federica

    2014-01-01

    This book engages directly in close readings of technical texts and computer code in order to show how software works. It offers an analysis of the cultural, political, and philosophical implications of software technologies that demonstrates the significance of software for the relationship between technology, philosophy, culture, and society.

  2. Community psychology and the capabilities approach.

    Science.gov (United States)

    Shinn, Marybeth

    2015-06-01

    What makes for a good life? The capabilities approach to this question has much to offer community psychology, particularly with respect to marginalized groups. Capabilities are freedoms to engage in valued social activities and roles-what people can do and be given both their capacities, and environmental opportunities and constraints. Economist Amartya Sen's focus on freedoms and agency resonates with psychological calls for empowerment, and philosopher Martha Nussbaum's specification of requirements for a life that is fully human provides an important guide for social programs. Community psychology's focus on mediating structures has much to offer the capabilities approach. Parallels between capabilities, as enumerated by Nussbaum, and settings that foster positive youth development, as described in a National Research Council Report (Eccles and Gootman (Eds) in Community programs to promote youth development. National Academy Press, Washington, 2002) suggest extensions of the approach to children. Community psychologists can contribute to theory about ways to create and modify settings to enhance capabilities as well as empowerment and positive youth development. Finally, capabilities are difficult to measure, because they involve freedoms to choose but only choices actually made or enacted can be observed. The variation in activities or goals across members of a setting provides a measure of the capabilities that the setting fosters.

  3. SQIMSO: Quality Improvement for Small Software Organizations

    OpenAIRE

    Rabih Zeineddine; Nashat Mansour

    2005-01-01

    Software quality improvement process remains incomplete if it is not initiated and conducted through a wide improvement program that considers process quality improvement, product quality improvement and evolution of human resources. But, small software organizations are not capable of bearing the cost of establishing software process improvement programs. In this work, we propose a new software quality improvement model for small organizations, SQIMSO, based on three ...

  4. Software engineering from a Langley perspective

    Science.gov (United States)

    Voigt, Susan

    1994-01-01

    A brief introduction to software engineering is presented. The talk is divided into four sections beginning with the question 'What is software engineering', followed by a brief history of the progression of software engineering at the Langley Research Center in the context of an expanding computing environment. Several basic concepts and terms are introduced, including software development life cycles and maturity levels. Finally, comments are offered on what software engineering means for the Langley Research Center and where to find more information on the subject.

  5. Organizational Economics of Capability and Heterogeneity

    DEFF Research Database (Denmark)

    Argyres, Nicholas S.; Felin, Teppo; Foss, Nicolai Juul

    2012-01-01

    For decades, the literatures on firm capabilities and organizational economics have been at odds with each other, specifically relative to explaining organizational boundaries and heterogeneity. We briefly trace the history of the relationship between the capabilities literature and organizational...... economics, and we point to the dominance of a “capabilities first” logic in this relationship. We argue that capabilities considerations are inherently intertwined with questions about organizational boundaries and internal organization, and we use this point to respond to the prevalent capabilities first...... logic. We offer an integrative research agenda that focuses first on the governance of capabilities and then on the capability of governance....

  6. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

    Software-developing companies work in a competitive market and are often challenged to make business decisions that affect their competitiveness. Models that assess the maturity of software development process quality, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable for supporting business decisions or achieving strategic goals. The objective of this work is to analyze how the PMS of software development projects can support business strategies in software-developing companies. Results from this work show that PMS results from software process maturity models can be suited to help evaluate operating capabilities and support strategic business decisions.

  7. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  8. Optical microparticle manipulation advances and new capabilities offered by diffractive optics

    International Nuclear Information System (INIS)

    Sojfer, V.A.; Kotlyar, V.V.; Khonina, S.N.

    2004-01-01

    The review deals with a promising area in laser optics - optical manipulation. The object under manipulation can be of various nature: from a colloid particle to a molecule, from a cell or virus to a micromechanism part, etc. In the first part of this work a concise review of the articles on optical manipulation of microparticles and atoms published in the last two decades is presented. The second part is devoted to the production of laser beams with self-reproduction properties. Such beams can be most effectively produced using diffractive optical elements (DOEs). The DOE-generated self-reproducing laser beams (stable, axially periodic, rotating, and multiorder) offer new opportunities in optical manipulation of micro- and nano-objects

  9. Software Health Management with Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole; Schumann, JOhann

    2011-01-01

    Most modern aircraft, as well as other complex machinery, are equipped with diagnostic systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine), and that information is used to detect and diagnose faults. Most of these systems focus on monitoring a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we will discuss our approach of using Bayesian networks for Software Health Management (SWHM). We will discuss SWHM requirements, which make advanced reasoning capabilities important for detection and diagnosis. Then we will present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
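
    To make the idea concrete, here is a hedged sketch of a tiny Bayesian network relating a latent software fault to two observable monitors. It is an illustration only, not the authors' models, and it assumes the pgmpy library, which the abstract does not name.

    ```python
    # Illustrative sketch: a minimal Bayesian network for software health
    # monitoring, built with pgmpy (an assumed tool, not the paper's).
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination
    from pgmpy.models import BayesianNetwork

    # Fault is latent; HighLatency and MissedHeartbeat are software monitors.
    model = BayesianNetwork([("Fault", "HighLatency"), ("Fault", "MissedHeartbeat")])

    model.add_cpds(
        TabularCPD("Fault", 2, [[0.99], [0.01]]),                   # P(Fault)
        TabularCPD("HighLatency", 2, [[0.95, 0.20], [0.05, 0.80]],  # P(HighLatency | Fault)
                   evidence=["Fault"], evidence_card=[2]),
        TabularCPD("MissedHeartbeat", 2, [[0.98, 0.40], [0.02, 0.60]],
                   evidence=["Fault"], evidence_card=[2]),
    )
    assert model.check_model()

    # Diagnose: observe both monitors firing and query the fault posterior.
    posterior = VariableElimination(model).query(
        variables=["Fault"], evidence={"HighLatency": 1, "MissedHeartbeat": 1})
    print(posterior)
    ```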

  10. Collaborative business processes for enhancing partnerships among software services providers

    Science.gov (United States)

    Heil Cancian, Maiara; Rabelo, Ricardo; Gresse von Wangenheim, Christiane

    2015-08-01

    Software services have represented a powerful view to support the realisation of the service-oriented architecture (SOA) paradigm. Using open standards and facilitating systems projects, they have increasingly been used as a corporate architectural approach to create interoperable services-based software solutions that can more easily be reused and shared across disparate applications. In the context of software companies, most of which are small firms, there are enormous difficulties in remaining competitive. One strategy to enhance their sustainability is to enlarge partnerships among them at a more valuable level by jointly offering (web) services-based solutions. However, their culture of collaboration is low, and partnerships are usually formed sporadically and with the same companies. This article presents an approach to support a more intense collaboration among software companies to respond to business opportunities in a more agile way, joining capacities and capabilities which they would not have if they worked alone. This requires, however, some preparedness. From the perspective of business processes, they should understand how to carry out a collaboration more properly. This is essentially what this article is about. It presents a comprehensive list of collaborative business processes and base practices that can also act as a guide for service providers' managers to implement and manage the collaboration along its lifecycle. Processes have been validated and results are discussed.

  11. The Creation and Use of an Analysis Capability Maturity Model (trademark) (ACMM)

    National Research Council Canada - National Science Library

    Covey, R. W; Hixon, D. J

    2005-01-01

    .... Capability Maturity Models (trademark) (CMMs) are being used in several intellectual endeavors, such as software engineering, software acquisition, and systems engineering. This Analysis CMM (ACMM...

  12. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  13. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  14. A Comparison of Hazard Prediction and Assessment Capability (HPAC) Software Dose-Rate Contour Plots to a Sample of Local Fallout Data From Test Detonations in the Continental United States, 1945 - 1962

    National Research Council Canada - National Science Library

    Chancellor, Richard W

    2005-01-01

    A comparison of Hazard Prediction and Assessment Capability (HPAC) software dose-rate contour plots to a sample of local nuclear fallout data from test detonations in the continental United States, 1945 - 1962, is performed...

  15. Towards a dynamic concept of alliance capability

    OpenAIRE

    SLUYTS, Kim; MARTENS, Rudy; MATTHYSSENS, Paul

    2008-01-01

    This paper has a threefold purpose. First, we offer a literature review on alliance capability based on strategic and competence based management literature. Second, we extend existing literature on alliance capability by breaking this concept down into five sub capabilities, which are each linked to a stage of the alliance life cycle. Finally, we suggest how firms can support these capabilities through structural, technological and people-related tools and techniques. We argue that current l...

  16. The Use of UML for Software Requirements Expression and Management

    Science.gov (United States)

    Murray, Alex; Clark, Ken

    2015-01-01

    It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straight-forward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities. We describe this approach in the

  17. PyBus -- A Python Software Bus

    OpenAIRE

    Lavrijsen, W

    2005-01-01

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design of software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer...
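
    The sketch below illustrates, in a hedged and generic way, the kind of run-time component loading and replacement a Python software bus can build on using the interpreter's import machinery; it is not the actual PyBus API, and the class and method names are hypothetical.

    ```python
    # Minimal sketch of dynamic component loading/unloading for a Python
    # "software bus" (illustrative; not PyBus itself).
    import importlib
    import types


    class ComponentBus:
        """Load, replace, and look up named components at run time."""

        def __init__(self) -> None:
            self._components: dict[str, types.ModuleType] = {}

        def load(self, name: str, module_path: str) -> None:
            module = importlib.import_module(module_path)
            if name in self._components:
                module = importlib.reload(module)   # run-time replacement
            self._components[name] = module

        def unload(self, name: str) -> None:
            self._components.pop(name, None)

        def call(self, name: str, func: str, *args, **kwargs):
            # Channel a call to a component without the caller importing it directly.
            return getattr(self._components[name], func)(*args, **kwargs)


    if __name__ == "__main__":
        bus = ComponentBus()
        bus.load("maths", "math")   # stdlib 'math' module as a stand-in component
        print(bus.call("maths", "sqrt", 2.0))
    ```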

  18. The adoption of social enterprise software

    OpenAIRE

    Engelstätter, Benjamin; Sarbu, Miruna

    2011-01-01

    Social enterprise software is a highly promising software application for firms, though it is still in its infancy. It offers rapid real-time information transfer based on business collaboration tools or instant messaging. The software collects and processes customer data from surveys, consumer feedback, reviews, blogs or social networks. This enables firms to build up detailed customer profiles, potentially anticipating upcoming trends. We analyze the determinants of social enterprise so...

  19. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    International Nuclear Information System (INIS)

    Mohd Idris Taib; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2013-01-01

    This software was under development using LabVIEW, to be used with the StellarNet Spectrometer system. The StellarNet Spectrometer is supplied with the SpectraWiz operating software, which can measure spectral data for real-time spectroscopy. This LabVIEW software accesses real-time data from the SpectraWiz dynamic link library for hardware interfacing. The software acquires the amplitude at every electromagnetic wavelength at periodic intervals. In addition to hardware interfacing, the user interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance modes. This software can be used for research and development in the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet radiation. Off-line capabilities of this software are almost unlimited due to the availability of mathematical and signal-processing functions in the LabVIEW add-on libraries. (author)
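
    For readers unfamiliar with the display modes mentioned above, the following sketch shows the standard dark-corrected arithmetic behind transmission and absorbance plots. It is a generic Python illustration (the described tool is implemented in LabVIEW), and the example counts are hypothetical.

    ```python
    # Illustrative sketch (not the LabVIEW code): the arithmetic behind
    # the absorbance/transmission display modes of a spectrometer tool.
    import numpy as np


    def transmittance(sample: np.ndarray, reference: np.ndarray, dark: np.ndarray) -> np.ndarray:
        """Fractional transmittance per wavelength bin, dark-corrected."""
        return (sample - dark) / (reference - dark)


    def absorbance(sample: np.ndarray, reference: np.ndarray, dark: np.ndarray) -> np.ndarray:
        """Absorbance A = -log10(T) per wavelength bin."""
        t = transmittance(sample, reference, dark)
        return -np.log10(np.clip(t, 1e-12, None))  # clip to avoid log of non-positive values


    if __name__ == "__main__":
        dark = np.full(5, 100.0)                           # detector dark counts (hypothetical)
        reference = np.array([4100., 4200., 4300., 4200., 4100.])
        sample = np.array([2100., 1150., 3250., 4000., 2100.])
        print(absorbance(sample, reference, dark).round(3))
    ```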

  20. Desktop Publishing on the Macintosh: A Software Perspective.

    Science.gov (United States)

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  1. Modernization of tank floor scanning system (TAFLOSS) Software

    International Nuclear Information System (INIS)

    Mohd Fitri Abd Rahman; Jaafar Abdullah; Zainul A Hassan

    2002-01-01

    The main objective of the project is to develop new user-friendly software that combines the second-generation software (developed in-house) with commercial software. This paper describes the development of computer codes for analysing the initial data and plotting an exponential curve fit. The method used for curve fitting is the least-squares technique. The software that has been developed is capable of giving results comparable to the commercial software. (Author)
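
    As a hedged illustration of the least-squares exponential fit described above (a generic example, not the TAFLOSS code, whose data format is not given here), the sketch below fits a three-parameter exponential model to synthetic data.

    ```python
    # Minimal sketch of a least-squares exponential curve fit.
    import numpy as np
    from scipy.optimize import curve_fit


    def exponential(x, a, b, c):
        """Model y = a * exp(-b * x) + c."""
        return a * np.exp(-b * x) + c


    if __name__ == "__main__":
        # Hypothetical scan data (e.g., counts versus position).
        x = np.linspace(0.0, 10.0, 50)
        rng = np.random.default_rng(0)
        y = exponential(x, 5.0, 0.4, 1.0) + rng.normal(scale=0.05, size=x.size)

        params, covariance = curve_fit(exponential, x, y, p0=(1.0, 0.1, 0.0))
        a, b, c = params
        print(f"a={a:.3f}, b={b:.3f}, c={c:.3f}")  # should be close to (5, 0.4, 1)
    ```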

  2. GIMS-Software for asset market experiments.

    Science.gov (United States)

    Palan, Stefan

    2015-03-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality.

  3. The Capability Approach: Enabling Musical Learning

    Science.gov (United States)

    Cameron, Kate

    2012-01-01

    Amartya Sen's capability approach offers a new perspective for educators throughout the curriculum. This new insight has the potential to promote a music education that is inherently tailored to the individual. In essence it asks the question: What is music education going to offer to this student? This article represents an initial enquiry into…

  4. Automation software for a materials testing laboratory

    Science.gov (United States)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1990-01-01

    The software environment in use at the NASA-Lewis Research Center's High Temperature Fatigue and Structures Laboratory is reviewed. This software environment is aimed at supporting the tasks involved in performing materials behavior research. The features and capabilities of the approach to specifying a materials test include static and dynamic control mode switching, enabling multimode test control; dynamic alteration of the control waveform based upon events occurring in the response variables; precise control over the nature of both command waveform generation and data acquisition; and the nesting of waveform/data acquisition strategies so that material history dependencies may be explored. To eliminate repetitive tasks in the conventional research process, a communications network software system is established which provides file interchange and remote console capabilities.

  5. GIMS—Software for asset market experiments

    Science.gov (United States)

    Palan, Stefan

    2015-01-01

    In this article we lay out requirements for an experimental market software for financial and economic research. We then discuss existing solutions. Finally, we introduce GIMS, an open source market software which is characterized by extensibility and ease of use, while offering nearly all of the required functionality. PMID:26525085

  6. Expanding Human Capabilities through the Adoption and Utilization of Free, Libre, and Open Source Software

    Science.gov (United States)

    Simpson, James Daniel

    2014-01-01

    Free, libre, and open source software (FLOSS) is software that is collaboratively developed. FLOSS provides end-users with the source code and the freedom to adapt or modify a piece of software to fit their needs (Deek & McHugh, 2008; Stallman, 2010). FLOSS has a 30 year history that dates to the open hacker community at the Massachusetts…

  7. Software Development using Object-First Approach: a New Learning Strategy

    Directory of Open Access Journals (Sweden)

    Gurdeep S Hura

    2017-08-01

    The software engineering approach deals with software development (SD), which is aligned with the design and development of software applications. Software development may be implemented with a variety of techniques, but its implementation using a procedural paradigm and an imperative language seems to be more effective and efficient for the design and implementation of software applications. The procedural approach to software development offers advantages in that it may be used to teach some basic features of programming languages. The object of this paper is to introduce software development and the associated object-first approach for the design of software project applications using a top-down method. This approach defines functions and modules as basic units for design and implementation, and also offers hands-on experience with the basic programming-language structures of sequences, selections and iterations. These structures are then used to define the various modules of the software development process with programming language constructs. The software development process is one of the most crucial processes of software engineering.

  8. Next Generation Software Process Improvement

    National Research Council Canada - National Science Library

    Turnas, Daniel

    2003-01-01

    .... The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Capability Maturity Model...

  9. Continuous software engineering – a microservices architecture perspective

    OpenAIRE

    O'Connor, Rory; Elger, Peter; Clarke, Paul

    2017-01-01

    From its earliest days, software development has been beset with challenges in relation to timely delivery, appropriateness of features and quality of deliverables. Many advances in software development processes have helped to address these concerns. For example, agile software development has helped to deliver working software more frequently and capability maturity frameworks have brought about improved consistency in quality levels. However, the age-old challenge of better, cheaper, faste...

  10. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  11. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...... an extra specific expertise that they offer as service to other teams, thus, fostering cross-team collaboration. The paper outlines the general course setup, topics addressed, and it provides initial lessons learned....

  12. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple examples of application software. Therefore, effective software integration is a very important basis for the future success of the enterprise architecture in question. This article provides interface-based integration practices in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and with developing a software integration architecture.
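
    The sketch below gives a hedged, generic illustration of the interface-based idea: callers depend on an agreed contract rather than on a particular system. It is not taken from the article; the interface, adapters and identifiers are hypothetical.

    ```python
    # Generic illustration: integrating two applications behind a shared
    # interface so consumers depend on the contract, not the system.
    from abc import ABC, abstractmethod


    class CustomerDirectory(ABC):
        """Integration interface agreed between applications."""

        @abstractmethod
        def lookup_email(self, customer_id: str) -> str:
            ...


    class CrmDirectory(CustomerDirectory):
        """Adapter over a hypothetical CRM system."""

        def lookup_email(self, customer_id: str) -> str:
            return f"{customer_id}@crm.example"      # placeholder for a real CRM call


    class LegacyErpDirectory(CustomerDirectory):
        """Adapter over a hypothetical legacy ERP export."""

        def lookup_email(self, customer_id: str) -> str:
            return f"{customer_id}@erp.example"      # placeholder for a real ERP call


    def send_invoice(directory: CustomerDirectory, customer_id: str) -> str:
        # The consuming process is written against the interface only.
        return f"invoice sent to {directory.lookup_email(customer_id)}"


    if __name__ == "__main__":
        print(send_invoice(CrmDirectory(), "c-42"))
        print(send_invoice(LegacyErpDirectory(), "c-42"))
    ```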

  13. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating the software reliability of nuclear safety software is proposed. The method is based on the software reliability growth model (SRGM), in which software failure behavior is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is identified that this method is capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed
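
    To make the SRGM idea concrete, here is a hedged sketch that fits one common non-homogeneous Poisson process form (the Goel-Okumoto model) to hypothetical failure times by maximum likelihood; the paper's specific modeling schemes and its Bayesian treatment are not reproduced here.

    ```python
    # Sketch of fitting a Goel-Okumoto NHPP software reliability growth model
    # by maximum likelihood (illustrative data; not the paper's method).
    import numpy as np
    from scipy.optimize import minimize


    def neg_log_likelihood(params, times, t_end):
        a, b = params
        if a <= 0 or b <= 0:
            return np.inf
        # Intensity lambda(t) = a*b*exp(-b t); mean value m(t) = a*(1 - exp(-b t)).
        log_intensity = np.log(a * b) - b * times
        return -(log_intensity.sum() - a * (1.0 - np.exp(-b * t_end)))


    if __name__ == "__main__":
        # Hypothetical failure times (hours of testing at which defects were found).
        times = np.array([12., 35., 54., 80., 110., 160., 230., 330., 490., 700.])
        t_end = 800.0

        result = minimize(neg_log_likelihood, x0=(len(times) * 1.5, 1e-3),
                          args=(times, t_end), method="Nelder-Mead")
        a_hat, b_hat = result.x
        expected_remaining = a_hat - a_hat * (1.0 - np.exp(-b_hat * t_end))
        print(f"total defects a={a_hat:.1f}, rate b={b_hat:.5f}, "
              f"expected remaining={expected_remaining:.1f}")
    ```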

  14. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  15. Software Quality Assessment Tool Based on Meta-Models

    OpenAIRE

    Doneva Rositsa; Gaftandzhieva Silvia; Doneva Zhelyana; Staevsky Nevena

    2015-01-01

    In the software industry it is indisputably essential to control the quality of produced software systems in terms of capabilities for easy maintenance, reuse, portability and others, in order to ensure reliability in software development. But it is also clear that it is very difficult to achieve such control through ‘manual’ quality management. There are a number of approaches for software quality assurance, based typically on software quality models (e.g. ISO 9126, McCall’s, Boehm’s...

  16. Identifying strengths and weaknesses of Quality Management Unit University of Sumatera Utara software using SCAMPI C

    Science.gov (United States)

    Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.

    2018-02-01

    Identification of software maturity level is a technique to determine the quality of the software. By identifying the software maturity level, the weaknesses of the software can be observed. As a result, the recommendations might be a reference for future software maintenance and development. This paper discusses the software Capability Level (CL) with a case study on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation is done in three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of the software capability level for the UMM-USU software, it turns out that the capability levels for the observed process areas are in the range of CL1 to CL2. Project Planning (PP) is the only process area that reaches capability level 2, while PMC and REQM are still at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software. Therefore, this study proposes several recommendations for UMM-USU to improve the capability level for the observed process areas.

  17. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
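
    To illustrate the Bayesian computation that such TDM tools automate, here is a heavily hedged sketch of a maximum a posteriori (MAP) estimate of clearance for a one-compartment intravenous model with a log-normal prior, fitted to a single measured concentration. The model choice, parameter values and units are hypothetical and are not taken from any of the reviewed programs.

    ```python
    # Illustrative sketch of the Bayesian (MAP) dose-individualization step
    # behind TDM software (hypothetical drug and numbers).
    import numpy as np
    from scipy.optimize import minimize_scalar


    def neg_log_posterior(log_cl, conc_obs, t_obs, dose, volume,
                          cl_pop=5.0, omega=0.3, sigma=0.2):
        cl = np.exp(log_cl)
        conc_pred = (dose / volume) * np.exp(-(cl / volume) * t_obs)
        # Log-normal residual error and log-normal prior on clearance.
        log_lik = -0.5 * ((np.log(conc_obs) - np.log(conc_pred)) / sigma) ** 2
        log_prior = -0.5 * ((log_cl - np.log(cl_pop)) / omega) ** 2
        return -(log_lik + log_prior)


    if __name__ == "__main__":
        dose, volume = 500.0, 30.0          # mg, L (hypothetical drug)
        conc_obs, t_obs = 6.0, 8.0          # mg/L measured 8 h after the dose

        fit = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(50.0)),
                              args=(conc_obs, t_obs, dose, volume), method="bounded")
        cl_map = np.exp(fit.x)
        print(f"MAP clearance: {cl_map:.2f} L/h")
    ```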

  18. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements traced for the safety-critical software, and the methodology adopted in each stage of the software life cycle, are presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) of the system has been performed. The feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A checklist for management control has also been produced via a walk-through technique. Based on the evaluation of the checklist, activities to be performed in the requirement phase have been determined. In the design phase, hazard analysis has been performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA have been checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm has been selected as a sample for applying the FTA method to software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system has been enhanced during all software life cycle phases
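
    For readers unfamiliar with fault tree evaluation, the sketch below shows the basic AND/OR gate arithmetic for independent basic events. The gate structure and probabilities are hypothetical illustrations, not the actual pressurizer low-pressure trip tree.

    ```python
    # Minimal fault-tree evaluation sketch (generic illustration of the FTA
    # step mentioned above; numbers are hypothetical).
    from math import prod


    def and_gate(probabilities):
        """Top event occurs only if every independent basic event occurs."""
        return prod(probabilities)


    def or_gate(probabilities):
        """Top event occurs if at least one independent basic event occurs."""
        return 1.0 - prod(1.0 - p for p in probabilities)


    if __name__ == "__main__":
        sensor_failure = or_gate([1e-4, 2e-4])      # either pressure channel fails
        software_fault = and_gate([1e-3, 5e-2])     # defect present AND triggered
        trip_failure = or_gate([sensor_failure, software_fault])
        print(f"P(trip fails on demand) = {trip_failure:.2e}")
    ```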

  19. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  20. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    Science.gov (United States)

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 AACR.

  1. 2006 XSD Scientific Software Workshop report.

    Energy Technology Data Exchange (ETDEWEB)

    Evans, K., Jr.; De Carlo, F.; Jemian, P.; Lang, J.; Lienert, U.; Maclean, J.; Newville, M.; Tieman, B.; Toby, B.; van Veenendaal, B.; Univ. of Chicago

    2006-01-22

    In May of 2006, a committee was formed to assess the fundamental needs and opportunities in scientific software for x-ray data reduction, analysis, modeling, and simulation. This committee held a series of discussions throughout the summer, conducted a poll of the members of the x-ray community, and held a workshop. This report details the findings and recommendations of the committee. Each experiment performed at the APS requires three crucial ingredients: the powerful x-ray source, an optimized instrument to perform measurements, and computer software to acquire, visualize, and analyze the experimental observations. While the APS has invested significant resources in the accelerator, investment in other areas such as scientific software for data analysis and visualization has lagged behind. This has led to the adoption of a wide variety of software with variable levels of usability. In order to maximize the scientific output of the APS, it is essential to support the broad development of real-time analysis and data visualization software. As scientists attack problems of increasing sophistication and deal with larger and more complex data sets, software is playing an ever more important role. Furthermore, our need for excellent and flexible scientific software can only be expected to increase, as the upgrade of the APS facility and the implementation of advanced detectors create a host of new measurement capabilities. New software analysis tools must be developed to take full advantage of these capabilities. It is critical that the APS take the lead in software development and the implementation of theory to software to ensure the continued success of this facility. The topics described in this report are relevant to the APS today and critical for the APS upgrade plan. Implementing these recommendations will have a positive impact on the scientific productivity of the APS today and will be even more critical in the future.

  2. Earth Science Informatics Community Requirements for Improving Sustainable Science Software Practices: User Perspectives and Implications for Organizational Action

    Science.gov (United States)

    Downs, R. R.; Lenhardt, W. C.; Robinson, E.

    2014-12-01

    Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributers, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and the opportunities that science software presents for the sustainability of science. To gain insight on practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members were asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper will present an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.

  3. Open source hard- and software: Using Arduino boards to keep old hardware running

    International Nuclear Information System (INIS)

    Faugel, Helmut; Bobkov, Volodymyr

    2013-01-01

    The ASDEX Upgrade tokamak went into operation in 1991 with a proposed lifetime of 10 years. Due to major modifications ASDEX Upgrade is still in operation. Infrastructure like data acquisition, workstations, etc. is being modernized, interfaces like RS-232 are vanishing and new interfaces are being introduced. This leads to the necessity to adapt old hardware. Most of the microcontrollers used in the old hardware do not offer any support of the new interfaces and have to be replaced. A simple and efficient way is to replace them with open hardware microcontroller boards like the Arduino. These boards are based on 8-bit RISC microcontrollers and offer a software development environment with a large number of libraries. In this paper the use of Arduino boards for replacing the position unit, the stub tuner interface and its use controlling a direct digital synthesizer (DDS) with phase control capability are shown

  4. Open source hard- and software: Using Arduino boards to keep old hardware running

    Energy Technology Data Exchange (ETDEWEB)

    Faugel, Helmut [Max-Planck-Institut für Plasmaphysik, EURATOM Association, Garching (Germany); Bobkov, Volodymyr [Max-Planck-Institut für Plasmaphysik, EURATOM Association, Garching (Germany)

    2013-10-15

    The ASDEX Upgrade tokamak went into operation in 1991 with a proposed lifetime of 10 years. Due to major modifications ASDEX Upgrade is still in operation. Infrastructure like data acquisition, workstations, etc. is being modernized, interfaces like RS-232 are vanishing and new interfaces are being introduced. This leads to the necessity to adapt old hardware. Most of the microcontrollers used in the old hardware do not offer any support of the new interfaces and have to be replaced. A simple and efficient way is to replace them with open hardware microcontroller boards like the Arduino. These boards are based on 8-bit RISC microcontrollers and offer a software development environment with a large number of libraries. In this paper the use of Arduino boards for replacing the position unit, the stub tuner interface and its use controlling a direct digital synthesizer (DDS) with phase control capability are shown.

  5. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    DEFF Research Database (Denmark)

    Ashraf, Haseem; de Hoop, B; Shaker, S B

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms.

  6. Integrated System Health Management (ISHM): Systematic Capability Implementation

    Science.gov (United States)

    Figueroa, Fernando; Holland, Randy; Schmalzwel, John; Duncavage, Dan

    2006-01-01

    This paper provides a credible approach for implementation of ISHM capability in any system. The requirements and processes to implement ISHM capability are unique in that a credible capability is initially implemented at a low level, and it evolves to achieve higher levels by incremental augmentation. In contrast, typical capabilities, such as thrust of an engine, are implemented once at full Functional Capability Level (FCL), which is not designed to change during the life of the product. The approach will describe core ingredients (e.g. technologies, architectures, etc.) and when and how ISHM capabilities may be implemented. A specific architecture/taxonomy/ontology will be described, as well as a prototype software environment that supports development of ISHM capability. This paper will address implementation of system-wide ISHM as a core capability, and ISHM for specific subsystems as expansions and evolution, but always focusing on achieving an integrated capability.

  7. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
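
    As a hedged, generic illustration of the whole-building baseline models described above (empirical models relating energy use to weather and operating schedule), the sketch below fits an ordinary least-squares baseline and evaluates it on the training period; it is not any vendor's proprietary model, and the data are synthetic.

    ```python
    # Sketch of a simple whole-building baseline model: energy use regressed
    # on outdoor temperature and an occupancy flag (illustrative only).
    import numpy as np


    def fit_baseline(temp_c: np.ndarray, occupied: np.ndarray, energy_kwh: np.ndarray) -> np.ndarray:
        """Return coefficients [intercept, temperature slope, occupancy effect]."""
        design = np.column_stack([np.ones_like(temp_c), temp_c, occupied])
        coeffs, *_ = np.linalg.lstsq(design, energy_kwh, rcond=None)
        return coeffs


    def predict(coeffs: np.ndarray, temp_c: np.ndarray, occupied: np.ndarray) -> np.ndarray:
        return coeffs[0] + coeffs[1] * temp_c + coeffs[2] * occupied


    if __name__ == "__main__":
        # Hypothetical daily data from a baseline year.
        rng = np.random.default_rng(1)
        temp = rng.uniform(10, 35, 365)
        occ = (np.arange(365) % 7 < 5).astype(float)          # weekdays occupied
        energy = 200 + 12 * temp + 150 * occ + rng.normal(0, 20, 365)

        coeffs = fit_baseline(temp, occ, energy)
        residual = energy.sum() - predict(coeffs, temp, occ).sum()   # ~0 on the training period
        print(coeffs.round(1), round(residual, 1))
    ```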

  8. Software metrics a rigorous and practical approach

    CERN Document Server

    Fenton, Norman

    2014-01-01

    A Framework for Managing, Measuring, and Predicting Attributes of Software Development Products and Processes. Reflecting the immense progress in the development and use of software metrics in the past decades, Software Metrics: A Rigorous and Practical Approach, Third Edition provides an up-to-date, accessible, and comprehensive introduction to software metrics. Like its popular predecessors, this third edition discusses important issues, explains essential concepts, and offers new approaches for tackling long-standing problems. New to the Third Edition: This edition contains new material relevant

  9. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface AcknowledgmentsAuthors INTRODUCTIONAn Overview of Knowledge MapsIntroduction: Key Concepts-Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects) The Motivation The Problem The Objectives Overview of Software Stability Concepts Overview of Knowledge Maps Pattern Languages versus Knowledge Maps: A Brief ComparisonThe Solution Knowledge Maps Methodology or Concurrent Software Development ModelWhy Knowledge Maps? Research Methodology Undertaken Research Verification and Validation The Stratification of This Book Summary

  10. Capabilities for Intercultural Dialogue

    Science.gov (United States)

    Crosbie, Veronica

    2014-01-01

    The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…

  11. Incorporating a Human-Computer Interaction Course into Software Development Curriculums

    Science.gov (United States)

    Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph

    2015-01-01

    Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…

  12. Cloud-based Architecture Capabilities Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Vang, Leng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    In a collaborative scientific research arena, it is important to have an environment where analysts have access to shared information, documents and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and adequate browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews the development of the Cloud-based Architecture Capabilities (CAC) web portal for PRA tools.

  13. Dynamic Capabilities Associated with a Firm’s Growth in Developing Countries. A Comparative Study of Argentinean SMEs in the Software and Tourism Industries

    Directory of Open Access Journals (Sweden)

    Claudia D’Annunzio

    2015-01-01

    Although recent evidence suggests that the development of dynamic capabilities (DC) is a key factor in gaining and sustaining the competitive advantages that promote a firm's growth, the question of how SMEs create, identify, and seize opportunities for growth has not been fully explored, particularly in developing countries with scarce resources. The aim of this study is to shed light on how SMEs develop capabilities to grow in the specific context of developing countries with resource constraints. To achieve a detailed description of the processes involved, this study applies a qualitative methodology based on a comparative case study of eight SMEs within the software and tourism industries in Argentina, which have previously been identified as dynamic sectors with high growth potential. Our findings suggest that SMEs develop DC mainly through an emerging process of iterative experimentation rather than through strategic planning. This process involves the coordination of organizational actions and resources, with managers playing a key role.

  14. Software support environment design knowledge capture

    Science.gov (United States)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  15. Software Past, Present, and Future: Views from Government, Industry and Academia

    Science.gov (United States)

    Holcomb, Lee; Page, Jerry; Evangelist, Michael

    2000-01-01

    Views from the NASA CIO NASA Software Engineering Workshop on software development from the past, present, and future are presented. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry (Standish Group's 1994 Report); 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.

  16. Robotics Offer Newfound Surgical Capabilities

    Science.gov (United States)

    2008-01-01

    Barrett Technology Inc., of Cambridge, Massachusetts, completed three Phase II Small Business Innovation Research (SBIR) contracts with Johnson Space Center, during which the company developed and commercialized three core technologies: a robotic arm, a hand that functions atop the arm, and a motor driver to operate the robotics. Among many industry uses, recently, an adaptation of the arm has been cleared by the U.S. Food and Drug Administration (FDA) for use in a minimally invasive knee surgery procedure, where its precision control makes it ideal for inserting a very small implant.

  17. Using Porterian Activity Analysis to Understand Organizational Capabilities

    DEFF Research Database (Denmark)

    Sheehan, Norman T.; Foss, Nicolai Juul

    2017-01-01

    conceptualized by Porter’s writings on the activity-based view. Porterian activity analysis is becoming more accepted in the strategy literature, but no strategy scholar has explicitly used Porter’s activities, and particularly his concept of drivers, to understand and analyze organizational capabilities....... Introducing Porterian activities into the discussion of capabilities improves strategy scholars’ understanding of the bases of capability heterogeneity, offers academics future directions for research, and provides managers with guidance to enhance their organizations’ capabilities....

  18. Agile distributed software development

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Aaen, Ivan

    2012-01-01

    While face-to-face interaction is fundamental in agile software development, distributed environments must rely extensively on mediated interactions. Practicing agile principles in distributed environments therefore poses particular control challenges related to balancing fixed vs. evolving quality...... requirements and people vs. process-based collaboration. To investigate these challenges, we conducted an in-depth case study of a successful agile distributed software project with participants from a Russian firm and a Danish firm. Applying Kirsch’s elements of control framework, we offer an analysis of how...

  19. Transient analysis capabilities at ABB-CE

    International Nuclear Information System (INIS)

    Kling, C.L.

    1992-01-01

    The transient capabilities at ABB-Combustion Engineering (ABB-CE) Nuclear Power are a function of the computer hardware and related network used, the computer software that has evolved over the years, and the commercial technical exchange agreements with other related organizations and customers. ABB-CE is changing from a mainframe/personal computer network to a distributed workstation/personal computer local area network. The paper discusses computer hardware, mainframe computing, personal computers, mainframe/personal computer networks, workstations, transient analysis computer software, design/operation transient analysis codes, safety (licensed) analysis codes, cooperation with ABB-Atom, and customer support

  20. The Organizational Economics of Organizational Capability and Heterogeneity

    DEFF Research Database (Denmark)

    Foss, Nicolai Juul; Argyres, Nicholas; Felin, Teppo

    For decades, the literatures on firm capabilities and organizational economics have been at odds with each other, specifically relative to explaining organizational boundaries and heterogeneity. We briefly trace the history of the relationship between the capabilities literature and organizational...... economics and point to the dominance of a “capabilities first” logic in this relationship. We argue that capabilities considerations are inherently intertwined with questions about organizational boundaries and internal organization, and use this point to respond to the prevalent “capabilities first” logic....... We offer an integrative research agenda that focuses, first, on the governance of capabilities and, second, on the capability of governance....

  1. Use of software in the Cuban environmental monitoring

    International Nuclear Information System (INIS)

    Alonso Abad, Dolores; Dominguez Ley, Orlando; Ramos Viltre, Enma O.; Caveda Ramos, Celia; Molina Perez, Daniel; Capote Ferrara, Eduardo; Dominguez Garcia, Adriel; Grinan Torres, Reinaldo

    2008-01-01

    The work group of the National Network of Environmental Radiological Surveillance (NNERS) of the Republic of Cuba offers services which can be divided into two main domains: Services for Environmental Radiological Surveillance and Services for Measuring Radioactivity in Scrap. As part of this work a National Monitoring System has been put into place. This monitoring system generates a great volume of information which needs to be evaluated, processed, controlled and efficiently stored. To face this challenge, in 2002 the National Network started two research projects in order to automate some of its services. In this work, we briefly present five software tools designed to improve the quality of the services offered by the NNERS. We also give a short description of the main results obtained with the use of these tools. The automation of the measurements for some indicators and the development of technical methods for the interpretation of the results have significantly improved the operational capabilities of the national network. For instance, it has also made it possible to carry out studies to radiologically characterize the posts of the network. The improvement of the radiological control of scrap has diminished the risk of exposure to radiation and the risk of pollution due to the melting of radioactive sources. The use of these software tools has not only increased the technical efficiency of the network, but has also had a very important social impact. Indeed, the main advantage of such automation is that it increases the capacity to give an early response in case of a radiological emergency, which results in increased safety for the population and the environment. (author)

  2. EPRI engineering workstation software - Discussion and demonstration

    International Nuclear Information System (INIS)

    Stewart, R.P.; Peterson, C.E.; Agee, L.J.

    1992-01-01

    Computing technology is undergoing significant changes with respect to engineering applications in the electric utility industry. These changes result mainly from the introduction of several UNIX workstations that provide mainframe calculational capability at much lower costs. The workstations are being coupled with microcomputers through local area networks to provide engineering groups with a powerful and versatile analysis capability. PEGASYS, the Professional Engineering Graphic Analysis System, is a software package for use with engineering analysis codes executing in a workstation environment. PEGASYS has a menu-driven, user-friendly interface and provides pre-execution support for preparing input, graphical packages for post-execution analysis, and an on-line monitoring capability for engineering codes. The initial application of this software is for use with RETRAN-02 operating on an IBM RS/6000 workstation using X-Windows/UNIX and a personal computer under DOS

  3. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    Science.gov (United States)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS

  4. Active Internationalization of Small and Medium-Sized Software Enterprises - Cases of French Software Companies

    Directory of Open Access Journals (Sweden)

    Maurício Floriano Galimberti

    2015-12-01

    Implementations of software production processes usually ignore organizational, market, and economic attributes of products that are to be inserted in international markets. Software engineering has begun to deal with the business aspects of software products only recently. The Guide to the Software Engineering Body of Knowledge v3.0 presents two concepts of life cycle: the software development life cycle and the software product life cycle. The second is more concerned with business issues related to software products, but research on those issues is still scarce. In this sense, this paper aims to answer the following question: what factors allow small and medium software enterprises to offer high value-added products in order to enter and remain in the international market? This work selects four research dimensions from the literature and explores a number of variables inside those dimensions, which are considered as candidates to help explain a successful process of active internationalization. The paper presents a multiple case study showing that although innovation, entrepreneurship, and foreign market knowledge are important dimensions for active internationalization, networking is not as relevant as might be thought.

  5. Pybus - A Python Software Bus

    International Nuclear Information System (INIS)

    Lavrijsen, Wim T.L.P.

    2004-01-01

    A software bus, just like its hardware equivalent, allows for the discovery, installation, configuration, loading, unloading, and run-time replacement of software components, as well as channeling of inter-component communication. Python, a popular open-source programming language, encourages a modular design in software written in it, but it offers little or no component functionality. However, the language and its interpreter provide sufficient hooks to implement a thin, integral layer of component support. This functionality can be presented to the developer in the form of a module, making it very easy to use. This paper describes a Python module, PyBus, with which the concept of a ''software bus'' can be realized in Python. It demonstrates, within the context of the ATLAS software framework Athena, how PyBus can be used for the installation and (run-time) configuration of software, not necessarily Python modules, from a Python application in a way that is transparent to the end-user
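    A minimal sketch of the ''software bus'' idea in plain Python, assuming only the standard library: components are ordinary modules that can be loaded, replaced at run time, and called through the bus. The class and method names below (ComponentBus, load, replace, call) are illustrative and are not the PyBus API.

```python
# Minimal "software bus" sketch: components are ordinary modules that can be
# loaded, reloaded (run-time replacement), and called through the bus.
# Names here are illustrative, not the PyBus API.
import importlib
from types import ModuleType


class ComponentBus:
    def __init__(self) -> None:
        self._components: dict[str, ModuleType] = {}

    def load(self, name: str, module_path: str) -> ModuleType:
        """Import a module and register it on the bus under `name`."""
        module = importlib.import_module(module_path)
        self._components[name] = module
        return module

    def replace(self, name: str) -> ModuleType:
        """Reload the registered module in place (run-time replacement)."""
        module = importlib.reload(self._components[name])
        self._components[name] = module
        return module

    def call(self, name: str, function: str, *args, **kwargs):
        """Channel a call to a component through the bus."""
        return getattr(self._components[name], function)(*args, **kwargs)


if __name__ == "__main__":
    bus = ComponentBus()
    bus.load("math", "math")            # any importable module can act as a component
    print(bus.call("math", "sqrt", 2))  # 1.4142...
```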

  6. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides automated image capture, test correlation, and autocorrelation analysis capabilities for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs are run on an IBM PC/AT host computer running either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows simple user interface and output capabilities in the windows environment.

  7. A Method to Improve the Software Acceptance Criteria for Nuclear Power Plants

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Park, Heui Youn; Son, Ki Sung; Lee, Ki Hyun; Kim, Hyeon Soo

    2005-01-01

    The license is a mandatory process required by a governmental authority, while certification is a voluntary process administered by a professional community. A software certification is the result of an assessment that the certified software conforms to required criteria or standards. The certification is used as a committed promise to produce high-quality software, so software acquirers are requiring it from their suppliers. For example, the US DoD (Department of Defense) requires achievement of CMMI-SW (Capability Maturity Model Integration-Software) certification for participation in a major military software project. It is commonly said that the purpose of achieving a certification is to improve the product quality. In the nuclear area, software certification has rarely been concerned with or required for the software used in a safety function of NPPs (Nuclear Power Plants). The safety-critical software for NPPs is accepted by the nuclear regulators when the following three criteria are met: acceptable plans should be prepared to control the software development activities, the plans should be followed in an acceptable software life cycle, and the process should produce acceptable design outputs. The acceptance criteria are so abstract that the nuclear regulators may assess the software development plans, activities, and outputs based on their subjective engineering judgments. This is inevitable because software has invisible or intangible characteristics. It is hard to assess the totality of a software product prior to running it. These factors have caused the judgments to be biased. The regulators may want some objectiveness in assessing how much capability for software development the supplier possesses. In that case, software certification can assist them with such an assessment. This paper proposes a method to improve the software acceptance criteria by applying software certification to the criteria. This will assist the regulators to assess the supplier

  8. MetaComp: comprehensive analysis software for comparative meta-omics including comparative metagenomics.

    Science.gov (United States)

    Zhai, Peng; Yang, Longshu; Guo, Xiao; Wang, Zhe; Guo, Jiangtao; Wang, Xiaoqi; Zhu, Huaiqiu

    2017-10-02

    During the past decade, the development of high-throughput nucleic acid sequencing and mass spectrometry analysis techniques has enabled the characterization of microbial communities through metagenomics, metatranscriptomics, metaproteomics and metabolomics data. To reveal the diversity of microbial communities and the interactions between living conditions and microbes, it is necessary to introduce comparative analysis based upon integration of all four types of data mentioned above. Comparative meta-omics, especially comparative metagenomics, has been established as a routine process to highlight the significant differences in taxon composition and functional gene abundance among microbiota samples. Meanwhile, biologists are increasingly concerned about the correlations between meta-omics features and environmental factors, which may further decipher the adaptation strategy of a microbial community. We developed a graphical comprehensive analysis software named MetaComp comprising a series of statistical analysis approaches with visualized results for metagenomics and other meta-omics data comparison. This software is capable of reading files generated by a variety of upstream programs. After data loading, analyses such as multivariate statistics, hypothesis testing of two-sample, multi-sample as well as two-group samples, and a novel function-regression analysis of environmental factors are offered. Here, regression analysis regards meta-omic features as independent variables and environmental factors as dependent variables. Moreover, MetaComp is capable of automatically choosing an appropriate two-group sample test based upon the traits of the input abundance profiles. We further evaluate the performance of its choice, and exhibit applications for metagenomics, metaproteomics and metabolomics samples. MetaComp, an integrative software applicable to all meta-omics data, originally distills the influence of the living environment on a microbial community by regression analysis
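    A hedged sketch of the kind of feature-versus-environment regression described above, not MetaComp's implementation: toy meta-omic feature abundances are used as independent variables and a synthetic environmental factor as the dependent variable, so the fitted coefficients indicate which features track the factor. All names and data here are illustrative.

```python
# Toy regression in the spirit described above (not MetaComp's implementation):
# feature abundances are the independent variables, an environmental factor
# (a synthetic "temperature") is the dependent variable.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
abundance = rng.random((30, 5))                       # 30 samples x 5 feature abundances
true_weights = np.array([2.0, 0.0, -1.0, 0.5, 0.0])   # only some features matter
temperature = abundance @ true_weights + rng.normal(0.0, 0.1, size=30)

model = LinearRegression().fit(abundance, temperature)
print("coefficients:", model.coef_)                   # which features track the factor
print("R^2:", model.score(abundance, temperature))
```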

  9. Facility Interface Capability Assessment (FICA) user manual

    International Nuclear Information System (INIS)

    Pope, R.B.; MacDonald, R.R.; Massaglia, J.L.; Williamson, D.A.; Viebrock, J.M.; Mote, N.

    1995-09-01

    The US Department of Energy's (DOE) Office of Civilian Radioactive Waste Management (OCRWM) is responsible for developing the Civilian Radioactive Waste Management System (CRWMS) to accept spent nuclear fuel from commercial facilities. The objective of the Facility Interface Capability Assessment (FICA) project was to assess the capability of each commercial spent nuclear fuel (SNF) storage facility, at which SNF is stored, to handle various SNF shipping casks. The purpose of this report is to describe the FICA computer software and to provide the FICA user with a guide on how to use the FICA system. The FICA computer software consists of two executable programs: the FICA Reactor Report program and the FICA Summary Report program (written in the CA-Clipper version 5.2 development system). The complete FICA software system is contained on either a 3.5 in. (double density) or a 5.25 in. (high density) diskette and consists of the two FICA programs and all the database files (generated using dBASE III). The FICA programs are provided as ''stand alone'' systems and neither the CA-Clipper compiler nor dBASE III is required to run the FICA programs. The steps for installing the FICA software system and executing the FICA programs are described in this report. Instructions are given on how to install the FICA software system onto the hard drive of the PC and how to execute the FICA programs from the FICA subdirectory on the hard drive. Both FICA programs are menu driven, with the up-arrow and down-arrow keys used to move the cursor to the desired selection

  10. Software Components and Formal Methods from a Computational Viewpoint

    OpenAIRE

    Lambertz, Christian

    2012-01-01

    Software components and the methodology of component-based development offer a promising approach to master the design complexity of huge software products because they separate the concerns of software architecture from individual component behavior and allow for reusability of components. In combination with formal methods, the specification of a formal component model of the later software product or system allows for establishing and verifying important system properties in an automatic a...

  11. DEVELOPMENT OF TECHNOLOGIES AND ANALYTICAL CAPABILITIES FOR VISION 21 ENERGY PLANTS

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell Osawe; Madhave Symlal; Krishna Thotapalli; and Stephen Zitney

    2003-04-30

    This is the tenth Quarterly Technical Report for DOE Cooperative Agreement No: DE-FC26-00NT40954. The goal of the project is to develop and demonstrate a software framework to enable virtual simulation of Vision 21 plants. During the last quarter much progress was made in software development. The CO wrapper template was developed for the integration of the Alstom Power proprietary code INDVU. The session management tasks were completed. The multithreading capability was made functional so that users of the integrated simulation may directly interact with the CFD software. The V21-Controller and the Fluent CO wrapper were upgraded to CO v.1.0. The testing and debugging of the upgraded software is ongoing. Testing of the integrated software was continued. A list of suggested GUI enhancements was made. Remote simulation capability was successfully tested using two networked Windows machines. Work on preparing the release version progressed: the CFD database was enhanced, a convergence detection capability was implemented, a Configuration Wizard for low-order models was developed, and the Configuration Wizard for Fluent was enhanced. During the last quarter good progress was also made in software demonstration. Various simplified versions of Demo Case 1 were used to debug the Configuration Wizard and V21-Controller. The heat exchanger model in FLUENT was calibrated and the energy balance was verified. The INDVU code was integrated into the V21-Controller, and the integrated model is being debugged. A sensitivity loop was inserted into Demo Case 2 to check whether the simulation converges over the desired load range. Work on converting the HRSGSIM code to run in batch mode was started. Work on calibrating Demo Case 2 was started.

  12. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    In this work, the DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling, such as the use of modelling languages and general-purpose programming languages, are analysed. The common set of capabilities required by typical simulation software is discussed, and the shortcomings of the current approaches are recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for runtime model generation; (2) support for runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third-party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth-generation object-oriented general-purpose programming language such as Python, are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.

  13. View of software for HEP experiments

    Energy Technology Data Exchange (ETDEWEB)

    Johnstad, H.; Lebrun, P.; Lessner, E.S.; Montgomery, H.E.

    1986-05-01

    A view of the software structure typical of a High Energy Physics experiment is given and the availability of general software modules in most of the important regions is discussed. The aim is to provide a framework for discussion of capabilities and inadequacies and thereby define areas where effort should be assigned, and perhaps also to serve as a useful source document for the newcomer to High Energy Physics. 74 refs.

  14. View of software for HEP experiments

    International Nuclear Information System (INIS)

    Johnstad, H.; Lebrun, P.; Lessner, E.S.; Montgomery, H.E.

    1986-05-01

    A view of the software structure typical of a High Energy Physics experiment is given and the availability of general software modules in most of the important regions is discussed. The aim is to provide a framework for discussion of capabilities and inadequacies and thereby define areas where effort should be assigned, and perhaps also to serve as a useful source document for the newcomer to High Energy Physics. 74 refs

  15. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  16. JColorGrid: software for the visualization of biological measurements.

    Science.gov (United States)

    Joachimiak, Marcin P; Weisman, Jennifer L; May, Barnaby Ch

    2006-04-27

    Two-dimensional data colourings are an effective medium by which to represent three-dimensional data in two dimensions. Such "color-grid" representations have found increasing use in the biological sciences (e.g. microarray 'heat maps' and bioactivity data) as they are particularly suited to complex data sets and offer an alternative to the graphical representations included in traditional statistical software packages. The effectiveness of color-grids lies in their graphical design, which introduces a standard for customizable data representation. Currently, software applications capable of generating limited color-grid representations can be found only in advanced statistical packages or custom programs (e.g. micro-array analysis tools), often associated with steep learning curves and requiring expert knowledge. Here we describe JColorGrid, a Java library and platform independent application that renders color-grid graphics from data. The software can be used as a Java library, as a command-line application, and as a color-grid parameter interface and graphical viewer application. Data, titles, and data labels are input as tab-delimited text files or Microsoft Excel spreadsheets and the color-grid settings are specified through the graphical interface or a text configuration file. JColorGrid allows both user graphical data exploration as well as a means of automatically rendering color-grids from data as part of research pipelines. The program has been tested on Windows, Mac, and Linux operating systems, and the binary executables and source files are available for download at http://jcolorgrid.ucsf.edu.

  17. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    Science.gov (United States)

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  18. Virtual Exercise Training Software System

    Science.gov (United States)

    Vu, L.; Kim, H.; Benson, E.; Amonette, W. E.; Barrera, J.; Perera, J.; Rajulu, S.; Hanson, A.

    2018-01-01

    The purpose of this study was to develop and evaluate a virtual exercise training software system (VETSS) capable of providing real-time instruction and exercise feedback during exploration missions. A resistive exercise instructional system was developed using a Microsoft Kinect depth-camera device, which provides markerless 3-D whole-body motion capture at a small form factor and minimal setup effort. It was hypothesized that subjects using the newly developed instructional software tool would perform the deadlift exercise with more optimal kinematics and consistent technique than those without the instructional software. Following a comprehensive evaluation in the laboratory, the system was deployed for testing and refinement in the NASA Extreme Environment Mission Operations (NEEMO) analog.

  19. Near-Earth Object Survey Simulation Software

    Science.gov (United States)

    Naidu, Shantanu P.; Chesley, Steven R.; Farnocchia, Davide

    2017-10-01

    There is a significant interest in Near-Earth objects (NEOs) because they pose an impact threat to Earth, offer valuable scientific information, and are potential targets for robotic and human exploration. The number of NEO discoveries has been rising rapidly over the last two decades with over 1800 being discovered last year, making the total number of known NEOs >16000. Pan-STARRS and the Catalina Sky Survey are currently the most prolific NEO surveys, having discovered >1600 NEOs between them in 2016. As next generation surveys such as Large Synoptic Survey Telescope (LSST) and the proposed Near-Earth Object Camera (NEOCam) become operational in the next decade, the discovery rate is expected to increase tremendously. Coordination between various survey telescopes will be necessary in order to optimize NEO discoveries and create a unified global NEO discovery network. We are collaborating on a community-based, open-source software project to simulate asteroid surveys to facilitate such coordination and develop strategies for improving discovery efficiency. Our effort so far has focused on development of a fast and efficient tool capable of accepting user-defined asteroid population models and telescope parameters such as a list of pointing angles and camera field-of-view, and generating an output list of detectable asteroids. The software takes advantage of the widely used and tested SPICE library and architecture developed by NASA’s Navigation and Ancillary Information Facility (Acton, 1996) for saving and retrieving asteroid trajectories and camera pointing. Orbit propagation is done using OpenOrb (Granvik et al. 2009) but future versions will allow the user to plug in a propagator of their choice. The software allows the simulation of both ground-based and space-based surveys. Performance is being tested using the Grav et al. (2011) asteroid population model and the LSST simulated survey “enigma_1189”.

  20. Capabilities for managing service innovation: towards a conceptual framework

    NARCIS (Netherlands)

    den Hertog, P.; van der Aa, W.; de Jong, M.W.

    2010-01-01

    Purpose - The purpose of this paper is to identify and reflect on a set of dynamic capabilities for managing service innovation and applies a dynamic capabilities view (DCV) of firms for managing service innovation. Design/methodology/approach - This theoretical paper offers a conceptual framework

  1. ON APPROACHES ON THE SOFTWARE DEVELOPMENT FOR THE MEDICAL EDUCATION AREA

    Directory of Open Access Journals (Sweden)

    A. V. Semenets

    2015-05-01

    The importance of applying a version control system to optimize the software development process is shown. The capabilities of using the Google Apps For Education cloud platform in the software development process are also presented. Final recommendations on organizing the software development process in a medical university are formulated.

  2. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  3. Software project management tools in global software development: a systematic mapping study.

    Science.gov (United States)

    Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio

    2016-01-01

    Global software development (GSD) which is a growing trend in the software industry is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools, according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.

  4. SOFTM: a software maintenance expert system in Prolog

    DEFF Research Database (Denmark)

    Pau, L.; Negret, J. M.

    1988-01-01

    A description is given of a knowledge-based system called SOFTM, serving the following purposes: (1) assisting a software programmer or analyst in his application code maintenance tasks, (2) generating and updating automatically software correction documentation, (3) helping the end user register......, and on interfacing capabilities of Prolog II to a variety of other languages...

  5. Improvements to the APBS biomolecular solvation software suite.

    Science.gov (United States)

    Jurrus, Elizabeth; Engel, Dave; Star, Keith; Monson, Kyle; Brandi, Juan; Felberg, Lisa E; Brookes, David H; Wilson, Leighton; Chen, Jiahui; Liles, Karina; Chun, Minju; Li, Peter; Gohara, David W; Dolinsky, Todd; Konecny, Robert; Koes, David R; Nielsen, Jens Erik; Head-Gordon, Teresa; Geng, Weihua; Krasny, Robert; Wei, Guo-Wei; Holst, Michael J; McCammon, J Andrew; Baker, Nathan A

    2018-01-01

    The Adaptive Poisson-Boltzmann Solver (APBS) software was developed to solve the equations of continuum electrostatics for large biomolecular assemblages that have provided impact in the study of a broad range of chemical, biological, and biomedical applications. APBS addresses the three key technology challenges for understanding solvation and electrostatics in biomedical applications: accurate and efficient models for biomolecular solvation and electrostatics, robust and scalable software for applying those theories to biomolecular systems, and mechanisms for sharing and analyzing biomolecular electrostatics data in the scientific community. To address new research applications and advancing computational capabilities, we have continually updated APBS and its suite of accompanying software since its release in 2001. In this article, we discuss the models and capabilities that have recently been implemented within the APBS software package including a Poisson-Boltzmann analytical and a semi-analytical solver, an optimized boundary element solver, a geometry-based geometric flow solvation model, a graph theory-based algorithm for determining pKa values, and an improved web-based visualization tool for viewing electrostatics. © 2017 The Protein Society.

  6. Knowledge coordination in distributed software management

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars

    2012-01-01

    Software organizations are increasingly relying on cross-organizational and cross-border collaboration, requiring effective coordination of distributed knowledge. However, such coordination is challenging due to spatial separation, diverging communities-of-practice, and unevenly distributed...... communication breakdowns on recordings of their combined teleconferencing and real-time collaborative modeling. As a result, we offer theoretical propositions that explain how distributed software managers can deal with communication breakdowns and effectively coordinate knowledge through multimodal virtual...

  7. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    Science.gov (United States)

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  8. The Implementation of Satellite Control System Software Using Object Oriented Design

    Science.gov (United States)

    Anderson, Mark O.; Reid, Mark; Drury, Derek; Hansell, William; Phillips, Tom

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions that can be launched into low earth orbit by small expendable vehicles. The development schedule for each SMEX spacecraft was three years from start to launch. The SMEX program has produced five satellites; Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX), Fast Auroral Snapshot Explorer (FAST), Submillimeter Wave Astronomy Satellite (SWAS), Transition Region and Coronal Explorer (TRACE) and Wide-Field Infrared Explorer (WIRE). SAMPEX and FAST are on-orbit, TRACE is scheduled to be launched in April of 1998, WIRE is scheduled to be launched in September of 1998, and SWAS is scheduled to be launched in January of 1999. In each of these missions, the Attitude Control System (ACS) software was written using a modular procedural design. Current program goals require complete spacecraft development within 18 months. This requirement has increased pressure to write reusable flight software. Object-Oriented Design (OOD) offers the constructs for developing an application that only needs modification for mission unique requirements. This paper describes the OOD that was used to develop the SMEX-Lite ACS software. The SMEX-Lite ACS is three-axis controlled, momentum stabilized, and is capable of performing sub-arc-minute pointing. The paper first describes the high level requirements which governed the architecture of the SMEX-Lite ACS software. Next, the context in which the software resides is explained. The paper describes the benefits of encapsulation, inheritance and polymorphism with respect to the implementation of an ACS software system. This paper will discuss the design of several software components that comprise the ACS software. Specifically, Object-Oriented designs are presented for sensor data processing, attitude control, attitude determination and failure detection. The paper addresses

  9. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

    This research evaluated the software hazard identification techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for specific purposes. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. Following the evolution of software hazard identification techniques, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)

  10. Software complex "remember me"

    OpenAIRE

    Kosheutova, N. V.; Osina, P. M.

    2016-01-01

    The article describes the importance of time management and effective planning in modern society and is devoted to the development of an Android OS application. It points out the main features of a mobile application such as cross-platform capability and synchronization. Much attention is given to the software architecture as well as user data protection via password hashing methods.
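    A minimal standard-library sketch of the password-hashing idea mentioned above. The application itself targets Android, so this Python example is only an illustration of salted, iterated hashing and constant-time verification; the function names are hypothetical.

```python
# Illustrative salted password hashing and verification (not the app's code):
# PBKDF2 with a random salt, verified with a constant-time comparison.
import hashlib
import hmac
import os


def hash_password(password: str, iterations: int = 200_000) -> tuple[bytes, bytes]:
    """Return (salt, digest) for storing alongside the user record."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest


def verify_password(password: str, salt: bytes, digest: bytes,
                    iterations: int = 200_000) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, digest)


salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong password", salt, digest))                # False
```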

  11. Implementation Of Carlson Survey Software2009 In Survey Works And Comparison With CDS Software

    Directory of Open Access Journals (Sweden)

    Mohamed Faraj EL Megrahi

    2017-02-01

    Automation is one of the most influential changes that the surveying concept and profession have had to go through. It has taken effect in two major areas: hardware (the instrumentation used in data collection and presentation) and software (the applications used in data processing and manipulation). Automation is largely computer based and, like all such systems, is subject to frequent improvement; this is manifested in new instrumentation models every few years, such as the total station, and in newer versions of software. The software that has the potential to strongly affect survey automation is Carlson Surveying Software. When coupled with a total station for data processing and collection respectively, it is capable of greatly improving productivity while reducing the time and cost required in the long run. However, it is only natural for users to desire competent software and to be able to choose from what is available on the market based on guided research and credible information from previous studies. Such studies not only help in the choice of software but are also useful for testing approaches and recommending improvements, based on advantages and disadvantages, to the manufacturers, supporting advancement in the software industry for better and more comfortable use. The expected outcome of the research is a successful implementation of Carlson Survey 2009 software in survey works and a comparison with other existing software such as Civil Design Software (CDS), highlighting its advantages and disadvantages.

  12. Hooke: an open software platform for force spectroscopy.

    Science.gov (United States)

    Sandal, Massimo; Benedetti, Fabrizio; Brucale, Marco; Gomez-Casado, Alberto; Samorì, Bruno

    2009-06-01

    Hooke is an open source, extensible software intended for analysis of atomic force microscope (AFM)-based single molecule force spectroscopy (SMFS) data. We propose it as a platform on which published and new algorithms for SMFS analysis can be integrated in a standard, open fashion, as a general solution to the current lack of a standard software for SMFS data analysis. Specific features and support for file formats are coded as independent plugins. Any user can code new plugins, extending the software capabilities. Basic automated dataset filtering and semi-automatic analysis facilities are included. Software and documentation are available at (http://code.google.com/p/hooke). Hooke is a free software under the GNU Lesser General Public License.
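    A hedged sketch of the plugin idea described above, not the actual Hooke plugin API: any module placed in a plugins package that exposes a commands dictionary is discovered at start-up and its commands become available to the application. The package name and the commands convention are assumptions for illustration.

```python
# Illustrative plugin discovery (concept only, not Hooke's plugin API):
# every module in the `plugins` package that defines a `commands` dict is
# imported and its entries are merged into one command table.
import importlib
import pkgutil


def load_plugins(package_name: str = "plugins") -> dict:
    """Collect the `commands` dictionaries from every module in the package."""
    commands = {}
    package = importlib.import_module(package_name)
    for module_info in pkgutil.iter_modules(package.__path__):
        module = importlib.import_module(f"{package_name}.{module_info.name}")
        commands.update(getattr(module, "commands", {}))
    return commands


# A plugin module (e.g. plugins/flatten.py) would only need to define:
#     def flatten(curve): ...
#     commands = {"flatten": flatten}
```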

  13. The ESA River & Lake System: Current Capabilities and Future Potential

    DEFF Research Database (Denmark)

    Smith, Richard G.; Salloway, Mark; Berry, Philippa A. M.

    Measuring the earth's river and lake resources using satellite radar altimetry offers a unique global monitoring capability, which complements the detailed measurements made by the steadily decreasing number of in-situ gauges. To exploit this unique remote monitoring capability, a global pilot...

  14. Extracting software static defect models using data mining

    Directory of Open Access Journals (Sweden)

    Ahmed H. Yousef

    2015-03-01

    Large software projects are subject to quality risks of having defective modules that will cause failures during software execution. Several software repositories contain the source code of large projects that are composed of many modules. These software repositories include data for the software metrics of these modules and the defective state of each module. In this paper, a data mining approach is used to identify the attributes that predict the defective state of software modules. A software solution architecture is proposed to convert the extracted knowledge into data mining models that can be integrated with the current software project metrics and bug data in order to enhance the prediction. The results show better prediction capabilities when all the algorithms are combined using weighted votes. When only one individual algorithm is used, the Naïve Bayes algorithm has the best results, followed by the Neural Network and the Decision Tree algorithms.
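    A hedged scikit-learn sketch of the "weighted votes" combination described above, using the three classifier families named in the abstract on toy data; the weights, data set, and parameters are illustrative, not the paper's.

```python
# Illustrative weighted-vote ensemble for defect prediction (toy data, not the
# paper's experiment): Naive Bayes, a neural network, and a decision tree are
# combined with soft voting and per-model weights.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy stand-in for module metrics (features) and defect labels.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("nb", GaussianNB()),
        ("nn", MLPClassifier(max_iter=1000, random_state=0)),
        ("dt", DecisionTreeClassifier(random_state=0)),
    ],
    voting="soft",
    weights=[3, 2, 1],  # e.g. give the strongest single model the largest vote
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```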

  15. Safety critical software development qualification

    International Nuclear Information System (INIS)

    Marron, J. E.

    2006-01-01

    With the increasing use of digital systems in control applications, customers must acquire appropriate expectations for software development and quality assurance procedures. Purchasers and users of digital systems need to understand the benefits to the supplier of effective quality systems. These systems consist not only of procedures but tools that enable automation. Without the use of automation, quality can not be assured. A software and systems quality program starts with the documents you are very familiar with. But these documents must define more than the final system. They must address specific development environment characteristics and testing capabilities. Starting with the RFP, some of the items that should be introduced are Software Configuration Management, regression testing and defect tracking. The digital system customer is in the best position to enforce the use of software and systems quality programs by including them in project requirements as early as the Purchase Order. The customer's understanding of the full scope and implementation of a software quality program is essential to achieving the quality necessary in nuclear projects, and, incidentally, completing those projects on schedule. (authors)

  16. JColorGrid: software for the visualization of biological measurements

    Directory of Open Access Journals (Sweden)

    May Barnaby CH

    2006-04-01

    Abstract. Background: Two-dimensional data colourings are an effective medium by which to represent three-dimensional data in two dimensions. Such "color-grid" representations have found increasing use in the biological sciences (e.g. microarray 'heat maps' and bioactivity data) as they are particularly suited to complex data sets and offer an alternative to the graphical representations included in traditional statistical software packages. The effectiveness of color-grids lies in their graphical design, which introduces a standard for customizable data representation. Currently, software applications capable of generating limited color-grid representations can be found only in advanced statistical packages or custom programs (e.g. micro-array analysis tools), often associated with steep learning curves and requiring expert knowledge. Results: Here we describe JColorGrid, a Java library and platform-independent application that renders color-grid graphics from data. The software can be used as a Java library, as a command-line application, and as a color-grid parameter interface and graphical viewer application. Data, titles, and data labels are input as tab-delimited text files or Microsoft Excel spreadsheets and the color-grid settings are specified through the graphical interface or a text configuration file. JColorGrid allows both user graphical data exploration as well as a means of automatically rendering color-grids from data as part of research pipelines. Conclusion: The program has been tested on Windows, Mac, and Linux operating systems, and the binary executables and source files are available for download at http://jcolorgrid.ucsf.edu.

  17. Seven Processes that Enable NASA Software Engineering Technologies

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software is appraised for the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each process is described along with the group(s) responsible for it.

  18. Small is beautiful: customer driven software development

    DEFF Research Database (Denmark)

    Hansen, Henrik A.B.; Koch, Christian; Pleman, Allan

    1999-01-01

    to develop their software. In small software houses operating in markets with complex products such as ERP (enterprise resource planning) systems, networking is necessary in order to gain the needed knowledge and resources in the production development process. Network is not seen as a magic word but leads......Summary form only given. The topics addressed in this paper is how networking can be used as a way for small software houses to enhances their innovative capabilities by using different kinds of collaboration in order to overcome the problems of lacking knowledge as well as resources in order...

  19. Clinical software for MR imaging system, 4

    International Nuclear Information System (INIS)

    Shimizu, Koji; Kasai, Akira; Okamura, Shoichi

    1992-01-01

    Magnetic resonance imaging continues to elicit new application software through recent technological advances in MR equipment. This paper describes several applications of our newly developed clinical software. The fast SE sequence (RISE) has proved to reduce routine examination time and to improve image quality, and the ultra-fast FE sequence (SMASH) was found to extend diagnostic capabilities in the field of cardiac study. Diffusion/perfusion imaging achieved with our MR system showed significant promise for providing novel information regarding tissue characterization. Furthermore, the image quality and practicality of MR angiography have been improved by advanced imaging sequences and sophisticated post-processing software. (author)

  20. Integrated Software Development System/Higher Order Software Conceptual Description (ISDS/HOS)

    Science.gov (United States)

    1976-11-01

    [Report excerpt; only fragments are recoverable: structured flowchart conventions and design diagram notation (Higher Order Software, Inc., Cambridge, Massachusetts); HIPO diagrams that reference other HIPO diagrams as well as non-HIPO documentation such as flowcharts or decision tables; and the requirement that the syntax be easy to learn, provide the novice with prompting to help avoid classic beginner errors, and offer desirable editing capabilities.]

  1. Software Engineering Issues for Cyber-Physical Systems

    DEFF Research Database (Denmark)

    Al-Jaroodi, Jameela; Mohamed, Nader; Jawhar, Imad

    2016-01-01

    Cyber-Physical Systems (CPS) provide many smart features for enhancing physical processes. These systems are designed with a set of distributed hardware, software, and network components that are embedded in physical systems and environments or attached to humans. Together they function seamlessly ... to offer specific functionalities or features that help enhance human lives, operations or environments. While different CPS components play important roles in a successful CPS development, the software plays the most important role among them. Acquiring and using high quality CPS components is the first ... step; however, designing and implementing the right software to integrate and use them effectively is essential. The software facilitates better interfaces, more control and adds smart services, high flexibility and many other added values and features to the CPS. However, software development for CPS...

  2. Software for Demonstration of Features of Chain Polymerization Processes

    Science.gov (United States)

    Sosnowski, Stanislaw

    2013-01-01

    Free software for the demonstration of the features of homo- and copolymerization processes (free radical, controlled radical, and living) is described. The software is based on the Monte Carlo algorithms and offers insight into the kinetics, molecular weight distribution, and microstructure of the macromolecules formed in those processes. It also…
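
    To illustrate the kind of Monte Carlo treatment described above (this is a generic sketch, not the authors' program), the snippet below grows chains one monomer at a time with a fixed propagation probability and reports the resulting chain-length statistics; the probability and chain count are arbitrary assumptions.

        # Toy kinetic Monte Carlo for free-radical chain growth (illustrative only).
        import random

        P_PROP = 0.995      # assumed probability that a growing chain adds a monomer
        N_CHAINS = 100_000  # number of chains to simulate

        lengths = []
        for _ in range(N_CHAINS):
            n = 1
            while random.random() < P_PROP:   # propagate until termination occurs
                n += 1
            lengths.append(n)

        number_avg = sum(lengths) / len(lengths)                   # Mn (in monomer units)
        weight_avg = sum(n * n for n in lengths) / sum(lengths)    # Mw (in monomer units)
        print(f"Mn = {number_avg:.1f}, Mw = {weight_avg:.1f}, "
              f"PDI = {weight_avg / number_avg:.2f}")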

  3. Development of a software for the curimeter model cdn102

    International Nuclear Information System (INIS)

    Dotres Llera, Armando

    2001-01-01

    The characteristics of the software for the Curimeter Model CD-N102 developed at CEADEN are presented. The software consists of two main parts: a basic software component for the electrometer block and an application software component for a PC. The basic software is totally independent of the PC and performs all the basic functions of the measurement process. The application software is optional and offers a friendlier interface and additional options to the user. Among these are the possibility to keep a statistical record of the measurements in a database, to create labels, and to introduce new isotopes and calibrate them. A more detailed explanation of both software components is given

  4. ORNL's DCAL software package

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    2007-01-01

    Oak Ridge National Laboratory has released its Dose and Risk Calculation software, DCAL. DCAL, developed with the support of the U.S. Environmental Protection Agency, consists of a series of computational modules, driven in either an interactive or a batch mode for computation of dose and risk coefficients from intakes of radionuclides or exposure to radionuclides in environmental media. The software package includes extensive libraries of biokinetic and dosimetric data that represent the current state of the art. The software has unique capability for addressing intakes of radionuclides by non-adults. DCAL runs as 32-bit extended DOS and console applications under Windows 98/NT/2000/XP. It is intended for users familiar with the basic elements of computational radiation dosimetry. Components of DCAL have been used to prepare U.S. Environmental Protection Agency's Federal Guidance Reports 12 and 13 and several publications of the International Commission on Radiological Protection. (author)

  5. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development
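
    The data-driven idea, changing automated sequences and parameters without recompiling the flight software, can be sketched generically as reading the sequence definition from a data file at run time; the file name and field layout below are invented for illustration and are not the Orion database tool's actual format.

        # Load an automated activity sequence from a data file instead of hard-coding it.
        import json

        # sequence.json (hypothetical) might contain:
        # {"activities": [{"name": "coast", "duration_s": 120, "gains": {"kp": 0.8}},
        #                 {"name": "burn",  "duration_s": 30,  "gains": {"kp": 1.2}}]}
        with open("sequence.json") as fh:
            config = json.load(fh)

        def run_activity(activity):
            # Placeholder for mode logic; real GN&C software would dispatch on the data.
            print(f"{activity['name']}: {activity['duration_s']} s, gains={activity['gains']}")

        for activity in config["activities"]:
            run_activity(activity)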

  6. Containment and surveillance for software

    International Nuclear Information System (INIS)

    Andress, J.C.; Adams, G.N.; Cotton, J.H.

    1993-07-01

    Some operators and state authorities are offering their computer systems, both hardware and software, to be used for safeguards purposes by the International Atomic Energy Agency. Therefore a need exists to develop a method of authenticating the data produced by a computer program before it can be used by the Agency. As part of a complete Computer Systems Authentication (COMSAT) package, a method of software containment and surveillance has been developed to complement existing software authentication techniques. The package is applicable to both operator and Agency-provided systems. A program to demonstrate the principles has been written. With this facility, the Agency will be able to leave unattended software in the field, either to be used by the operator to generate data for inspection on their own computer, or to save an inspector having to re-install inspection-specific software on an Agency computer, in the knowledge that the operation of the protected computer is being continuously monitored. If adopted, either of these uses will enable the Agency to reduce its costs. (Author)
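
    Software authentication of the kind this containment and surveillance package is said to complement is commonly based on comparing cryptographic digests of deployed files against known-good values; the sketch below shows only that general idea and is not the Agency's COMSAT method. The manifest file and file names are hypothetical.

        # Verify that deployed software files still match their recorded SHA-256 digests.
        import hashlib
        import json

        def sha256_of(path, chunk=65536):
            h = hashlib.sha256()
            with open(path, "rb") as fh:
                while block := fh.read(chunk):
                    h.update(block)
            return h.hexdigest()

        # manifest.json (hypothetical): {"inspect.exe": "ab12...", "analysis.dll": "cd34..."}
        with open("manifest.json") as fh:
            manifest = json.load(fh)

        for path, expected in manifest.items():
            status = "OK" if sha256_of(path) == expected else "MODIFIED"
            print(f"{path}: {status}")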

  7. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    Full Text Available In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards such as ISO (International Organization for Standardization) or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which could be used to present an actual picture of the benefits of Software Process Improvement to software development companies. The few tools available to assist in making such predictions are expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. In extension to our previously reported research on Pakistani software development organizations, which quantified the benefits of SDPI (Software Development Process Improvement), this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating the ISF (Ideal Scale Factor) and tuned the COCOMO-II model to provide prediction capability for SDPI benefit-measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations to use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
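
    For context, the COCOMO II Post-Architecture model estimates effort as PM = A x Size^E x (product of effort multipliers), with E = B + 0.01 x (sum of scale factors); the sketch below uses the published nominal calibration (A = 2.94, B = 0.91), while the sample scale-factor and multiplier values are arbitrary illustrations rather than the tuned ISF values derived in the paper.

        # COCOMO II Post-Architecture effort equation (nominal calibration).
        A, B = 2.94, 0.91   # published COCOMO II.2000 constants

        def cocomo2_effort(ksloc, scale_factors, effort_multipliers):
            """Return estimated effort in person-months."""
            e = B + 0.01 * sum(scale_factors)      # exponent from the five scale factors
            em = 1.0
            for m in effort_multipliers:           # product of the effort multipliers
                em *= m
            return A * (ksloc ** e) * em

        # Illustrative inputs only, not data from the study.
        print(round(cocomo2_effort(50, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0, 1.1, 0.9]), 1))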

  8. Writing virtual environments for software visualization

    CERN Document Server

    Jeffery, Clinton

    2015-01-01

    This book describes the software for creating networked, 3D multi-user virtual environments that allow users to create and remotely share visualizations of program behavior. The authors cover the major features of collaborative virtual environments and how to program them in a very high level language, and show how visualization can enable important advances in our ability to understand and reduce the costs of maintaining software. The book also examines the application of popular game-like software technologies.   • Discusses the acquisition of program behavior data to be visualized • Demonstrates the integration of multiple 2D and 3D dynamic views within a 3Dscene • Presents the network messaging capabilities to share those visualizations

  9. 48 CFR 570.303-3 - Late offers, modifications of offers, and withdrawals of offers.

    Science.gov (United States)

    2010-10-01

    Section 570.303-3 of the Federal Acquisition Regulations System (48 CFR), under the contracting procedures for leasehold interests in real property, addresses late offers, modifications of offers, and withdrawals of offers.

  10. The IceCube Data Acquisition Software: Lessons Learned during Distributed, Collaborative, Multi-Disciplined Software Development.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Keith S; Beattie, Keith; Day Ph.D., Christopher; Glowacki, Dave; Hanson Ph.D., Kael; Jacobsen Ph.D., John; McParland, Charles; Patton Ph.D., Simon

    2007-09-21

    In this experiential paper we report on lessons learned during the development of the data acquisition software for the IceCube project - specifically, how to effectively address the unique challenges presented by a distributed, collaborative, multi-institutional, multi-disciplined project such as this. While development progress in software projects is often described solely in terms of technical issues, our experience indicates that non- and quasi-technical interactions play a substantial role in the effectiveness of large software development efforts. These include: selection and management of multiple software development methodologies, the effective use of various collaborative communication tools, project management structure and roles, and the impact and apparent importance of these elements when viewed through the differing perspectives of hardware, software, scientific and project office roles. Even in areas clearly technical in nature, success is still influenced by non-technical issues that can escape close attention. In particular we describe our experiences with software requirements specification, development methodologies and communication tools. We make observations on what tools and techniques have and have not been effective in this geographically dispersed (including the South Pole) collaboration and offer suggestions on how similarly structured future projects may build upon our experiences.

  11. The STARLINK software collection

    Science.gov (United States)

    Penny, A. J.; Wallace, P. T.; Sherman, J. C.; Terret, D. L.

    1993-12-01

    A demonstration will be given of some recent Starlink software. STARLINK is: a network of computers used by UK astronomers; a collection of programs for the calibration and analysis of astronomical data; a team of people giving hardware, software and administrative support. The Starlink Project has been in operation since 1980 to provide UK astronomers with interactive image processing and data reduction facilities. There are now Starlink computer systems at 25 UK locations, serving about 1500 registered users. The Starlink software collection now has about 25 major packages covering a wide range of astronomical data reduction and analysis techniques, as well as many smaller programs and utilities. At the core of most of the packages is a common `software environment', which provides many of the functions which applications need and offers standardized methods of structuring and accessing data. The software environment simplifies programming and support, and makes it easy to use different packages for different stages of the data reduction. Users see a consistent style, and can mix applications without hitting problems of differing data formats. The Project group coordinates the writing and distribution of this software collection, which is Unix based. Outside the UK, Starlink is used at a large number of places, which range from installations at major UK telescopes, which are Starlink-compatible and managed like Starlink sites, to individuals who run only small parts of the Starlink software collection.

  12. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.

    2011-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This presentation specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as an overview of the content of the final report for that internship.

  13. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  14. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning so that the analyst can become aware of why and how results are obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved
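
    The logic-model evaluation that such a tool automates can be illustrated with a very small fault-tree evaluator; the gates, events and probabilities below are invented for the example, and independent basic events are assumed.

        # Tiny fault-tree evaluator: AND/OR gates over independent basic events.
        def evaluate(node, probs):
            if isinstance(node, str):                 # basic event
                return probs[node]
            gate, children = node[0], node[1:]
            p = [evaluate(c, probs) for c in children]
            if gate == "AND":
                out = 1.0
                for x in p:
                    out *= x
                return out
            if gate == "OR":                          # P(A or B) = 1 - prod(1 - Pi)
                out = 1.0
                for x in p:
                    out *= (1.0 - x)
                return 1.0 - out
            raise ValueError(gate)

        top = ("OR", "pump_fails", ("AND", "power_fails", "backup_fails"))
        probs = {"pump_fails": 1e-3, "power_fails": 1e-2, "backup_fails": 5e-2}
        print(f"Top event probability: {evaluate(top, probs):.6f}")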

  15. Open Source Next Generation Visualization Software for Interplanetary Missions

    Science.gov (United States)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  16. Resilience Engineering in Critical Long Term Aerospace Software Systems: A New Approach to Spacecraft Software Safety

    Science.gov (United States)

    Dulo, D. A.

    Safety critical software systems permeate spacecraft, and in a long term venture like a starship would be pervasive in every system of the spacecraft. Yet software failure today continues to plague both the systems and the organizations that develop them resulting in the loss of life, time, money, and valuable system platforms. A starship cannot afford this type of software failure in long journeys away from home. A single software failure could have catastrophic results for the spaceship and the crew onboard. This paper will offer a new approach to developing safe reliable software systems through focusing not on the traditional safety/reliability engineering paradigms but rather by focusing on a new paradigm: Resilience and Failure Obviation Engineering. The foremost objective of this approach is the obviation of failure, coupled with the ability of a software system to prevent or adapt to complex changing conditions in real time as a safety valve should failure occur to ensure safe system continuity. Through this approach, safety is ensured through foresight to anticipate failure and to adapt to risk in real time before failure occurs. In a starship, this type of software engineering is vital. Through software developed in a resilient manner, a starship would have reduced or eliminated software failure, and would have the ability to rapidly adapt should a software system become unstable or unsafe. As a result, long term software safety, reliability, and resilience would be present for a successful long term starship mission.

  17. Fixed-Wing Micro Air Vehicles with Hovering Capabilities

    National Research Council Canada - National Science Library

    Bataille, Boris; Poinsot, Damien; Thipyopas, Chinnapat; Moschetta, Jean-Marc

    2007-01-01

    Fixed-wing micro air vehicles (MAV) are very attractive for outdoor surveillance missions since they generally offer better payload and endurance capabilities than rotorcraft or flapping-wing vehicles of equal size...

  18. Design to Process Capabilities: challenges for the use of Process Capability Databases (PCDBs) in development

    DEFF Research Database (Denmark)

    Eifler, Tobias; Göhler, Simon Moritz; Howard, Thomas J.

    2014-01-01

    ... and Maiti (2012), Breyfogle (2003)]. At the same time, information on the achievable manufacturing accuracy or the supplier's performance is usually inaccurate and largely qualitative in early development stages. Design decisions as well as the choice of manufacturing processes, therefore, often rely ... capabilities may lead to low yields and a cost/time overrun, while conservatively underestimated capabilities affect quality through the reduced design space, or through increased play, rattle/noise, size or weight. A possibility to overcome the subjective assessment of variation in development projects is a Process Capability Data Base (PCDB), offering valuable insight into the actual or expected performance of production processes (Tata and Thornton, 1999). But although the potential benefits as well as initial challenges for the use of PCDBs have been addressed in earlier research, e.g. by Delaney and Phelan (2008...

  19. From On-Premise Software to Cloud Services: The Impact of Cloud Computing on Enterprise Software Vendors' Business Models

    OpenAIRE

    Boillat, Thomas; Legner, Christine

    2013-01-01

    Cloud computing is an emerging paradigm that allows users to conveniently access computing resources as pay-per-use services. Whereas cloud offerings such as Amazon's Elastic Compute Cloud and Google Apps are rapidly gaining a large user base, enterprise software's migration towards the cloud is still in its infancy. For software vendors the move towards cloud solutions implies profound changes in their value-creation logic. Not only are they forced to deliver fully web-enabled solutions and t...

  20. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    Science.gov (United States)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  1. Softwareland Chronicles: A Software Development Meta-Process Proposal

    Directory of Open Access Journals (Sweden)

    Bolanos Sandro

    2016-05-01

    Full Text Available This paper presents the software development meta-process (SD-MP) as a proposal to set up software projects. Within this proposal we offer conceptual elements that help solve the war of methodologies and processes in favor of an integrating viewpoint, where the main flaws associated with conventional and agile approaches are removed. Our newly developed software platform to support the meta-process is also presented together with three case studies involving projects currently in progress, where the framework proposed in SD-MP has been applied.

  2. Software Analysis of New Space Gravity Data for Geophysics and Climate Research

    Science.gov (United States)

    Deese, Rupert; Ivins, Erik R.; Fielding, Eric J.

    2012-01-01

    Both the Gravity Recovery and Climate Experiment (GRACE) and Gravity field and steady-state Ocean Circulation Explorer (GOCE) satellites are returning rich data for the study of the solid earth, the oceans, and the climate. Current software analysis tools do not provide researchers with the ease and flexibility required to make full use of this data. We evaluate the capabilities and shortcomings of existing software tools including Mathematica, the GOCE User Toolbox, the web server of the ICGEM (International Center for Global Earth Models), and Tesseroids. Using existing tools as necessary, we design and implement software with the capability to produce gridded data and publication quality renderings from raw gravity data. The straightforward software interface marks an improvement over previously existing tools and makes new space gravity data more useful to researchers. Using the software we calculate Bouguer anomalies of the gravity tensor's vertical component in the Gulf of Mexico, Antarctica, and the 2010 Maule earthquake region. These maps identify promising areas of future research.

  3. Collaborative environments for capability-based planning

    Science.gov (United States)

    McQuay, William K.

    2005-05-01

    Distributed collaboration is an emerging technology for the 21st century that will significantly change how business is conducted in the defense and commercial sectors. Collaboration involves two or more geographically dispersed entities working together to create a "product" by sharing and exchanging data, information, and knowledge. A product is defined broadly to include, for example, writing a report, creating software, designing hardware, or implementing robust systems engineering and capability planning processes in an organization. Collaborative environments provide the framework and integrate models, simulations, domain specific tools, and virtual test beds to facilitate collaboration between the multiple disciplines needed in the enterprise. The Air Force Research Laboratory (AFRL) is conducting a leading edge program in developing distributed collaborative technologies targeted to the Air Force's implementation of systems engineering for a simulation-aided acquisition and capability-based planning. The research is focusing on the open systems agent-based framework, product and process modeling, structural architecture, and the integration technologies - the glue to integrate the software components. In the past four years, two live assessment events have been conducted to demonstrate the technology in support of research for the Air Force Agile Acquisition initiatives. The AFRL Collaborative Environment concept will foster a major cultural change in how the acquisition, training, and operational communities conduct business.

  4. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  5. The Software Bus, an Object-Oriented Data Exchange System

    International Nuclear Information System (INIS)

    Akerbaek, T.; Louka, M.

    1996-01-01

    This document describes the Software Bus System, developed for object-oriented task to task communication in a TCP/IP based network. The Software Bus is a set of library functions, developed to be used for the Picasso-3 UIMS, and as a general purpose tool for dynamically interfacing programs at run-time. The Software Bus offers a high level object-oriented data exchange mechanism that relieves the application programmer of the low level TCP/IP-programming and communication protocol handling. The Software Bus is currently available under several UNIX platforms and a version for Windows NT is planned for late 1996. (author)
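
    To make the idea of a high-level, object-oriented data exchange over TCP/IP concrete, the sketch below hides the socket and protocol details behind two helper calls that exchange length-prefixed JSON messages; it is an illustration of the concept only, not the Software Bus API, and the host and port are placeholders.

        # Minimal object exchange over TCP: length-prefixed JSON messages (illustrative).
        import json
        import socket
        import struct

        def send_object(sock, obj):
            data = json.dumps(obj).encode()
            sock.sendall(struct.pack("!I", len(data)) + data)   # 4-byte length prefix

        def recv_object(sock):
            (length,) = struct.unpack("!I", sock.recv(4))
            data = b""
            while len(data) < length:                            # read until complete
                data += sock.recv(length - len(data))
            return json.loads(data.decode())

        # Sender side; a receiver would accept a connection and call recv_object.
        with socket.create_connection(("localhost", 5000)) as sock:   # hypothetical endpoint
            send_object(sock, {"type": "measurement", "value": 42.0})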

  6. A model-based software development methodology for high-end automotive components

    NARCIS (Netherlands)

    Ravanan, Mahmoud

    2014-01-01

    This report provides a model-based software development methodology for high-end automotive components. The V-model is used as a process model throughout the development of the software platform. It offers a framework that simplifies the relation between requirements, design, implementation,

  7. Engineering Play: Children's Software and the Cultural Politics of Edutainment

    Science.gov (United States)

    Ito, Mizuko

    2006-01-01

    The late 1980s saw the emergence of a new genre of instructional media, "edutainment", which utilized the capabilities of multimedia personal computers to animate software designed to both educate and entertain young children. This paper describes the production of, marketing of and play with edutainment software as a contemporary example of…

  8. Lecture 2: Software Security

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    Computer security has been an increasing concern for IT professionals for a number of years, yet despite all the efforts, computer systems and networks remain highly vulnerable to attacks of different kinds. Design flaws and security bugs in the underlying software are among the main reasons for this. This lecture addresses the following question: how to create secure software? The lecture starts with a definition of computer security and an explanation of why it is so difficult to achieve. It then introduces the main security principles (like least-privilege, or defense-in-depth) and discusses security in different phases of the software development cycle. The emphasis is put on the implementation part: most common pitfalls and security bugs are listed, followed by advice on best practice for security development, testing and deployment. Sebastian Lopienski is CERN’s deputy Computer Security Officer. He works on security strategy and policies; offers internal consultancy and audit services; develops and ...
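
    One of the classic implementation pitfalls covered in lectures of this kind is SQL injection; the sketch below contrasts an unsafe string-built query with a parameterised one using Python's standard sqlite3 module (the table, column and sample values are invented for the example).

        # SQL injection: string-built queries versus parameterised queries.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
        conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

        user_input = "alice' OR '1'='1"   # attacker-controlled value

        # Vulnerable: the input becomes part of the SQL text and changes its meaning.
        unsafe = conn.execute(
            f"SELECT secret FROM users WHERE name = '{user_input}'").fetchall()

        # Safe: the driver passes the value separately from the SQL text.
        safe = conn.execute(
            "SELECT secret FROM users WHERE name = ?", (user_input,)).fetchall()

        print("unsafe query returned:", unsafe)   # leaks every row
        print("safe query returned:  ", safe)     # returns nothing for this input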

  9. A real-time GNSS-R system based on software-defined radio and graphics processing units

    Science.gov (United States)

    Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki

    2012-04-01

    Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites that broadcast their L-band signals. To date, most GNSS-R receiver architectures have been based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and enabled signal processing in real-time, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massive parallel computing performance, allow the whole signal processing chain to be handled without interfering with the PC's CPU. Thus, this paper describes a GNSS-R system which has been developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests which confirm the anticipated capability of the system.

  10. Portability and the National Energy Software Center

    International Nuclear Information System (INIS)

    Butler, M.K.

    1978-01-01

    The software portability problem is examined from the viewpoint of experience gained in the operation of a software exchange and information center. First, the factors contributing to the program interchange to date are identified, then major problem areas remaining are noted. The import of the development of programming language and documentation standards is noted, and the program packaging procedures and dissemination practices employed by the Center to facilitate successful software transport are described. Organization, or installation, dependencies of the computing environment, often hidden from the program author, and data interchange complexities are seen as today's primary issues, with dedicated processors and network communications offering an alternative solution

  11. Software for Evaluation of Conceptual Design

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    1998-01-01

    by the prototype, it addresses the requirements that the methods imply, and it explains the actual implementation of the prototype. Finally, it discusses what has been learned from developing and testing the prototype. In this paper it is suggested that a software tool which supports evaluation of design can ... be developed with a limited effort, and that such tools could support a structured evaluation process as opposed to no evaluation. Compared to manual evaluation, the introduced software-based evaluation tool offers automation of tasks, such as performing assessments when they are based on prior evaluations...

  12. Comparing internal and external run-time coupling of CFD and building energy simulation software

    NARCIS (Netherlands)

    Djunaedy, E.; Hensen, J.L.M.; Loomans, M.G.L.C.

    2004-01-01

    This paper describes a comparison between internal and external run-time coupling of CFD and building energy simulation software. Internal coupling can be seen as the "traditional" way of developing software, i.e. the capabilities of existing software are expanded by merging codes. With external

  13. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    Full Text Available This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, tasks scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  14. COMPARISON OF UAS-BASED PHOTOGRAMMETRY SOFTWARE FOR 3D POINT CLOUD GENERATION: A SURVEY OVER A HISTORICAL SITE

    Directory of Open Access Journals (Sweden)

    F. Alidoost

    2017-11-01

    Full Text Available Nowadays, Unmanned Aerial System (UAS)-based photogrammetry offers an affordable, fast and effective approach to real-time acquisition of high resolution geospatial information and automatic 3D modelling of objects for numerous applications such as topography mapping, 3D city modelling, orthophoto generation, and cultural heritage preservation. In this paper, the capability of four different state-of-the-art software packages, namely 3DSurvey, Agisoft Photoscan, Pix4Dmapper Pro and SURE, is examined to generate a high-density point cloud as well as a Digital Surface Model (DSM) over a historical site. The main steps of this study include image acquisition, point cloud generation, and accuracy assessment. The overlapping images are first captured using a quadcopter and next are processed by the different software packages to generate point clouds and DSMs. In order to evaluate the accuracy and quality of the point clouds and DSMs, both visual and geometric assessments are carried out and the comparison results are reported.

  15. Comparison of Uas-Based Photogrammetry Software for 3d Point Cloud Generation: a Survey Over a Historical Site

    Science.gov (United States)

    Alidoost, F.; Arefi, H.

    2017-11-01

    Nowadays, Unmanned Aerial System (UAS)-based photogrammetry offers an affordable, fast and effective approach to real-time acquisition of high resolution geospatial information and automatic 3D modelling of objects for numerous applications such as topography mapping, 3D city modelling, orthophoto generation, and cultural heritage preservation. In this paper, the capability of four different state-of-the-art software packages, namely 3DSurvey, Agisoft Photoscan, Pix4Dmapper Pro and SURE, is examined to generate a high-density point cloud as well as a Digital Surface Model (DSM) over a historical site. The main steps of this study include image acquisition, point cloud generation, and accuracy assessment. The overlapping images are first captured using a quadcopter and next are processed by the different software packages to generate point clouds and DSMs. In order to evaluate the accuracy and quality of the point clouds and DSMs, both visual and geometric assessments are carried out and the comparison results are reported.

  16. Steady-state capabilities for hydroturbines with OpenFOAM

    Science.gov (United States)

    Page, M.; Beaudoin, M.; Giroux, A. M.

    2010-08-01

    The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R&D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Québec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities are demonstrated for the analysis of a 195-MW Francis hydroturbine.

  17. Steady-state capabilities for hydroturbines with OpenFOAM

    Energy Technology Data Exchange (ETDEWEB)

    Page, M; Beaudoin, M; Giroux, A M, E-mail: page.maryse@ireq.c [Hydro-Quebec, Institut de recherche 1800 Lionel-Boulet, Varennes, Quebec J3X 1S1 (Canada)

    2010-08-15

    The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R and D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Quebec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities are demonstrated for the analysis of a 195-MW Francis hydroturbine.

  18. Steady-state capabilities for hydroturbines with OpenFOAM

    International Nuclear Information System (INIS)

    Page, M; Beaudoin, M; Giroux, A M

    2010-01-01

    The availability of a high quality Open Source CFD simulation platform like OpenFOAM offers new R and D opportunities by providing direct access to models and solver implementation details. Efforts have been made by Hydro-Quebec to adapt OpenFOAM to hydroturbines for the development of steady-state capabilities. The paper describes the developments that have been made to implement new turbomachinery related capabilities: Multiple Frame of Reference solver, domain coupling interfaces (GGI, cyclicGGI and mixing plane) and specialized boundary conditions. Practical use of the new turbomachinery capabilities are demonstrated for the analysis of a 195-MW Francis hydroturbine.

  19. Coupling methodology within the software platform alliances

    Energy Technology Data Exchange (ETDEWEB)

    Montarnal, Ph; Deville, E; Adam, E; Bengaouer, A [CEA Saclay, Dept. de Modelisation des Systemes et Structures 91 - Gif-sur-Yvette (France); Dimier, A; Gaombalet, J; Loth, L [Agence Nationale pour la Gestion des Dechets Radioactifs (ANDRA), 92 - Chatenay Malabry (France); Chavant, C [Electricite de France (EDF), 92 - Clamart (France)

    2005-07-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES, whose aim is to produce a tool for the simulation of nuclear waste storage and disposal repositories. This type of simulation deals with highly coupled thermo-hydro-mechanical and chemical (T-H-M-C) processes. A key objective of Alliances is to provide the capability to develop coupling algorithms between existing codes. The aim of this paper is to present the coupling methodology used in the context of this software platform. (author)

  20. Coupling methodology within the software platform alliances

    International Nuclear Information System (INIS)

    Montarnal, Ph.; Deville, E.; Adam, E.; Bengaouer, A.; Dimier, A.; Gaombalet, J.; Loth, L.; Chavant, C.

    2005-01-01

    CEA, ANDRA and EDF are jointly developing the software platform ALLIANCES, whose aim is to produce a tool for the simulation of nuclear waste storage and disposal repositories. This type of simulation deals with highly coupled thermo-hydro-mechanical and chemical (T-H-M-C) processes. A key objective of Alliances is to provide the capability to develop coupling algorithms between existing codes. The aim of this paper is to present the coupling methodology used in the context of this software platform. (author)

  1. Considerations of the Software Metric-based Methodology for Software Reliability Assessment in Digital I and C Systems

    International Nuclear Information System (INIS)

    Ha, J. H.; Kim, M. K.; Chung, B. S.; Oh, H. C.; Seo, M. R.

    2007-01-01

    Analog I and C systems have been replaced by digital I and C systems because the digital systems have many potential benefits to nuclear power plants in terms of operational and safety performance. For example, digital systems are essentially free of drifts, have higher data handling and storage capabilities, and provide improved performance through accuracy and computational capabilities. In addition, analog replacement parts become more difficult to obtain since they are obsolete and discontinued. There are, however, challenges to the introduction of digital technology into nuclear power plants because digital systems are more complex than analog systems and their operation and failure modes are different. In particular, software, which can be the core of functionality in digital systems, does not wear out physically like hardware, and its failure modes are not yet clearly defined. Thus, research efforts to develop methodologies for software reliability assessment are still ongoing in safety-critical areas such as nuclear systems, aerospace and medical devices. Among them, a software metric-based methodology has been considered for the digital I and C systems of Korean nuclear power plants. Advantages and limitations of that methodology are identified and requirements for its application to the digital I and C systems are considered in this study

  2. Advanced Visualization Software System for Nuclear Power Plant Inspection

    International Nuclear Information System (INIS)

    Kukic, I.; Jambresic, D.; Reskovic, S.

    2006-01-01

    Visualization techniques have been widely used in industrial environment for enhancing process control. Traditional techniques of visualization are based on control panels with switches and lights, and 2D graphic representations of processes. However, modern visualization systems enable significant new opportunities in creating 3D virtual environments. These opportunities arise from the availability of high end graphics capabilities in low cost personal computers. In this paper we describe implementation of process visualization software, developed by INETEC. This software is used to visualize testing equipment, components being tested and the overall power plant inspection process. It improves security of the process due to its real-time visualization and collision detection capabilities, and therefore greatly enhances the inspection process. (author)

  3. Integrating and Managing Bim in GIS, Software Review

    Science.gov (United States)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have been increasingly leveraging these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS software packages that provide tools to integrate building information in a geospatial context has risen sharply. More and more GIS packages are adding tools for this purpose, and other software projects are regularly extending these tools. However, each package has its own strengths, weaknesses and intended use. This paper provides a thorough review to investigate these software capabilities and clarify their purpose. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and the GIS software was then used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, which formats are supported (import/export), and the way building information is imported.

  4. Alinhando objetivos estratégicos e processo de desenvolvimento em empresas de software Aligning strategic objectives and development practices at software companies

    Directory of Open Access Journals (Sweden)

    André Leme Fleury

    2013-01-01

    Full Text Available Software systems are currently responsible for a significant part of the technological innovations enabled in products and services. The history of software is recent, as is research on the subject, whose main focus is the improvement of software development processes. Despite the contributions of these theories, the question of the alignment between strategic objectives and the development process in software companies, considering their main productive capabilities, remains unexplored. As a consequence, software engineering techniques are applied without incorporating value considerations into analysis and decision making. This paper analyses and presents solutions for two issues: how to differentiate software companies according to their most relevant development processes, and how to ensure that these development processes are aligned with business objectives. The resulting approach includes a framework for classifying software companies and a technology-planning technique for these companies. The research design included surveys, action research and case studies.

  5. Dynamic visualization techniques for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  6. Software Assessment of the Global Force Management (GFM) Search Capability Study

    Science.gov (United States)

    2017-02-01

    Fragments only: the available text consists of table-of-contents and configuration excerpts covering MOS capability vector work, schema and database generation, and algorithm descriptions, including a Java API (call: GenerateAll.run) that resolves unit echelons (a battalion, for example) and makes sure the results have a known name (not '%NKN'), together with database connection properties such as databaseUsername, databasePassword and appDatabaseName=Simple_GFM.

  7. Advanced software development workstation project: Engineering scripting language. Graphical editor

    Science.gov (United States)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.

  8. Space Logistics: Launch Capabilities

    Science.gov (United States)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capabilities for the United States are shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong National need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by mid to late 1990's: (1) Shuttle-C for near term (include growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low end lift capability equivalent to Titan IV, and a high end lift capability greater than the Soviet Energia if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.

  9. Intercomparison of PIXE spectrometry software packages

    International Nuclear Information System (INIS)

    2003-02-01

    During the year 2000, an exercise was organized to make an intercomparison of widely available software packages for analysis of particle-induced X-ray emission (PIXE) spectra. This TECDOC describes the method used in this intercomparison exercise and presents the results obtained. It also gives a general overview of the participating software packages. This includes basic information on their user interface, graphical presentation capabilities, physical phenomena taken into account, way of presenting results, etc. No recommendation for a particular software package or method for spectrum analysis is given. It is intended that the readers reach their own conclusions and make their own choices, according to their specific needs. This TECDOC will be useful to anyone involved in PIXE spectrum analysis. This TECDOC includes a companion CD with the complete set of test spectra used for intercomparison. The test spectra on this CD can be used to test any PIXE spectral analysis software package

  10. REVEAL: Software Documentation and Platform Migration

    Science.gov (United States)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validates the efforts.

  11. User Interface Design for Dynamic Geometry Software

    Science.gov (United States)

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  12. software for natural gas pipeline design and simulation

    African Journals Online (AJOL)

    Global Journal

    2017-01-17

    Jan 17, 2017 ... investment and operating cost required for natural gas pipeline transmission ... In the early development of the natural gas transmission industry, pressures were low and ..... The software has an error control capability in.

  13. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people for which we use bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
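
    A building block of such continuous verification is an automated test that compares a numerical result against an analytical solution; the sketch below shows a generic test of this kind (it is not one of Fluidity's actual tests) that can be run standalone or collected by pytest.

        # Regression-style verification test: numerical result vs analytical solution.
        import math

        def integrate_decay(y0, k, t_end, dt):
            """Explicit Euler integration of dy/dt = -k*y."""
            y, t = y0, 0.0
            while t < t_end - 1e-12:
                y += dt * (-k * y)
                t += dt
            return y

        def test_decay_matches_analytical_solution():
            numerical = integrate_decay(y0=1.0, k=2.0, t_end=1.0, dt=1e-4)
            analytical = math.exp(-2.0)
            # Tolerance chosen to match the first-order accuracy of explicit Euler.
            assert abs(numerical - analytical) < 1e-3

        if __name__ == "__main__":
            test_decay_matches_analytical_solution()
            print("verification test passed")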

  14. Software V ampersand V methods for digital plant protection system

    International Nuclear Information System (INIS)

    Kim, Hung-Jun; Han, Jai-Bok; Chun, Chong-Son; Kim, Sung; Kim, Kern-Joong.

    1997-01-01

    Careful thought must be given to software design in the development of digital-based systems that play a critical role in the successful operation of nuclear power plants. To evaluate the software verification and validation methods, as well as to verify system performance capabilities for the upgraded instrumentation and control system in future Korean nuclear power plants, the prototype Digital Plant Protection System (DPPS) based on the Programmable Logic Controller (PLC) has been constructed. The system design description and features are briefly presented, and the software design and the software verification and validation methods are the focus. 6 refs., 2 figs

  15. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.
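
    As a rough illustration of the core computation this record describes - the 2-dimensional spatial autocorrelation of an interrogation subregion, whose off-centre peak encodes the particle-image displacement - the following minimal NumPy sketch uses the Wiener-Khinchin relation (autocorrelation via the FFT). It is not the original camera/array-processor pipeline; the window size, shift, and random test image are hypothetical.

    ```python
    import numpy as np

    def spatial_autocorrelation(window):
        """2-D spatial autocorrelation of a subregion via the Wiener-Khinchin
        theorem: ACF = IFFT(|FFT(window)|^2), with the zero-lag peak centred."""
        w = window - window.mean()                  # remove the DC component
        acf = np.fft.ifft2(np.abs(np.fft.fft2(w)) ** 2).real
        return np.fft.fftshift(acf)

    def displacement_from_acf(acf, exclude=2):
        """Locate the strongest off-centre peak; its offset from the centre
        approximates the particle-image displacement in pixels."""
        cy, cx = acf.shape[0] // 2, acf.shape[1] // 2
        masked = acf.copy()
        masked[cy - exclude:cy + exclude + 1, cx - exclude:cx + exclude + 1] = -np.inf
        py, px = np.unravel_index(np.argmax(masked), masked.shape)
        return py - cy, px - cx

    # Hypothetical 64x64 subregion of a double-exposure recording, second exposure shifted by (3, 5)
    rng = np.random.default_rng(0)
    frame = rng.random((64, 64))
    window = frame + np.roll(frame, (3, 5), axis=(0, 1))
    print(displacement_from_acf(spatial_autocorrelation(window)))   # ~(3, 5) or (-3, -5)
    ```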

  16. Real-time software for the fusion experiment WENDELSTEIN 7-X

    International Nuclear Information System (INIS)

    Laqua, Heike; Niedermeyer, Helmut; Schacht, Joerg; Spring, Anett

    2006-01-01

    The superconducting stellarator WENDELSTEIN 7-X will be capable of steady-state operation as well as pulsed operation. All discharge scenarios compatible with these capabilities will be supported by the control system. Each technical component and each diagnostic system will have its own control system, based on a real-time computer with the dedicated software described here, permitting autonomous operation for commissioning and testing and coordinated operation during experimental sessions. The system behaviour as far as it is relevant for the experiment, like parameters and algorithms, will be exclusively controlled by complex software objects. By changing references to these objects synchronously in all computers the whole system behaviour can be changed from one cycle to the next. All data required for the construction of the software objects will be stored in one central database and constructed in the control computers well before they are required

  17. Using software interoperability to achieve a virtual design environment

    Science.gov (United States)

    Gregory, G. Groot; Koshel, R. John

    2005-09-01

    A variety of simulation tools, including optical design and analysis, have benefited from many years of evolution in software functionality and computing power, thus making the notion of virtual design environments a reality. To simulate the optical characteristics of a system, one needs to include optical performance, mechanical design and manufacturing aspects simultaneously. To date, no single software program offers a universal solution. One approach to achieve an integrated environment is to select tools that offer a high degree of interoperability. This allows the selection of the best tools for each aspect of the design working in concert to solve the problem. This paper discusses the issues of how to assemble a design environment and provides an example of a combination of tools for illumination design. We begin by offering a broad definition of interoperability from an optical analysis perspective. This definition includes aspects of file interchange formats, software communications protocols and customized applications. One example solution is proposed by combining SolidWorks for computer-aided design (CAD), TracePro for optical analysis and MATLAB as the mathematical engine for tolerance analysis. The resulting virtual tool will be applied to a lightpipe design task to illustrate how such a system can be used.

  18. Student perceptions of drill-and-practice mathematics software in primary education

    NARCIS (Netherlands)

    Kuiper, E.; de Pater-Sneep, M.

    2014-01-01

    Drill-and-practice mathematics software offers teachers a relatively simple way to use technology in the classroom. One of the reasons to use the software may be that it motivates children, working on the computer being more "fun" than doing regular school work. However, students’ own perceptions of

  19. Software development with C++ maximizing reuse with object technology

    CERN Document Server

    Nielsen, Kjell

    2014-01-01

    Software Development with C++: Maximizing Reuse with Object Technology is about software development and object-oriented technology (OT), with applications implemented in C++. The basis for any software development project of complex systems is the process, rather than an individual method, which simply supports the overall process. This book is not intended as a general, all-encompassing treatise on OT. The intent is to provide practical information that is directly applicable to a development project. Explicit guidelines are offered for the infusion of OT into the various development phases.

  20. A Study on the Software Quality Assurance Plan

    International Nuclear Information System (INIS)

    Kim, Hyun Tae

    2006-01-01

    On 25 August 2006, the CMMI V1.2 (Capability Maturity Model Integration Version 1.2) was released with the new title CMMI-DEV (CMMI for Development) which supersedes the CMMI-SE/SW (CMMI for systems engineering and software engineering) V1.1. This study discusses the application of IEEE Std 730-2002, IEEE Standard for Software Quality Assurance Plans, for the implementation of the Process and Product Quality Assurance (PPQA) process area (PA) of the CMMI-DEV

  1. The intersection of software and strengths: Using internet technology and case management software to assist Strength-Based Practice.

    Science.gov (United States)

    Clark, Michael D; Brien, Dale W

    2016-01-01

    The focus of this investigation is the helping professionals working within American Indian and Alaska Native (AI/AN) communities. This article looks at how innovative technology-in the form of automated case management software and Internet connectivity-can assist effective implementation of Strength-based Practice and agency services within tribal courts and the many other helping agencies that serve AI/AN populations. We seek to expand practice knowledge by reviewing the benefits that this software and Internet connectivity can offer to agency operations and exploring how they can assist case management services.

  2. Software Defined Radio: Basic Principles and Applications

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2014-12-01

    The author reviews SDR (Software Defined Radio) technology, including hardware schemes and application fields. A low-performance device is presented and several tests are executed with it using free software. With the acquired experience, SDR employment opportunities are identified for low-cost solutions that can solve significant problems. In addition, a list of the most important frameworks related to the technology developed in recent years is offered, recommending the use of three of them.

  3. Innovation Initiatives in Large Software Companies

    DEFF Research Database (Denmark)

    Edison, Henry; Wang, Xiaofeng; Jabangwe, Ronald

    2018-01-01

    Context: To keep the competitive advantage and adapt to changes in the market and technology, companies need to innovate in an organised, purposeful and systematic manner. However, due to their size and complexity, large companies tend to focus on the structure in maintaining their business, which ... can potentially lower their agility to innovate. Objective: The aims of this study are to provide an overview of the current research on innovation initiatives and to identify the challenges of implementing those initiatives in the context of large software companies. Method: The investigation ... empirical studies on innovation initiative in the context of large software companies. A total of 7 studies are conducted in the context of large software companies, which reported 5 types of initiatives: intrapreneurship, bootlegging, internal venture, spin-off and crowdsourcing. Our study offers three ...

  4. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  5. Standardising Software Processes - An Obstacle for Innovation?

    DEFF Research Database (Denmark)

    Aaen, Ivan; Pries-Heje, Jan

    2004-01-01

    Over the last 10 years CMM has achieved widespread use as a model for improving software organisations. Often CMM is used to standardise software processes across projects. In this paper we discuss this standardisation of SPI in relation to innovation, organisational size and company growth. Our ... discussion is empirically based on years of work and experience with companies on SPI. In concrete terms, our discussion is enhanced by vignette stories taken from our experience. As a result we find that standardisation focussing on process, metrics, and controls may jeopardize innovative capabilities...

  6. Reactor physics verification of the MCNP6 unstructured mesh capability

    International Nuclear Information System (INIS)

    Burke, T. P.; Kiedrowski, B. C.; Martz, R. L.; Martin, W. R.

    2013-01-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  7. Reactor physics verification of the MCNP6 unstructured mesh capability

    Energy Technology Data Exchange (ETDEWEB)

    Burke, T. P. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States); Kiedrowski, B. C.; Martz, R. L. [X-Computational Physics Division, Monte Carlo Codes Group, Los Alamos National Laboratory, P.O. Box 1663, Los Alamos, NM 87545 (United States); Martin, W. R. [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, 2355 Bonisteel Boulevard, Ann Arbor, MI 48109 (United States)

    2013-07-01

    The Monte Carlo software package MCNP6 has the ability to transport particles on unstructured meshes generated from the Computer-Aided Engineering software Abaqus. Verification is performed using benchmarks with features relevant to reactor physics - Big Ten and the C5G7 computational benchmark. Various meshing strategies are tested and results are compared to reference solutions. Computational performance results are also given. The conclusions show MCNP6 is capable of producing accurate calculations for reactor physics geometries and the computational requirements for small lattice benchmarks are reasonable on modern computing platforms. (authors)

  8. Development of data acquisition and analysis software for multichannel detectors

    International Nuclear Information System (INIS)

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, an easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and efficiency of data analysis. 2 refs., 6 figs

  9. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  10. Applications of ATILA FEM software to smart materials case studies in designing devices

    CERN Document Server

    Uchino, Kenji

    2013-01-01

    ATILA Finite Element Method (FEM) software facilitates the modelling and analysis of applications using piezoelectric, magnetostrictive and shape-memory materials. It allows entire designs to be constructed, refined and optimized before production begins. Through a range of instructive case studies, Applications of ATILA FEM software to smart materials provides an indispensable guide to the use of this software in the design of effective products. Part one provides an introduction to ATILA FEM software, beginning with an overview of the software code. New capabilities and loss integratio

  11. Software life cycle management standards real-world solutions and scenarios for savings

    CERN Document Server

    Wright, David

    2011-01-01

    Software Life Cycle Management Standards will help you apply ISO/IEC 19770 to your business and enjoy the rewards it offers. David Wright calls on his vast experience to explain how the Standard applies to the whole of the software life cycle, not just the software asset management aspects. His informative guide gives up-to-date information using practical examples, clear diagrams and entertaining anecdotes.

  12. Assessment of CFD capability for prediction of hypersonic shock interactions

    Science.gov (United States)

    Knight, Doyle; Longo, José; Drikakis, Dimitris; Gaitonde, Datta; Lani, Andrea; Nompelis, Ioannis; Reimann, Bodo; Walpot, Louis

    2012-01-01

    The aerothermodynamic loadings associated with shock wave boundary layer interactions (shock interactions) must be carefully considered in the design of hypersonic air vehicles. The capability of Computational Fluid Dynamics (CFD) software to accurately predict hypersonic shock wave laminar boundary layer interactions is examined. A series of independent computations performed by researchers in the US and Europe are presented for two generic configurations (double cone and cylinder) and compared with experimental data. The results illustrate the current capabilities and limitations of modern CFD methods for these flows.

  13. Software to model AXAF-I image quality

    Science.gov (United States)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  14. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at a source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interface and mathematical function libraries. Both development environments are multiplatform oriented, and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis, developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive, graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language, which supports designing and developing performant and interactive frameworks for general purpose software solutions, through Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability that refer to integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  15. Lean principles applied to software development – avoiding waste

    Directory of Open Access Journals (Sweden)

    Ionel NAFTANAILA

    2009-12-01

    Under the current economic conditions many organizations strive to continue the trend towards adopting better software development processes, in order to take advantage of the numerous benefits that these can offer. Those benefits include quicker return on investment, better software quality, and higher customer satisfaction. To date, however, there is little body of research that can guide organizations in adopting modern software development practices, especially when it comes to Lean thinking and principles. To address this situation, the current paper identifies and structures the main wastes (or muda in Lean terms) in software development as described by Lean principles, in an attempt to bring into researchers’ and practitioners’ attention Lean Software Development, a modern development methodology based on well-established practices such as Lean Manufacturing or Toyota Production System.

  16. Software Comparison for Renewable Energy Deployment in a Distribution Network

    Energy Technology Data Exchange (ETDEWEB)

    Gao, David Wenzhong [Alternative Power Innovations, LLC, Sharonville, OH (United States); Muljadi, Eduard [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tian, Tian [National Renewable Energy Lab. (NREL), Golden, CO (United States); Miller, Mackay [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-02-22

    The main objective of this report is to evaluate different software options for performing robust distributed generation (DG) power system modeling. The features and capabilities of four simulation tools, OpenDSS, GridLAB-D, CYMDIST, and PowerWorld Simulator, are compared to analyze their effectiveness in analyzing distribution networks with DG. OpenDSS and GridLAB-D, two open-source packages, have the capability to simulate networks with fluctuating data values. These packages allow a simulation to be run at each time instant by iterating only the main script file. CYMDIST, a commercial package, allows for time-series simulation to study variations on network controls. PowerWorld Simulator, another commercial tool, has a batch-mode simulation function through the 'Time Step Simulation' tool, which obtains solutions for a list of specified time points. PowerWorld Simulator is intended for analysis of transmission-level systems, while the other three are designed for distribution systems. CYMDIST and PowerWorld Simulator feature easy-to-use graphical user interfaces (GUIs). OpenDSS and GridLAB-D, on the other hand, are based on command-line programs, which increase the time necessary to become familiar with the software packages.

  17. Software-as-a-Service Vendors: Are They Ready to Successfully Deliver?

    Science.gov (United States)

    Heart, Tsipi; Tsur, Noa Shamir; Pliskin, Nava

    Software as a service (SaaS) is a software sourcing option that allows organizations to remotely access enterprise applications, without having to install the application in-house. In this work we study vendors' readiness to deliver SaaS, a topic scarcely studied before. The innovation classification (evolutionary vs. revolutionary) and a new Seven Fundamental Organizational Capabilities (FOCs) Model are used as the theoretical frameworks. The Seven FOCs model suggests a generic yet comprehensive set of capabilities that are required for organizational success: 1) sensing the stakeholders, 2) sensing the business environment, 3) sensing the knowledge environment, 4) process control, 5) process improvement, 6) new process development, and 7) appropriate resolution.

  18. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Blackout PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.
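
    The Monte-Carlo capability mentioned above can be illustrated, very loosely, with a toy station-blackout sampler: uncertain inputs are sampled, a stand-in model decides whether each sample leads to core damage, and the failure fraction estimates the probability. This is not RAVEN's API or the RELAP-7 model; the distributions and the 2-hour grace period below are invented for illustration.

    ```python
    import random

    def simplified_sbo_model(dg_recovery_h, battery_life_h):
        """Stand-in for a thermal-hydraulic run: core damage is assumed if AC power
        is not restored within a (made-up) 2-hour grace period after battery depletion."""
        return dg_recovery_h > battery_life_h + 2.0

    def monte_carlo_sbo(n_samples=10000, seed=1):
        """Estimate the core-damage probability by sampling the uncertain inputs."""
        rng = random.Random(seed)
        failures = 0
        for _ in range(n_samples):
            dg_recovery = rng.expovariate(1.0 / 8.0)   # assumed mean of 8 h to restore AC power
            battery_life = rng.uniform(4.0, 8.0)       # assumed battery life between 4 and 8 h
            failures += simplified_sbo_model(dg_recovery, battery_life)
        return failures / n_samples

    print(f"Estimated core-damage probability: {monte_carlo_sbo():.3f}")
    ```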

  19. Offers

    CERN Multimedia

    Staff Association

    2012-01-01

    L'Occitane en Provence proposes the following offer: 10 % discount on all products in all L'Occitane shops in Metropolitan France upon presentation of your Staff Association membership card and a valid ID. This offer is valid only for one person, is non-transferable and cannot be combined with other promotions.

  20. CAX a software for automated spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as a decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
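
    The DAT2CSV step described above - writing results with semicolons as field separators and commas as decimal markers - can be sketched in a few lines of Python. This is only an illustration of that output convention, not the Genie-2000 scripting actually used by CAX; the nuclide names and values are hypothetical.

    ```python
    import csv

    def write_brazilian_csv(path, header, rows):
        """Write results using semicolon field separators and comma decimal markers,
        matching the Brazilian CSV convention described for the DAT2CSV step."""
        with open(path, "w", newline="", encoding="utf-8") as fh:
            writer = csv.writer(fh, delimiter=";")
            writer.writerow(header)
            for row in rows:
                writer.writerow([str(v).replace(".", ",") if isinstance(v, float) else v
                                 for v in row])

    # Hypothetical peak-analysis results
    peaks = [("Cs-137", 661.66, 152.3), ("Co-60", 1332.49, 87.1)]
    write_brazilian_csv("spectrum.csv", ["Nuclide", "Energy (keV)", "Activity (Bq)"], peaks)
    ```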

  1. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision....... These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development....

  2. A software methodology for compiling quantum programs

    Science.gov (United States)

    Häner, Thomas; Steiger, Damian S.; Svore, Krysta; Troyer, Matthias

    2018-04-01

    Quantum computers promise to transform our notions of computation by offering a completely new paradigm. To achieve scalable quantum computation, optimizing compilers and a corresponding software design flow will be essential. We present a software architecture for compiling quantum programs from a high-level language program to hardware-specific instructions. We describe the necessary layers of abstraction and their differences and similarities to classical layers of a computer-aided design flow. For each layer of the stack, we discuss the underlying methods for compilation and optimization. Our software methodology facilitates more rapid innovation among quantum algorithm designers, quantum hardware engineers, and experimentalists. It enables scalable compilation of complex quantum algorithms and can be targeted to any specific quantum hardware implementation.
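
    One layer of such a compilation stack - rewriting gates that the target hardware does not support into an assumed native set - can be sketched as a simple rule-driven pass. The gate names, the native set {H, CZ, RZ}, and the single CNOT rule below are hypothetical illustrations, not the paper's intermediate representation.

    ```python
    # Hypothetical native gate set and decomposition rules for one lowering pass.
    NATIVE_GATES = {"H", "CZ", "RZ"}
    DECOMPOSITION_RULES = {
        # CNOT(control, target) -> H(target), CZ(control, target), H(target)
        "CNOT": lambda q: [("H", (q[1],)), ("CZ", (q[0], q[1])), ("H", (q[1],))],
    }

    def lower(circuit):
        """Rewrite a list of (gate, qubits) pairs until only native gates remain."""
        lowered = []
        for gate, qubits in circuit:
            if gate in NATIVE_GATES:
                lowered.append((gate, qubits))
            elif gate in DECOMPOSITION_RULES:
                lowered.extend(lower(DECOMPOSITION_RULES[gate](qubits)))
            else:
                raise ValueError(f"no rule to lower gate {gate}")
        return lowered

    high_level = [("H", (0,)), ("CNOT", (0, 1))]
    print(lower(high_level))   # [('H', (0,)), ('H', (1,)), ('CZ', (0, 1)), ('H', (1,))]
    ```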

  3. An Ada Linear-Algebra Software Package Modeled After HAL/S

    Science.gov (United States)

    Klumpp, Allan R.; Lawson, Charles L.

    1990-01-01

    New avionics software written more easily. Software package extends Ada programming language to include linear-algebra capabilities similar to those of HAL/S programming language. Designed for such avionics applications as Space Station flight software. In addition to built-in functions of HAL/S, package incorporates quaternion functions used in Space Shuttle and Galileo projects and routines from LINPAK solving systems of equations involving general square matrices. Contains two generic programs: one for floating-point computations and one for integer computations. Written on IBM/AT personal computer running under PC DOS, v.3.1.
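
    The quaternion routines mentioned above are of the kind sketched below. This is a plain Python/NumPy analogue, not the Ada package itself; it only illustrates the Hamilton product and vector rotation that such a library typically provides.

    ```python
    import numpy as np

    def quat_mul(q1, q2):
        """Hamilton product of two quaternions given as (w, x, y, z)."""
        w1, x1, y1, z1 = q1
        w2, x2, y2, z2 = q2
        return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                         w1*x2 + x1*w2 + y1*z2 - z1*y2,
                         w1*y2 - x1*z2 + y1*w2 + z1*x2,
                         w1*z2 + x1*y2 - y1*x2 + z1*w2])

    def quat_rotate(q, v):
        """Rotate vector v by unit quaternion q: v' = q * (0, v) * conj(q)."""
        qv = np.concatenate(([0.0], v))
        q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
        return quat_mul(quat_mul(q, qv), q_conj)[1:]

    # 90-degree rotation about the z-axis applied to the x unit vector -> ~(0, 1, 0)
    half = np.pi / 4
    q = np.array([np.cos(half), 0.0, 0.0, np.sin(half)])
    print(quat_rotate(q, np.array([1.0, 0.0, 0.0])))
    ```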

  4. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done and are being done are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  5. Modular, Autonomous Command and Data Handling Software with Built-In Simulation and Test

    Science.gov (United States)

    Cuseo, John

    2012-01-01

    The spacecraft system that plays the greatest role throughout the program lifecycle is the Command and Data Handling System (C&DH), along with the associated algorithms and software. The C&DH takes on this role as cost driver because it is the brains of the spacecraft and is the element of the system that is primarily responsible for the integration and interoperability of all spacecraft subsystems. During design and development, many activities associated with mission design, system engineering, and subsystem development result in products that are directly supported by the C&DH, such as interfaces, algorithms, flight software (FSW), and parameter sets. A modular system architecture has been developed that provides a means for rapid spacecraft assembly, test, and integration. This modular C&DH software architecture, which can be targeted and adapted to a wide variety of spacecraft architectures, payloads, and mission requirements, eliminates the current practice of rewriting the spacecraft software and test environment for every mission. This software allows mission-specific software and algorithms to be rapidly integrated and tested, significantly decreasing time involved in the software development cycle. Additionally, the FSW includes an Onboard Dynamic Simulation System (ODySSy) that allows the C&DH software to support rapid integration and test. With this solution, the C&DH software capabilities will encompass all phases of the spacecraft lifecycle. ODySSy is an on-board simulation capability built directly into the FSW that provides dynamic built-in test capabilities as soon as the FSW image is loaded onto the processor. It includes a six-degrees-of-freedom, high-fidelity simulation that allows complete closed-loop and hardware-in-the-loop testing of a spacecraft in a ground processing environment without any additional external stimuli. ODySSy can intercept and modify sensor inputs using mathematical sensor models, and can intercept and respond to actuator

  6. An Assessment of the Library Application Software Packages in ...

    African Journals Online (AJOL)

    ... the study examined the adopted softwares' security, compatibility/capabilities, ... The study found that most application packages available in the Nigerian automation market place are effective since they ...

  7. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  8. Current trends in hardware and software for brain-computer interfaces (BCIs)

    Science.gov (United States)

    Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.
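
    The translation step described above - turning recorded brain signals into a device output command - can be caricatured with a single-channel band-power feature and a threshold. This NumPy sketch is only a conceptual illustration; the sampling rate, the 8-12 Hz band and the threshold are placeholder values, not part of any particular BCI product.

    ```python
    import numpy as np

    def band_power(signal, fs, band):
        """Signal power inside a frequency band, from a simple FFT periodogram."""
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
        return psd[(freqs >= band[0]) & (freqs <= band[1])].sum()

    def translate_to_command(epoch, fs=256.0, threshold=50.0):
        """Map mu-band (8-12 Hz) power to a binary output command; a real system
        would calibrate the threshold per subject and provide feedback."""
        return "SELECT" if band_power(epoch, fs, (8.0, 12.0)) > threshold else "IDLE"

    rng = np.random.default_rng(7)
    one_second_epoch = rng.normal(size=256)     # hypothetical single-channel EEG epoch
    print(translate_to_command(one_second_epoch))
    ```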

  9. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    Science.gov (United States)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  10. Linear mixed models a practical guide using statistical software

    CERN Document Server

    West, Brady T; Galecki, Andrzej T

    2014-01-01

    Highly recommended by JASA, Technometrics, and other journals, the first edition of this bestseller showed how to easily perform complex linear mixed model (LMM) analyses via a variety of software programs. Linear Mixed Models: A Practical Guide Using Statistical Software, Second Edition continues to lead readers step by step through the process of fitting LMMs. This second edition covers additional topics on the application of LMMs that are valuable for data analysts in all fields. It also updates the case studies using the latest versions of the software procedures and provides up-to-date information on the options and features of the software procedures available for fitting LMMs in SAS, SPSS, Stata, R/S-plus, and HLM. New to the Second Edition: a new chapter on models with crossed random effects that uses a case study to illustrate software procedures capable of fitting these models; power analysis methods for longitudinal and clustered study designs, including software options for power analyses and suggest...
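
    The book works through LMMs in SAS, SPSS, Stata, R/S-plus, and HLM; for readers who want a quick, free way to try a comparable random-intercept model, the sketch below uses Python's statsmodels instead (not one of the packages covered by the book). The simulated data and column names are hypothetical.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated clustered data: 30 subjects, 10 observations each,
    # one fixed slope on x plus a random intercept per subject.
    rng = np.random.default_rng(42)
    n_subjects, n_obs = 30, 10
    subject = np.repeat(np.arange(n_subjects), n_obs)
    x = rng.normal(size=n_subjects * n_obs)
    u = rng.normal(scale=2.0, size=n_subjects)              # true random intercepts
    y = 1.0 + 0.5 * x + u[subject] + rng.normal(size=n_subjects * n_obs)
    data = pd.DataFrame({"y": y, "x": x, "subject": subject})

    # Random-intercept linear mixed model: y ~ x, grouped by subject
    result = smf.mixedlm("y ~ x", data, groups=data["subject"]).fit()
    print(result.summary())
    ```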

  11. Subsystem software for TSTA [Tritium Systems Test Assembly

    International Nuclear Information System (INIS)

    Mann, L.W.; Claborn, G.W.; Nielson, C.W.

    1987-01-01

    The Subsystem Control Software at the Tritium System Test Assembly (TSTA) must control sophisticated chemical processes through the physical operation of valves, motor controllers, gas sampling devices, thermocouples, pressure transducers, and similar devices. Such control software has to be capable of passing stringent quality assurance (QA) criteria to provide for the safe handling of significant amounts of tritium on a routine basis. Since many of the chemical processes and physical components are experimental, the control software has to be flexible enough to allow for a trial-and-error learning curve, but still protect the environment and personnel from exposure to unsafe levels of radiation. The software at TSTA is implemented in several levels as described in a preceding paper in these proceedings. This paper depends on information given in the preceding paper for understanding. The top level is the Subsystem Control level

  12. The Software Therapist: Usability Problem Diagnosis Through Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Sparks, Randall; Hartson, Rex

    2006-01-01

    The work we report on here addresses the problem of low return on investment in software usability engineering and offers support for usability practitioners in identifying, understanding, documenting...
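
    Latent semantic analysis, the technique named in the title, can be sketched in a few lines with scikit-learn: a TF-IDF term-document matrix is reduced by truncated SVD so that related problem reports land close together. This is a generic illustration with invented usability reports, not the Software Therapist's actual pipeline.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical usability problem descriptions
    reports = [
        "user cannot find the save button in the toolbar",
        "the save icon is hidden and hard to locate",
        "error message text is too small to read",
        "font size of warning dialogs is illegible",
    ]

    # LSA: TF-IDF term-document matrix reduced by truncated SVD
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(reports)
    lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

    # Semantically similar reports end up close together in the reduced space
    print(cosine_similarity(lsa).round(2))
    ```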

  13. Service software engineering for innovative infrastructure for global financial services

    OpenAIRE

    MAAD, Soha; MCCARTHY, James B.; GARBAYA, Samir; Beynon, Meurig; Nagarajan, Rajagopal

    2010-01-01

    The recent financial crisis motivates our re-thinking of the engineering principles for service software and infrastructures intended to create business value in vital sectors. Existing monolithic, inward-directed, cost-insensitive and highly regulated technical and organizational infrastructures for financial services make it difficult for the domain to benefit from opportunities offered by new computing models such as cloud computing, software as a service, hardware a...

  14. Linac-augmented light sources : an incremental concept for enhancing the capabilities of existing 3rd-generation storage rings

    International Nuclear Information System (INIS)

    Lewellen, J. W.

    2003-01-01

    Planned and proposed 4th-generation x-ray sources, such as energy-recovery linacs (ERLs) and single-pass x-ray free-electron lasers (X-FELs) offer a number of potential advantages, including small source size, higher peak brightness, ultrashort pulses, and potentially temporally and transversely coherent pulses. While offering unique capabilities, such facilities will also offer several important limitations, including limited numbers of user beamlines (for FELs) and a pulse-repetition rate that may be too high for many dynamics experiments (ERLs). In addition, there are many technical challenges associated with both types of facilities. A third type of facility, exemplified by the Short Pulse Photon Source (SPPS) at SLAC [1], would support neither a large number of users simultaneously nor generate coherent pulses, but would generate very intense, short x-ray pulses. Such a facility could serve as the starting point for either an ERL or an X-FEL, or a combined, hybrid machine. For the foreseeable future, however, existing 3rd-generation light source storage rings, such as the Advanced Photon Source, will continue to play important roles in supporting scientific research utilizing high-brightness x-rays. Existing facilities offer the powerful combination of a large number of user beamlines, efficient use of electron beam energy, and established user communities, and a program of incremental investment in, and improvements to, these facilities should continue to pay dividends into the future. This document discusses potential upgrade paths based on the Advanced Photon Source (APS) as a model 3rd-generation facility. If existing 3rd-generation facilities are to remain centers of excellence for light source-based research into the future, they must not only maintain and enhance their support of their existing user base, but also seek to expand their capabilities to support additional classes of users. There are several paths available toward this goal. The APS is

  15. Offers

    CERN Multimedia

    Staff Association

    2011-01-01

    Special offers for our members       Go Sport in Val Thoiry is offering 15% discount on all purchases made in the shop upon presentation of the Staff Association membership card (excluding promotions, sale items and bargain corner, and excluding purchases using Go Sport  and Kadéos gift cards. Only one discount can be applied to each purchase).  

  16. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
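
    COSTMODL itself is distributed as a Turbo Pascal tool, but one of the five algorithms it incorporates, Basic COCOMO, is simple enough to sketch here. The coefficients below are the standard published Basic COCOMO values (Boehm, 1981); the 32 KLOC example project is hypothetical.

    ```python
    # Basic COCOMO: effort = a * KLOC**b (person-months), schedule = c * effort**d (months)
    COCOMO_MODES = {
        "organic":      (2.4, 1.05, 2.5, 0.38),
        "semidetached": (3.0, 1.12, 2.5, 0.35),
        "embedded":     (3.6, 1.20, 2.5, 0.32),
    }

    def basic_cocomo(kloc, mode="organic"):
        a, b, c, d = COCOMO_MODES[mode]
        effort = a * kloc ** b              # person-months
        schedule = c * effort ** d          # calendar months
        return effort, schedule, effort / schedule   # average full-time staffing

    effort, months, staff = basic_cocomo(32.0, "semidetached")
    print(f"{effort:.1f} person-months over {months:.1f} months (~{staff:.1f} staff)")
    ```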

  17. Cosmopolitan capabilities in the HE classroom

    Directory of Open Access Journals (Sweden)

    Veronica Crosbie

    2014-04-01

    This study, concerning the development of cosmopolitan citizenship, draws on theories of human development and capabilities (Sen 1999; Nussbaum 2000) from a social justice perspective, where individual wellbeing is articulated as having the freedom to live a life of one’s choosing. In the context of an English to Speakers of Other Languages (ESOL) classroom this involves paying attention to pedagogical strategies, power dynamics and curriculum content as a means of developing valued beings and doings (or capabilities and functionings, as they are described in the literature). Sample activities are presented and evaluated to see to what extent they achieve the desired end. These include critical pedagogical interventions, students’ artefacts and extracts from focus group interviews, class reports and reflective journals. Results from the textual data offer research evidence of successful curriculum change, demonstrating that the learning that takes place there can make a difference: in terms of the learners’ identity development, capability enhancement and cosmopolitan citizenship.

  18. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    Studies show that the use of games and interactive materials at schools is a good educational strategy, motivating students to create mental outlines, developing reasoning and facilitating learning. In this context, the Scientific Dissemination Coordination of the Center for Structural Molecular Biotechnology (CBME) developed a series of educational materials intended for elementary and high schools, universities and the general public. Among these, we highlight the Virtual Cells software, which was developed with the aim of helping in the understanding of the basic concepts of cell types, their structures, organelles and specific functions. Characterized by its interactive interface, this software shows images of eukaryotic and prokaryotic cells, where organelles are shown as dynamic structures. In addition, it presents exercises in another step that reinforce the comprehension of Cytology. A speaker narrates the resources offered by the program and the necessary steps for its use. During the development of the software, students and teachers of public and private high schools from Sao Carlos city, Sao Paulo State, were invited to register their opinions regarding the language and content of the software in order to help us improve it. After this stage, the Scientific Dissemination Coordination of CBME organized a series of workshops in which 120 individuals evaluated the software (high school students and teachers, and undergraduate students). For this evaluation, a questionnaire was elaborated based on the current international literature in the area of science teaching, and it was applied after the interactive session with the software. The analysis of the results demonstrated that most of the individuals considered the software of easy

  19. Western aeronautical test range real-time graphics software package MAGIC

    Science.gov (United States)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers-scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  20. China's experimental pragmatics of "Scientific development" in wind power: Algorithmic struggles over software in wind turbines

    DEFF Research Database (Denmark)

    Kirkegaard, Julia

    2016-01-01

    This article presents a case study on the development of China's wind power market. As China's wind industry has experienced a quality crisis, the Chinese government has intervened to steer the industry towards a turn to quality, indicating a pragmatist and experimental mode of market development. This increased focus on quality, to ensure the sustainable and scientific development of China's wind energy market, requires improved indigenous Chinese innovation capabilities in wind turbine technology. To shed light on how the turn to quality impacts upon the industry and global competition, this study ... unfold over issues associated with intellectual property rights (IPRs), certification and standardisation of software algorithms. The article concludes that the use of this STS lens makes a fresh contribution to the often path-dependent, structuralist and hierarchical China literature, offering instead...

  1. An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency

    Science.gov (United States)

    Phillips, Dewanne Marie

    Software-intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software-intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, system engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation will identify knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software

  2. Group Capability Model

    Science.gov (United States)

    Olejarski, Michael; Appleton, Amy; Deltorchio, Stephen

    2009-01-01

    The Group Capability Model (GCM) is a software tool that allows an organization, from first-line management to senior executive, to monitor and track the health (capability) of various groups in performing their contractual obligations. GCM calculates a Group Capability Index (GCI) by comparing actual head counts, certifications, and/or skills within a group. The model can also be used to simulate the effects of employee usage, training, and attrition on the GCI. A universal tool and common method was required due to the high risk of losing skills necessary to complete the Space Shuttle Program and meet the needs of the Constellation Program. During this transition from one space vehicle to another, the uncertainty among the critical skilled workforce is high and attrition has the potential to be unmanageable. GCM allows managers to establish requirements for their group in the form of head counts, certification requirements, or skills requirements. GCM then calculates a Group Capability Index (GCI), where a score of 1 indicates that the group is at the appropriate level; anything less than 1 indicates a potential for improvement. This shows the health of a group, both currently and over time. GCM accepts as input head count, certification needs, critical needs, competency needs, and competency critical needs. In addition, team members are categorized by years of experience, percentage of contribution, ex-members and their skills, availability, function, and in-work requirements. Outputs are several reports, including actual vs. required head count, actual vs. required certificates, GCI change over time (by month), and more. The program stores historical data for summary and historical reporting, which is done via an Excel spreadsheet that is color-coded to show health statistics at a glance. GCM has provided the Shuttle Ground Processing team with a quantifiable, repeatable approach to assessing and managing the skills in their organization. They now have a common
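
    The record does not give GCM's exact formula, but the behaviour it describes - a score of 1 when actual head count, certifications and skills meet requirements, and less than 1 otherwise - can be imitated with a simple capped-ratio index. The function and category names below are hypothetical, not the tool's real calculation.

    ```python
    def group_capability_index(actual, required):
        """Hypothetical GCI: mean of actual-to-required ratios across the tracked
        categories, with each ratio capped at 1 so that 1.0 means fully staffed."""
        ratios = [min(actual.get(k, 0) / req, 1.0) for k, req in required.items() if req > 0]
        return sum(ratios) / len(ratios)

    required = {"head_count": 12, "certifications": 8, "critical_skills": 5}
    actual   = {"head_count": 10, "certifications": 8, "critical_skills": 4}
    print(f"GCI = {group_capability_index(actual, required):.2f}")   # below 1 flags a shortfall
    ```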

  3. Towards the identification of the influence of SPI on the successful evolution of software SMEs

    OpenAIRE

    Clarke, Paul; O'Connor, Rory

    2010-01-01

    Software development requires multi-stage processes in order to organise the software development effort. Each software development project should implement a development process that is appropriate to the project setting. Since business needs and technologies are subject to change, software process improvement (SPI) actions are required so as to harmonise the process with the emerging business and technology needs. SPI frameworks such as the Capability Maturity Model Integra...

  4. Advanced multilateration theory, software development, and data processing: The MICRODOT system

    Science.gov (United States)

    Escobal, P. R.; Gallagher, J. F.; Vonroos, O. H.

    1976-01-01

    The process of geometric parameter estimation to accuracies of one centimeter, i.e., multilateration, is defined and applications are listed. A brief functional explanation of the theory is presented. Next, various multilateration systems are described in order of increasing system complexity. Expected systems accuracy is discussed from a general point of view and a summary of the errors is listed. An outline of the design of a software processing system for multilateration, called MICRODOT, is presented next. The links of this software, which can be used for multilateration data simulations or operational data reduction, are examined on an individual basis. Functional flow diagrams are presented to aid in understanding the software capability. MICRODOT capability is described with respect to vehicle configurations, interstation coordinate reduction, geophysical parameter estimation, and orbit determination. Numerical results obtained from MICRODOT via data simulations are displayed both for hypothetical and real world vehicle/station configurations such as used in the GEOS-3 Project. These simulations show the inherent power of the multilateration procedure.
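
    The geometric core of multilateration - recovering a position from range measurements to stations at known coordinates - can be sketched as a nonlinear least-squares problem. This is a bare-bones SciPy illustration with invented station geometry and noiseless ranges, not the MICRODOT formulation, which also estimates geophysical and orbital parameters.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Hypothetical station coordinates (metres) and ranges measured to the vehicle
    stations = np.array([[0.0, 0.0, 0.0],
                         [5000.0, 0.0, 0.0],
                         [0.0, 5000.0, 0.0],
                         [0.0, 0.0, 5000.0]])
    true_position = np.array([1200.0, 2300.0, 900.0])
    ranges = np.linalg.norm(stations - true_position, axis=1)   # noiseless for clarity

    def residuals(p):
        """Predicted minus measured station-to-vehicle ranges."""
        return np.linalg.norm(stations - p, axis=1) - ranges

    solution = least_squares(residuals, x0=np.zeros(3))
    print(solution.x)   # recovers the true position to numerical precision
    ```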

  5. A Framework for Teaching Software Development Methods

    Science.gov (United States)

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  6. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
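
    The Orion data schema is not public, but the data-driven idea described above, mission events and GN&C mode transitions read from a table rather than hard-coded, can be illustrated with a small, entirely hypothetical event table:

        # Hypothetical data-driven sequencer: GN&C mode transitions are loaded from
        # a table (here a list of dicts standing in for a database) instead of code.
        sequence_table = [
            {"segment": "ascent", "event": "MECO",          "next_mode": "COAST"},
            {"segment": "coast",  "event": "SEP_CONFIRMED", "next_mode": "ORBIT_INSERTION"},
            {"segment": "orbit",  "event": "BURN_COMPLETE", "next_mode": "ATTITUDE_HOLD"},
        ]

        def next_mode(segment, event, table):
            """Look up the commanded GN&C mode for an event; None keeps the current mode."""
            for row in table:
                if row["segment"] == segment and row["event"] == event:
                    return row["next_mode"]
            return None

        print(next_mode("coast", "SEP_CONFIRMED", sequence_table))  # ORBIT_INSERTION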

  7. SIGMA Release v1.2 - Capabilities, Enhancements and Fixes

    Energy Technology Data Exchange (ETDEWEB)

    Mahadevan, Vijay [Argonne National Lab. (ANL), Argonne, IL (United States); Grindeanu, Iulian R. [Argonne National Lab. (ANL), Argonne, IL (United States); Ray, Navamita [Argonne National Lab. (ANL), Argonne, IL (United States); Jain, Rajeev [Argonne National Lab. (ANL), Argonne, IL (United States); Wu, Danqing [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-09-30

    In this report, we present details on the SIGMA toolkit along with its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative for successfully integrating and utilizing the toolkit in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the NEAMS DOE-NE program.

  8. SIGMA Release v1.2 - Capabilities, Enhancements and Fixes

    International Nuclear Information System (INIS)

    Mahadevan, Vijay; Grindeanu, Iulian R.; Ray, Navamita; Jain, Rajeev; Wu, Danqing

    2015-01-01

    In this report, we present details on the SIGMA toolkit along with its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative for successfully integrating and utilizing the toolkit in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the NEAMS DOE-NE program.

  9. Intelligent Hardware-Enabled Sensor and Software Safety and Health Management for Autonomous UAS

    Science.gov (United States)

    Rozier, Kristin Y.; Schumann, Johann; Ippolito, Corey

    2015-01-01

    Unmanned Aerial Systems (UAS) can only be deployed if they can effectively complete their mission and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. We propose to design a real-time, onboard system health management (SHM) capability to continuously monitor essential system components such as sensors, software, and hardware systems for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power hardware realization using Field Programmable Gate Arrays (FPGAs) in order to avoid overburdening limited computing resources or costly re-certification of flight software due to instrumentation. No currently available SHM capabilities (or combinations of currently existing SHM capabilities) come anywhere close to satisfying these three criteria, yet NASA will require such intelligent, hardware-enabled sensor and software safety and health management for introducing autonomous UAS into the National Airspace System (NAS). We propose a novel approach of creating modular building blocks for combining responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. Our proposed research program includes both developing this novel approach and demonstrating its capabilities using the NASA Swift UAS as a demonstration platform.
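
    As a toy illustration of the runtime-monitoring idea (not of the FPGA realization described above), a temporal safety rule such as "the altitude must never drop below a floor while airborne" can be checked stream-wise over telemetry samples. The signal names and thresholds below are invented:

        # Toy runtime monitor for a safety rule of the form "always(phi)":
        # it consumes telemetry samples one at a time and latches a violation.
        class AlwaysMonitor:
            def __init__(self, predicate):
                self.predicate = predicate
                self.violated = False

            def step(self, sample):
                if not self.predicate(sample):
                    self.violated = True
                return not self.violated

        # Hypothetical rule: while airborne, altitude stays above 50 m.
        rule = AlwaysMonitor(lambda s: (not s["airborne"]) or s["altitude_m"] > 50.0)
        telemetry = [{"airborne": True, "altitude_m": 120.0},
                     {"airborne": True, "altitude_m": 48.0}]   # violation here
        for sample in telemetry:
            print(rule.step(sample))   # True, then False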

  10. Integrating open-source software applications to build molecular dynamics systems.

    Science.gov (United States)

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and mis2lmp are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-a and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  11. Software development tools using GPGPU potentialities

    International Nuclear Information System (INIS)

    Dudnik, V.A.; Kudryavtsev, V.I.; Sereda, T.M.; Us, S.A.; Shestakov, M.V.

    2011-01-01

    The paper deals with potentialities of various up-to-date software development tools for making use of graphic processor (GPU) parallel computing resources. Examples are given to illustrate the use of present-day software tools for the development of applications and realization of algorithms for scientific-technical calculations performed by GPGPU. The paper presents some classes of hard mathematical problems of scientific-technical calculations for which the GPGPU can be efficiently used. To reduce the time of calculation program development with the use of GPGPU capabilities, various dedicated programming systems and problem-oriented subroutine libraries are recommended. Performance parameters when solving the problems with and without the use of GPGPU potentialities are compared.

  12. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, including a resource layer, a service abstract layer, a control layer and an application layer. We then dwell on the corresponding service providing method. Different service IDs are used to identify the services a device can offer. Finally, we experimentally evaluate that the proposed service providing method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.

  13. Software Architecture Coupling Metric for Assessing Operational Responsiveness of Trading Systems

    Directory of Open Access Journals (Sweden)

    Claudiu VINTE

    2012-01-01

    Full Text Available The empirical observation that motivates our research relies on the difficulty to assess the performance of a trading architecture beyond a few synthetic indicators like response time, system latency, availability or volume capacity. Trading systems involve complex software architectures of distributed resources. However, in the context of a large brokerage firm, which offers global coverage from both market and client perspectives, the term distributed gains a critical significance indeed. Offering a low-latency ordering system by today's standards is relatively easy to achieve, but integrating it in a flexible manner within the broader information system architecture of a broker/dealer requires operational aspects to be factored in. We propose a metric for measuring the coupling level within software architecture, and employ it to identify architectural designs that can offer a higher level of operational responsiveness, which ultimately would raise the overall real-world performance of a trading system.
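
    The paper's own coupling metric is defined in the full text; as a rough illustration of the general idea of measuring coupling over an architecture graph, one can count inter-component dependencies relative to the number of components. All component names below are invented:

        # Illustrative coupling measure: average number of outgoing dependencies
        # per component in an architecture graph (names are hypothetical).
        architecture = {
            "order_entry":    {"risk_engine", "market_gateway", "client_db"},
            "risk_engine":    {"client_db"},
            "market_gateway": set(),
            "client_db":      set(),
        }

        def average_coupling(graph):
            return sum(len(deps) for deps in graph.values()) / len(graph)

        print(average_coupling(architecture))  # 1.0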

  14. CAX a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop a software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported in any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which generates a CSV file that complies with the Brazilian standards, with commas as a decimal indicator and semicolons as field separators. This software is already used in the daily routines in IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as reducing the possibility of transcription errors. (author)
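
    The DAT2CSV convention described above, commas as the decimal mark and semicolons as field separators, can be reproduced in a few lines; the nuclides and peak values below are placeholders, not data from the paper:

        # Writing a CSV file with ';' as field separator and ',' as decimal mark,
        # in the spirit of the DAT2CSV output (the peak values are placeholders).
        import csv

        peaks = [("Cs-137", 661.66, 1.234e4), ("Co-60", 1332.49, 5.678e3)]
        with open("peaks.csv", "w", newline="") as f:
            writer = csv.writer(f, delimiter=";")
            writer.writerow(["nuclide", "energy_keV", "net_area"])
            for nuclide, energy, area in peaks:
                writer.writerow([nuclide,
                                 f"{energy:.2f}".replace(".", ","),
                                 f"{area:.1f}".replace(".", ",")])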

  15. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

    Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g. alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP) [de

  16. Early experiences building a software quality prediction model

    Science.gov (United States)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
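
    The models described above are multivariate regressions of error density on design measures. A minimal sketch of fitting such a model, with invented predictor names and data, could look like this:

        # Minimal multivariate regression sketch: error density explained by two
        # design measures (fan-out and visibility); all numbers are invented.
        import numpy as np

        fan_out     = np.array([ 3.,  8.,  5., 12.,  7.,  2.])
        visibility  = np.array([10., 25., 15., 40., 30.,  8.])
        errors_kloc = np.array([1.1, 3.0, 1.8, 4.9, 3.4, 0.9])

        X = np.column_stack([np.ones_like(fan_out), fan_out, visibility])
        coef, *_ = np.linalg.lstsq(X, errors_kloc, rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((errors_kloc - pred) ** 2) / np.sum((errors_kloc - errors_kloc.mean()) ** 2)
        print(np.round(coef, 3), round(r2, 2))   # coefficients and variance explained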

  17. Software-engineering-based model for mitigating Repetitive Strain ...

    African Journals Online (AJOL)

    The incorporation of Information and Communication Technology (ICT) in virtually all facets of human endeavours has fostered the use of computers. This has induced Repetitive Stress Injury (RSI) for continuous and persistent computer users. Proposing a software engineering model capable of enacted RSI force break ...

  18. Factors to Consider When Implementing Automated Software Testing

    Science.gov (United States)

    2016-11-10

    development and integration is a continuous process throughout the acquisition life cycle. Automated Software Testing can improve testing capabilities...requires a lab, conference room, or both, and whether it should be located in-house or at an external facility. 2. Ensure space is adequate to support team

  19. Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software

    Science.gov (United States)

    Hunter, George; Boisvert, Benjamin

    2013-01-01

    This document is the final report for the project entitled "Upgrades to the Probabilistic NAS Platform Air Traffic Simulation Software." This report consists of 17 sections which document the results of the several subtasks of this effort. The Probabilistic NAS Platform (PNP) is an air operations simulation platform developed and maintained by the Saab Sensis Corporation. The improvements made to the PNP simulation include the following: an airborne distributed separation assurance capability, a required time of arrival assignment and conformance capability, and a tactical and strategic weather avoidance capability.

  20. Development of Data Acquisition and Control Software for Neutron Radiography Facility at Serpong, Indonesia

    International Nuclear Information System (INIS)

    Bharoto

    2013-01-01

    A system for data acquisition and control software for the neutron radiography facility at Serpong has been developed. The software was developed to replace the previously existing control software, which was no longer used due to problems with its computer hardware. Visual Basic running under the Microsoft Windows operating system was used in developing the new software. On the hardware side, the film grabber and the motor driver were replaced. In the new system, the film grabber which was used to capture the image in the old system is replaced with a programmable CCD camera. The motor driver which was used to control the camera in two directions has been replaced with a four-direction motor driver. The software is capable of displaying the images in a real-time mode and recording the images on the hard disk of a personal computer. To obtain optimal image quality, the software processes the captured images by performing temperature adjustment, camera exposure time adjustment, and integration of the captured images over a certain number of frames. The software is capable of taking a number of snapshots at a certain time interval. For neutron tomography purposes, the software takes the snapshots automatically at each sample position in line with the stepping movement of the rotating sample table. The snapshots are saved in a picture format and a numeric format for further processing. The software has been successfully tested for the real-time method and tomography reconstruction. The data captured by using this software have been verified using both commercial and in-house computed tomography software. (author)
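
    Although the original software is written in Visual Basic, the frame-integration step it describes, averaging several captured frames to reduce noise before saving, is easy to sketch; the array shapes and frame count below are arbitrary stand-ins for camera data:

        # Sketch of integrating N captured frames into one lower-noise image,
        # as described above (frames here are random arrays standing in for camera data).
        import numpy as np

        def integrate_frames(frames):
            """Average a stack of frames; returns a float image."""
            return np.mean(np.asarray(frames, dtype=np.float64), axis=0)

        frames = [np.random.poisson(50, size=(480, 640)) for _ in range(16)]
        image = integrate_frames(frames)
        print(image.shape, round(float(image.std()), 2))  # noise is lower than in a single frame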

  1. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design and our first implementation results for three cloud forms - a compute cloud, a storage cloud and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD, the lessons learned with our prototypes and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to supporting an ecosystem of clients, developers and other key stakeholders.

  2. Computer organization and design the hardware/software interface

    CERN Document Server

    Hennessy, John L

    1994-01-01

    Computer Organization and Design: The Hardware/Software Interface presents the interaction between hardware and software at a variety of levels, which offers a framework for understanding the fundamentals of computing. This book focuses on the concepts that are the basis for computers.Organized into nine chapters, this book begins with an overview of the computer revolution. This text then explains the concepts and algorithms used in modern computer arithmetic. Other chapters consider the abstractions and concepts in memory hierarchies by starting with the simplest possible cache. This book di

  3. Ignoring 'Best Practice': why Irish software SMEs are rejecting CMMI and ISO 9000

    OpenAIRE

    O'Connor, Rory V.; Coleman, Gerry

    2009-01-01

    peer-reviewed Software Process Improvement (SPI) "best practice" models such as ISO 9000 and the Capability Maturity Model Integrated (CMMI) have been developed to assist software development organisations by harnessing their experience and providing them with support so that they can produce software products on time, within budget and to a high level of quality. However there is increasing evidence that these models are not being adopted by Small and Medium sized Enterprises ...

  4. A Brief Study of Software Engineering Professional Continuing Education in DoD Acquisition

    Science.gov (United States)

    2010-04-01

    Lifecycle Processes (IEEE 12207) (810): 37% / 61% / 2%. Guide to the Software Engineering Body of Knowledge (SWEBOK) (804): 67% / 31% / 2%. Software Engineering - Software Measurement Process (ISO/IEC 15939) (797): 55% / 44% / 2%. Capability Maturity Model Integration (806): 17% / 81% / 2%. Six Sigma Process Improvement (804): 7% / 91% / 1%. ISO 9000 Quality Management Systems (803): 10% / 89% / 1%. Conclusions: significant problem areas include Requirements Management ...

  5. SLIMarray: Lightweight software for microarray facility management

    Directory of Open Access Journals (Sweden)

    Marzolf Bruz

    2006-10-01

    Full Text Available Background: Microarray core facilities are commonplace in biological research organizations, and need systems for accurately tracking various logistical aspects of their operation. Although these different needs could be handled separately, an integrated management system provides benefits in organization, automation and reduction in errors. Results: We present SLIMarray (System for Lab Information Management of Microarrays), an open source, modular database web application capable of managing microarray inventories, sample processing and usage charges. The software allows modular configuration and is well suited for further development, providing users the flexibility to adapt it to their needs. SLIMarray Lite, a version of the software that is especially easy to install and run, is also available. Conclusion: SLIMarray addresses the previously unmet need for free and open source software for managing the logistics of a microarray core facility.

  6. Security Process Capability Model Based on ISO/IEC 15504 Conformant Enterprise SPICE

    Directory of Open Access Journals (Sweden)

    Mitasiunas Antanas

    2014-07-01

    Full Text Available In the context of modern information systems, security has become one of the most critical quality attributes. The purpose of this paper is to address the problem of quality of information security. An approach to solve this problem is based on the main assumption that security is a process oriented activity. According to this approach, product quality can be achieved by means of process quality - process capability. The SPICE-conformant information security process capability model introduced in the paper is based on the process capability modeling elaborated by the worldwide software engineering community over the last 25 years, namely ISO/IEC 15504, which defines the capability dimension and the requirements for process definition, and Enterprise SPICE, a domain-independent integrated model for enterprise-wide assessment and improvement.

  7. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
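
    For readers unfamiliar with the framework, a "first program" in the style of the ProjectQ documentation looks roughly like the following; the example is reproduced from memory, so treat the details as approximate: allocate a qubit on the default simulator backend, apply a Hadamard gate, and measure.

        # Approximate ProjectQ usage example (based on the project's documented style):
        # prepare a qubit in superposition and measure it on the built-in simulator.
        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()              # default backend is the high-performance simulator
        qubit = eng.allocate_qubit()
        H | qubit                       # put the qubit into an equal superposition
        Measure | qubit
        eng.flush()                     # send the circuit to the backend
        print("Measured:", int(qubit))  # 0 or 1 with probability 1/2 each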

  8. Software for computers in safety systems of nuclear power plants

    International Nuclear Information System (INIS)

    Gallagher, J.M.

    1983-01-01

    The application of distributed digital processing techniques to the protection systems of nuclear power plants provides a means to significantly improve the functional capability of the protection system with respect to the operability and availability of the power plant. A major factor in the realization of this improvement is the development and maintenance of essentially error-free software. A joint program for the development of principles for the design, testing and documentation of software to achieve this goal is presented. Results from two separate experiences in the application of these principles in terms of detected software errors are summarized. The low number of errors detected during the verification testing phase demonstrates the effectiveness of the design and documentation principles in the realization of highly reliable software. (author)

  9. A Half-Day Workshop on ``Smarter Investment by Aligning SPI Initiatives, Capabilities and Stakeholder Values''

    Science.gov (United States)

    Selioukova, Yana; Frühwirth, Christian

    Software companies who want to improve their software process capabilities (SPCs) need a systematic method to make informed investment decisions on software process improvement (SPI) initiatives. Such decisions should aim at creating maximum stakeholder value. To address this problem, we present a method with tool support that may help companies align stakeholder values with SPCs and SPI initiatives. The proposed method has been developed based on the well-established “Quality Function Deployment” (QFD) approach. The experience with the proposed method suggests that it particularly helps to reduce the risk of misalignment by identifying those SPI initiatives that are most beneficial to stakeholders. The tool support provided with the proposed method also generated positive experiences in increasing the usability of the method and helped companies in the elicitation and prioritization of stakeholder values. Therefore, we propose a workshop for working through the method, named “Smarter Investment by Aligning SPI Initiatives, Capabilities and Stakeholder Values”, in a hypothetical case company.

  10. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software tool, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  11. Análisis de experiencias de mejora de procesos de desarrollo de software en PYMEs

    Directory of Open Access Journals (Sweden)

    Coque-Villegas, Shirley

    2017-12-01

    Full Text Available The services of software development companies are based on producing high-quality software products. Software product quality is ensured by applying software engineering practices throughout the development process. In order to improve these processes, it is necessary to adapt software process improvement models to companies according to their own characteristics. This paper offers an analysis of the application of various software process improvement models in small and medium-sized enterprises. Finally, the results presented here show the influence of the inherent factors of companies and their work teams on the choice of a specific software process improvement model.

  12. Optimal Offering Strategies for Wind Power in Energy and Primary Reserve Markets

    DEFF Research Database (Denmark)

    Soares, Tiago; Pinson, Pierre; Jensen, Tue Vissing

    2016-01-01

    Wind power generation is to play an important role in supplying electric power demand, and will certainly impact the design of future energy and reserve markets. Operators of wind power plants will consequently develop adequate offering strategies, accounting for the market rules and the operational capabilities of the turbines, e.g., to participate in primary reserve markets. We consider two different offering strategies for joint participation of wind power in energy and primary reserve markets, based on the idea of proportional and constant splitting of potentially available power generation from the turbines. These offering strategies aim at maximizing expected revenues from both market floors using probabilistic forecasts for wind power generation, complemented with estimated regulation costs and penalties for failing to provide primary reserve. A set of numerical examples, as well...
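
    The two splitting rules named above can be illustrated with a small numeric example: under proportional splitting a fixed fraction of the available power is offered as reserve in every scenario, whereas under constant splitting a fixed block of reserve is offered regardless of the forecast. All figures and the 10%/6 MW parameters below are invented:

        # Illustration of proportional vs. constant splitting of available wind power
        # between the energy and primary reserve offers (all figures are invented).
        forecast_mw = [80.0, 55.0, 30.0]      # probabilistic forecast scenarios

        def proportional_split(p_avail, reserve_share=0.10):
            reserve = reserve_share * p_avail
            return p_avail - reserve, reserve

        def constant_split(p_avail, reserve_block=6.0):
            reserve = min(reserve_block, p_avail)
            return p_avail - reserve, reserve

        for p in forecast_mw:
            print(proportional_split(p), constant_split(p))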

  13. Adaptive intrusion data system (AIDS) software routines

    International Nuclear Information System (INIS)

    Corlis, N.E.

    1980-07-01

    An Adaptive Intrusion Data System (AIDS) was developed to collect information from intrusion alarm sensors as part of an evaluation system to improve sensor performance. AIDS is a unique digital data-compression, storage, and formatting system; it also incorporates a capability for video selection and recording for assessment of the sensors monitored by the system. The system is software reprogrammable to numerous configurations that may be used for the collection of environmental, bilevel, analog, and video data. This report describes the software routines that control the different AIDS data-collection modes, the diagnostic programs to test the operating hardware, and the data format. Sample data printouts are also included

  14. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading in line with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many concepts surrounding open source software, ranging from software that is free of charge to software that is unlicensed. Not everything said about open source software is true; it is therefore necessary to introduce the concept of open source software, covering its history, its licenses and how to choose a license, as well as the considerations involved in selecting from the open source software that is available. Keywords: License, Open Source, HAKI

  15. Quality assurance for software important to safety

    International Nuclear Information System (INIS)

    2000-01-01

    Software applications play an increasingly relevant role in nuclear power plant systems. This is particularly true of software important to safety used in both: calculations for the design, testing and analysis of nuclear reactor systems (design, engineering and analysis software); and monitoring, control and safety functions as an integral part of the reactor systems (monitoring, control and safety system software). Computer technology is advancing at a fast pace, offering new possibilities in nuclear reactor design, construction, commissioning, operation, maintenance and decommissioning. These advances also present new issues which must be considered both by the utility and by the regulatory organization. Refurbishment of ageing instrumentation and control systems in nuclear power plants and new safety related application areas have emerged, with direct (e.g. interfaces with safety systems) and indirect (e.g. operator intervention) implications for safety. Currently, there exist several international standards and guides on quality assurance for software important to safety. However, none of the existing documents provides comprehensive guidance to the developer, manager and regulator during all phases of the software life-cycle. The present publication was developed taking into account the large amount of available documentation, the rapid development of software systems and the need for updated guidance on "how to do it". It provides information and guidance for defining and implementing quality assurance programmes covering the entire life-cycle of software important to safety. Expected users are managers, performers and assessors from nuclear utilities, regulatory bodies, suppliers and technical support organizations involved with the development and use of software applied in nuclear power plants

  16. Helicopter precision approach capability using the Global Positioning System

    Science.gov (United States)

    Kaufmann, David N.

    1992-01-01

    The period between 1 July and 31 December, 1992, was spent developing a research plan as well as a navigation system document and flight test plan to investigate helicopter precision approach capability using the Global Positioning System (GPS). In addition, all hardware and software required for the research was acquired, developed, installed, and verified on both the test aircraft and the ground-based reference station.

  17. Image enhancement software for underwater recovery operations: User's manual

    Science.gov (United States)

    Partridge, William J.; Therrien, Charles W.

    1989-06-01

    This report describes software for performing image enhancement on live or recorded video images. The software was developed for operational use during underwater recovery operations at the Naval Undersea Warfare Engineering Station. The image processing is performed on an IBM-PC/AT compatible computer equipped with hardware to digitize and display video images. The software provides the capability to provide contrast enhancement and other similar functions in real time through hardware lookup tables, to automatically perform histogram equalization, to capture one or more frames and average them or apply one of several different processing algorithms to a captured frame. The report is in the form of a user manual for the software and includes guided tutorial and reference sections. A Digital Image Processing Primer in the appendix serves to explain the principle concepts that are used in the image processing.
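
    The report predates today's image libraries, but the histogram-equalization operation it describes can be sketched directly on an 8-bit frame via a lookup table; the input image below is synthetic, not from the recovery operations:

        # Histogram equalization of an 8-bit image via a lookup table, in the spirit
        # of the enhancement functions described above (input image is synthetic).
        import numpy as np

        def equalize(image):
            hist = np.bincount(image.ravel(), minlength=256)
            cdf = np.cumsum(hist).astype(np.float64)
            cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())   # normalize to [0, 1]
            lut = np.round(cdf * 255).astype(np.uint8)          # lookup table
            return lut[image]

        frame = np.clip(np.random.normal(90, 15, (240, 320)), 0, 255).astype(np.uint8)
        enhanced = equalize(frame)
        print(frame.min(), frame.max(), "->", enhanced.min(), enhanced.max())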

  18. A Study On Traditional And Evolutionary Software Development Models

    Directory of Open Access Journals (Sweden)

    Kamran Rasheed

    2017-07-01

    Full Text Available Today, computing technologies are becoming central to organizations and helpful to individuals; in addition to the computing device itself, we need software. A set of instructions, or a computer program, is known as software. Software is developed through traditional or newer, evolutionary models. Software development has become a key and successful business; without software, all hardware is useless. The collective steps performed in developing software are known as the software development life cycle (SDLC). There are adaptive and predictive models for developing software. Predictive models are planned in advance, such as the Waterfall, Spiral, Prototype and V-shaped models, while adaptive models include Agile and Scrum. The methodologies in both groups have their own procedures and steps. Predictive models are static and adaptive models are dynamic, meaning that changes cannot easily be made to predictive models while adaptive models have the capability to change. The purpose of this study is to become familiar with all of these models and to discuss their uses and development steps. This discussion will be helpful in deciding which model should be used in which circumstances and what development steps are included in each model.

  19. Managing Risk Areas in Software Development Offshoring: A CMMI Level 5 Case

    DEFF Research Database (Denmark)

    Persson, John Stouby; Schlichter, Bjarne Rerup

    2015-01-01

    Software companies are increasingly offshoring development to countries with high expertise at lower cost. Offshoring involves particular risk areas that, if ignored, increase the likelihood of failure. However, the offshoring client's maturity level may influence the management of these risk areas. ... Against this backdrop, we present an interpretive case study of how managers perceive and mitigate the risk areas in software development offshoring with a mature CMMI level 5 (Capability Maturity Model Integration) software company as the client. We find that managers perceive and mitigate most

  20. Use of Data Base Microcomputer Software in Descriptive Nursing Research

    OpenAIRE

    Chapman, Judy Jean

    1985-01-01

    Data base microcomputer software was used to design a file for data storage and retrieval in a qualitative nursing research project. The needs of 50 breast feeding mothers from birth to four months were studied. One thousand records with descriptive nursing data were entered into the file. The search and retrieval capability of data base software facilitated this qualitative research. The findings will be discussed in three areas: (1) infant concerns, (2) postpartum concerns, and (3) breast c...

  1. An assessment system for the system safety engineering capability maturity model in the case of spent fuel reprocessing

    International Nuclear Information System (INIS)

    Yang Xiaohua; Liu Zhenghai; Liu Zhiming; Wan Yaping; Bai Xiaofeng

    2012-01-01

    We can improve processes, evaluate capability, and promote user trust by using the system security engineering capability maturity model (SSE-CMM). SSE-CMM is the common method for organizing and implementing safety engineering, and it is a mature method for system safety engineering. Combining the capability maturity model (CMM) with total quality management and statistical theory, SSE-CMM turns systems security engineering into a well-defined, mature, measurable, advanced engineering discipline. Lack of domain knowledge, the size of data, the diversity of evidence, the cumbersomeness of processes, and the complexity of matching evidence with problems are the main issues that SSE-CMM assessment has to face. To effectively improve the efficiency of assessment of the spent fuel reprocessing system security engineering capability maturity model (SFR-SSE-CMM), in this paper we designed intelligent assessment software based on domain ontology that uses methods such as ontology, evidence theory, semantic web, intelligent information retrieval and intelligent auto-matching techniques. This software includes four subsystems, which are a domain ontology creation and management system, an evidence auto collection system, and a problem and evidence matching system. The architecture of the software is divided into five layers: a data layer, an ontology layer, a knowledge layer, a service layer and a presentation layer. (authors)

  2. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods have built upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information for both the typical features and special analysis capabilities which are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena

  3. A New Framework for Evaluating the Functional Capabilities of Intra-Enterprise Application Integration Technologies

    Directory of Open Access Journals (Sweden)

    Hossein Moradi

    2010-10-01

    Full Text Available Enterprise Application Integration (EAI) technologies facilitate the sharing of information and business processes of interrelated information systems in order to achieve the target integrated systems. Different EAI solutions and technologies provide various capabilities, which leads to the complexity of their evaluation process. To reduce this complexity, appropriate tools for evaluating the functional capabilities of EAI technologies are required. This paper proposes a new framework for evaluating the functional capabilities of EAI technologies, which simplifies the process of evaluating the functional capabilities of intra-enterprise integration technologies and solutions. The proposed framework for evaluating EAI technologies was enhanced using the structural and conceptual aspects of previous frameworks. It offers a new schema in which various EAI technologies are categorized in different classes and are evaluated based on their level of support for the functional integration capability criteria. The new framework offers two lists, containing integration technologies and their associated classifications, and the functional capabilities of integration technologies. The proposed framework is a novel one which can be used by information system experts for the evaluation and comparison of various integration technologies.

  4. Realising the SPECT capability of a rotating gamma camera: an alternative approach

    International Nuclear Information System (INIS)

    Morris, P.B.; Sloboda, R.S.; Malik, M.H.

    1984-01-01

    The present paper demonstrates that the SPECT capability of the GE 400T and DEC Gamma-11 combination can be realised without any additional hardware. It is shown that projection data can be collected using acquisition software which already exists as an integral part of the Gamma-11 system. A description of the software which was developed to perform the image reconstruction is also given. The results of two phantom studies verify the validity of the method, which is currently being used regularly in non-routine clinical investigations of the brain and liver. (author)

  5. SHINE Virtual Machine Model for In-flight Updates of Critical Mission Software

    Science.gov (United States)

    Plesea, Lucian

    2008-01-01

    This software is a new target for the Spacecraft Health Inference Engine (SHINE) knowledge base that compiles a knowledge base to a language called Tiny C - an interpreted version of C that can be embedded on flight processors. This new target allows portions of a running SHINE knowledge base to be updated on a "live" system without needing to halt and restart the containing SHINE application. This enhancement will directly provide this capability without the risk of software validation problems and can also enable complete integration of BEAM and SHINE into a single application. This innovation enables SHINE deployment in domains where autonomy is used during flight-critical applications that require updates. This capability eliminates the need for halting the application and performing potentially serious total system uploads before resuming the application with the loss of system integrity. This software enables additional applications at JPL (microsensors, embedded mission hardware) and increases the marketability of these applications outside of JPL.

  6. Reconfigurable, Cognitive Software-Defined Radio

    Science.gov (United States)

    Bhat, Arvind

    2015-01-01

    Software-defined radio (SDR) technology allows radios to be reconfigured to perform different communication functions without using multiple radios to accomplish each task. Intelligent Automation, Inc., has developed SDR platforms that switch adaptively between different operation modes. The innovation works by modifying both transmit waveforms and receiver signal processing tasks. In Phase I of the project, the company developed SDR cognitive capabilities, including adaptive modulation and coding (AMC), automatic modulation recognition (AMR), and spectrum sensing. In Phase II, these capabilities were integrated into SDR platforms. The reconfigurable transceiver design employs high-speed field-programmable gate arrays, enabling multimode operation and scalable architecture. Designs are based on commercial off-the-shelf (COTS) components and are modular in nature, making it easier to upgrade individual components rather than redesigning the entire SDR platform as technology advances.

  7. Offers

    CERN Multimedia

    Staff Association

    2014-01-01

    New offers : Discover the theater Galpon in Geneva. The Staff Association is happy to offer to its members a discount of 8.00 CHF on a full-price ticket (tickets of 15.00 CHF instead of 22.00 CHF) so do not hesitate anymore (mandatory reservation by phone + 4122 321  21 76 as tickets are quickly sold out!). For further information, please see our website: http://staff-association.web.cern.ch/fr/content/th%C3%A9%C3%A2tre-du-galpon  

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 20, Number 2

    National Research Council Canada - National Science Library

    Phillips, Mike; Craig, Rushby; Jackelen, George; Humphrey, Watts S; Konrad, Michael D; Over, James W; Pries-Heje, Jan; Johansen, Joern; Christiansen, Mads; Korsaa, Morten; Laporte, Claude Y; April, Alain; Renault, Alain

    2007-01-01

    ...: This article describes how the 309th Software Maintenance Group used Standard Capability Maturity Model Integration Appraisal Method for Process Improvement B to identify opportunities for additional...

  9. The TSO Logic and G2 Software Product

    Science.gov (United States)

    Davis, Derrick D.

    2014-01-01

    This internship assignment for spring 2014 was at John F. Kennedy Space Center (KSC), in NASA's Engineering and Technology (NE) group in support of the Control and Data Systems Division (NE-C) within the Systems Hardware Engineering Branch (NE-C4). The primary focus was system integration and benchmarking utilizing two separate computer software products. The first half of this 2014 internship was spent assisting NE-C4's Electronics and Embedded Systems Engineer, Kelvin Ruiz, and fellow intern Scott Ditto with the evaluation of a new piece of software called G2. It is developed by the Gensym Corporation and was introduced to the group as a tool used in monitoring launch environments. All fellow interns and employees of the G2 group have been working together in order to better understand the significance of the G2 application and how KSC can benefit from its capabilities. The second stage of this spring project was to assist with an ongoing integration of a benchmarking tool, developed by a group of engineers from a Canadian organization known as TSO Logic. Guided by NE-C4's Computer Engineer, Allen Villorin, NASA 2014 interns put forth great effort in helping to integrate TSO's software into the Spaceport Processing Systems Development Laboratory (SPSDL) for further testing and evaluation. The TSO Logic group claims that their software is designed for monitoring and reducing energy consumption at in-house server farms and large data centers, and allows data centers to control the power state of servers without impacting availability or performance and without changes to infrastructure; the focus of the assignment was to test this claim. TSO's founder and CEO, Aaron Rallo, and CTO, Chris Tivel, both came to KSC to assist with the installation of their software in the SPSDL laboratory. TSO's software was installed onto 24 individual workstations running three different operating systems. The workstations were divided into three groups of 8 with each group having its

  10. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    Directory of Open Access Journals (Sweden)

    Dominic Waithe

    Full Text Available We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for

  11. MSiReader v1.0: Evolving Open-Source Mass Spectrometry Imaging Software for Targeted and Untargeted Analyses

    Science.gov (United States)

    Bokhart, Mark T.; Nazari, Milad; Garrard, Kenneth P.; Muddiman, David C.

    2018-01-01

    A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is a free, open-source, and vendor-neutral software written in the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows the incorporation of complementary imaging modalities to be displayed with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added performance has not degraded, in fact it has been dramatically improved. These new tools and the improvements to the performance in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time. [Figure not available: see fulltext.

  12. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine in 1993 was performed using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adjustment to most diverse interests. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple and kappa statistics agreements. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.

  13. Special offers

    CERN Multimedia

    Staff Association

    2011-01-01

    Are you a member of the Staff Association? Did you know that as a member you can benefit from the following special offers: BCGE (Banque Cantonale de Genève): personalized banking solutions with preferential conditions. TPG: reduced rates on annual transport passes for active and retired staff. Aquaparc: reduced ticket prices for children and adults at this Swiss waterpark in Le Bouveret. FNAC: 5% reduction on FNAC vouchers. For more information about all these offers, please consult our web site: http://association.web.cern.ch/association/en/OtherActivities/Offers.html

  14. An effective technique for the software requirements analysis of NPP safety-critical systems, based on software inspection, requirements traceability, and formal specification

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Junbeom; Cha, Sung Deok; Yoo, Yeong Jae

    2005-01-01

    A thorough requirements analysis is indispensable for developing and implementing safety-critical software systems such as nuclear power plant (NPP) software systems because a single error in the requirements can generate serious software faults. However, it is very difficult to completely analyze system requirements. In this paper, an effective technique for the software requirements analysis is suggested. For requirements verification and validation (V and V) tasks, our technique uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and requirements traceability analysis are widely considered the most effective software V and V methods. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear fields as well as in other fields because of their mathematical nature. In this work, we propose an integrated environment (IE) approach for requirements, which is an integrated approach that enables easy inspection by combining requirement traceability and effective use of a formal method. The paper also introduces computer-aided tools for supporting IE approach for requirements. Called the nuclear software inspection support and requirements traceability (NuSISRT), the tool incorporates software inspection, requirement traceability, and formal specification capabilities. We designed the NuSISRT to partially automate software inspection and analysis of requirement traceability. In addition, for the formal specification and analysis, we used the formal requirements specification and analysis tool for nuclear engineering (NuSRS)

  15. Impacts of software and its engineering on the carbon footprint of ICT

    Energy Technology Data Exchange (ETDEWEB)

    Kern, Eva, E-mail: e.kern@umwelt-campus.de [Institute for Software Systems, Environmental Campus Birkenfeld, Campusallee, D-55761 Birkenfeld (Germany); Dick, Markus, E-mail: sustainablesoftwareblog@gmail.com [Fritz-Wunderlich-Straße 14, D-66869 Kusel (Germany); Naumann, Stefan, E-mail: s.naumann@umwelt-campus.de [Institute for Software Systems, Environmental Campus Birkenfeld, Campusallee, D-55761 Birkenfeld (Germany); Hiller, Tim, E-mail: tim.hiller@gmx.com [Institute for Software Systems, Environmental Campus Birkenfeld, Campusallee, D-55761 Birkenfeld (Germany)

    2015-04-15

    The energy consumption of information and communication technology (ICT) is still increasing. Even though several solutions regarding the hardware side of Green IT exist, the software contribution to Green IT is not well investigated. The carbon footprint is one way to rate the environmental impacts of ICT. In order to get an impression of the induced CO2 emissions of software, we will present a calculation method for the carbon footprint of a software product over its life cycle. We also offer an approach on how to integrate some aspects of carbon footprint calculation into software development processes and discuss impacts and tools regarding this calculation method. We thus show the relevance of energy measurements and the attention to impacts on the carbon footprint by software within Green Software Engineering.
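
    The abstract describes a life-cycle carbon footprint calculation for a software product without reproducing the formula here. A minimal sketch under assumed structure (the phase breakdown, energies and grid emission factor below are placeholders, not the authors' published model):

```python
# Rough life-cycle CO2 estimate for a software product: development + usage + end of life,
# each converted from energy (kWh) to kg CO2 via an assumed grid emission factor.
EMISSION_FACTOR_KG_PER_KWH = 0.5   # assumed average grid carbon intensity

def phase_emissions(energy_kwh, factor=EMISSION_FACTOR_KG_PER_KWH):
    """Convert energy consumed in one life-cycle phase into kg CO2."""
    return energy_kwh * factor

def software_carbon_footprint(dev_energy_kwh, energy_per_use_kwh, uses, disposal_energy_kwh=0.0):
    """Very rough life-cycle CO2 estimate (kg) for one software product."""
    development = phase_emissions(dev_energy_kwh)
    usage = phase_emissions(energy_per_use_kwh * uses)
    end_of_life = phase_emissions(disposal_energy_kwh)
    return development + usage + end_of_life

# Example: 2 MWh of development energy, 0.05 kWh per use, one million uses
print(software_carbon_footprint(2000, 0.05, 1_000_000))  # 26000.0 kg CO2
```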

  16. Impacts of software and its engineering on the carbon footprint of ICT

    International Nuclear Information System (INIS)

    Kern, Eva; Dick, Markus; Naumann, Stefan; Hiller, Tim

    2015-01-01

    The energy consumption of information and communication technology (ICT) is still increasing. Even though several solutions regarding the hardware side of Green IT exist, the software contribution to Green IT is not well investigated. The carbon footprint is one way to rate the environmental impacts of ICT. In order to get an impression of the induced CO2 emissions of software, we will present a calculation method for the carbon footprint of a software product over its life cycle. We also offer an approach on how to integrate some aspects of carbon footprint calculation into software development processes and discuss impacts and tools regarding this calculation method. We thus show the relevance of energy measurements and the attention to impacts on the carbon footprint by software within Green Software Engineering.

  17. THE INTERNATIONALIZATION OF THE SOFTWARE MARKET: OPPORTUNITIES AND CHALLENGES FOR BRAZILIAN COMPANIES

    Directory of Open Access Journals (Sweden)

    Oscar Roberto Burzynski

    2010-12-01

    Full Text Available This paper deals with the potential for internationalization of the Brazilian software industry from the perspective of software developers and service providers. The purpose of the study conducted was to better understand the way Brazilian software companies relate to the international software market by comparing the perceptions of entrepreneurs and those of government agency officials responsible for increasing Brazil's participation in the international software market. Data collection took place by means of semi-structured interviews with entrepreneurs and government agency officials. The data gathered was subjected to content analysis. Results show that Brazilian software companies perform poorly with regard to levels of exporting their products and services for a number of reasons, among which the most outstanding is that they still think that the internal market offers enough challenges and opportunities.

  18. RTSPM: real-time Linux control software for scanning probe microscopy.

    Science.gov (United States)

    Chandrasekhar, V; Mehta, M M

    2013-01-01

    Real time computer control is an essential feature of scanning probe microscopes, which have become important tools for the characterization and investigation of nanometer scale samples. Most commercial (and some open-source) scanning probe data acquisition software uses digital signal processors to handle the real time data processing and control, which adds to the expense and complexity of the control software. We describe here scan control software that uses a single computer and a data acquisition card to acquire scan data. The computer runs an open-source real time Linux kernel, which permits fast acquisition and control while maintaining a responsive graphical user interface. Images from a simulated tuning-fork based microscope as well as a standard topographical sample are also presented, showing some of the capabilities of the software.

  19. AWARE-P: a system-based software for urban water IAM planning

    OpenAIRE

    Coelho, S.T.; Vitorino, D.; Alegre, H.

    2013-01-01

    The AWARE-P IAM planning software offers a non-intrusive, web-based, collaborative integration environment for a wide variety of data and processes that may be relevant to the IAM decision-making process, including maps, GIS shapefiles and geodatabases; inventory records; work orders, maintenance, inspections/CCTV records; network models, performance indicators, asset valuation records, among others. The software provides an organized framework for evaluating and comparing planning alternativ...

  20. Open Data, Open Specifications and Free and Open Source Software: A powerful mix to create distributed Web-based water information systems

    Science.gov (United States)

    Arias, Carolina; Brovelli, Maria Antonia; Moreno, Rafael

    2015-04-01

    We are in an age when water resources are increasingly scarce and the impacts of human activities on them are ubiquitous. These problems don't respect administrative or political boundaries and they must be addressed integrating information from multiple sources at multiple spatial and temporal scales. Communication, coordination and data sharing are critical for addressing the water conservation and management issues of the 21st century. However, different countries, provinces, local authorities and agencies dealing with water resources have diverse organizational, socio-cultural, economic, environmental and information technology (IT) contexts that raise challenges to the creation of information systems capable of integrating and distributing information across their areas of responsibility in an efficient and timely manner. Tight and disparate financial resources, and dissimilar IT infrastructures (data, hardware, software and personnel expertise) further complicate the creation of these systems. There is a pressing need for distributed interoperable water information systems that are user friendly, easily accessible and capable of managing and sharing large volumes of spatial and non-spatial data. In a distributed system, data and processes are created and maintained in different locations each with competitive advantages to carry out specific activities. Open Data (data that can be freely distributed) is available in the water domain, and it should be further promoted across countries and organizations. Compliance with Open Specifications for data collection, storage and distribution is the first step toward the creation of systems that are capable of interacting and exchanging data in a seamlessly (interoperable) way. The features of Free and Open Source Software (FOSS) offer low access cost that facilitate scalability and long-term viability of information systems. The World Wide Web (the Web) will be the platform of choice to deploy and access these systems

  1. Helping organizations to address their effort toward the implementation of improvements in their software process

    OpenAIRE

    Muñoz-Mata, Mirna Ariadna; Mejia-Miranda, Jezreel; Valtierra-Alvarado, Claudia

    2015-01-01

    Due to the importance of Software Development Small and Medium Enterprises (SMEs) in the software industry, it is necessary to guarantee the quality of their products. In this context, the implementation of software process improvements offers an attractive way to achieve it. Unfortunately, the lack of knowledge on how to address the improvement effort makes the implementation of software improvements in SMEs a path full of obstacles, and most of the time impossible to achieve. In order to h...

  2. The architecture of a reliable software monitoring system for embedded software systems

    International Nuclear Information System (INIS)

    Munson, J.; Krings, A.; Hiromoto, R.

    2006-01-01

    We develop the notion of a measurement-based methodology for embedded software systems to ensure properties of reliability, survivability and security, not only under benign faults but under malicious and hazardous conditions as well. The driving force is the need to develop a dynamic run-time monitoring system for use in these embedded mission critical systems. These systems must run reliably, must be secure and they must fail gracefully. That is, they must continue operating in the face of departures from their nominal operating scenarios, the failure of one or more system components due to normal hardware and software faults, as well as malicious acts. To ensure the integrity of embedded software systems, the activity of these systems must be monitored as they operate. For each of these systems, it is possible to establish a very succinct representation of nominal system activity. Furthermore, it is possible to detect departures from the nominal operating scenario in a timely fashion. Such departures may be due to various circumstances, e.g., an assault from an outside agent, thus forcing the system to operate in an off-nominal environment for which it was neither tested nor certified, or a hardware/software component that has ceased to operate in a nominal fashion. A well-designed system will have the property of graceful degradation. It must continue to run even though some of the functionality may have been lost. This involves the intelligent re-mapping of system functions. Those functions that are impacted by the failure of a system component must be identified and isolated. Thus, a system must be designed so that its basic operations may be re-mapped onto system components still operational. That is, the mission objectives of the software must be reassessed in terms of the current operational capabilities of the software system. By integrating the mechanisms to support observation and detection directly into the design methodology, we propose to shift

  3. Behavior Tracking Software Enhancement and Integration of a Feedback Module, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Horizon Performance designed a Behavior Tracking Software System to collect crew member behavior throughout a mission, giving NASA the capability to monitor...

  4. PScan 1.0: flexible software framework for polygon based multiphoton microscopy

    Science.gov (United States)

    Li, Yongxiao; Lee, Woei Ming

    2016-12-01

    Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy offers significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, several flexible software platforms catering to custom-built microscopy systems (e.g., ScanImage, HelioScan, MicroManager) perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high-performance imaging card (Matrox Solios eA/XA), thus retaining high speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their high speed imaging systems.

  5. Astronomy Student Activities Using Stellarium Software

    Science.gov (United States)

    Benge, Raymond D.; Tuttle, S. R.

    2012-01-01

    Planetarium programs can be used to provide a valuable learning experience for introductory astronomy students. Educational activities can be designed to utilize the capabilities of the software to display the sky, coordinates, motions in the sky, etc., in order to learn basic astronomical concepts. Most of the major textbook publishers have an option of bundling planetarium software, and even laboratory activities using such software, with textbooks. However, commercial planetarium software is often updated on a different schedule from the textbook revision and new edition schedule. The software updates also sometimes occur out of sync with college textbook adoption deadlines. Changes in software and activity curriculum often translate into increased costs for students and the college. To provide stability to the process, faculty at Tarrant County College have developed a set of laboratory exercises, entitled Distant Nature, using the free, open-source Stellarium software. Stellarium is a simple, yet powerful, program that is available in formats that run on a variety of operating systems (Windows, Apple, Linux). A web site was developed for the Distant Nature activities, hosting a set version of Stellarium that students can download and install on their own computers. Also on the web site, students can access the instructions and worksheets associated with the various Stellarium-based activities. A variety of activities are available to support two semesters of introductory astronomy. The Distant Nature web site has been used for one year with Tarrant County College astronomy students and is now available for use by other institutions. The Distant Nature web site is http://www.stuttle1.com/DN_Astro/index.html.

  6. Imaging Sensor Flight and Test Equipment Software

    Science.gov (United States)

    Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa

    2007-01-01

    The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes

  7. Eprints Institutional Repository Software: A Review

    Directory of Open Access Journals (Sweden)

    Mike R. Beazley

    2011-01-01

    Full Text Available Setting up an institutional repository (IR) can be a daunting task. There are many software packages out there, some commercial, some open source, all of which offer different features and functionality. This article will provide some thoughts about one of these software packages: Eprints. Eprints was one of the first IR software packages to appear and has been available for 10 years. It is under continual development by its creators at the University of Southampton and the current version is v3.2.3. Eprints is open-source, meaning that anyone can download and make use of the software for free and the software can be modified however the user likes. This presents clear advantages for institutions with smaller budgets and also for institutions that have programmers on staff. Eprints requires some additional software to run: Linux, Apache, MySQL, and Perl. This software is all open-source and already present on the servers of many institutions. There is now a version of Eprints that will run on Windows servers as well, which will make the adoption of Eprints even easier for some. In brief, Eprints is an excellent choice for any institution looking to get an IR up and running quickly and easily. Installation is straightforward, as is the initial configuration. Once the IR is up and running, users may upload documents and provide the necessary metadata for the records by filling out a simple web form. Embargoes on published documents are handled elegantly by the software, and the software links to the SHERPA/RoMEO database so authors can easily verify their rights regarding IR submissions. Eprints has some drawbacks, which will be discussed later in the review, but on the whole it is easy to recommend to anyone looking to start an IR. However, it is less clear that an institution with an existing IR based on another software package should migrate to Eprints.

  8. Overview of the next generation of Fermilab collider software

    International Nuclear Information System (INIS)

    Hendricks, B.; Joshel, R.

    1992-01-01

    Fermilab is entering an era of operating a more complex collider facility. In addition, new operator workstations are available that have increased capabilities. The task of providing updated software in this new environment precipitated a project called Colliding Beam Software (CBS). It was soon evident that a new approach was needed for developing console software. Hence CBS, although a common acronym, is too narrow a description. A new generation of the application program subroutine library has been created to enhance the existing programming environment with a set of value added tools. Several key Collider applications were written that exploit CBS tools. This paper will discuss the new tools and the underlying change in methodology in application program development for accelerator control at Fermilab. (author)

  9. A Framework for Software-as-a-Service Selection and Provisioning

    OpenAIRE

    Badidi, Elarbi

    2013-01-01

    As cloud computing is increasingly transforming the information technology landscape, organizations and businesses are exhibiting strong interest in Software-as-a-Service (SaaS) offerings that can help them increase business agility and reduce their operational costs. They increasingly demand services that can meet their functional and non-functional requirements. Given the plethora and the variety of SaaS offerings, we propose, in this paper, a framework for SaaS provisioning, which relies o...

  10. Software Should be Written by Writers.

    Science.gov (United States)

    Sheridan, James

    1983-01-01

    Considering the computer as a collaborator rather than a machine, the author encourages those in the humanities and the arts to take advantage of the great potential that artificial intelligence can offer. The article stresses that, unless deliberately restricted, the computer is an inherently interdisciplinary medium, capable of interacting with any…

  11. Knowledge Architect : A Tool Suite for Managing Software Architecture Knowledge

    NARCIS (Netherlands)

    Liang, Peng; Jansen, Anton; Avgeriou, Paris

    2009-01-01

    Management of software architecture knowledge (AK) is vital for improving an organization’s architectural capabilities. To support the architecting process within our industrial partner: Astron, the Dutch radio astronomy institute, we implemented the Knowledge Architect (KA): a tool suite for

  12. Design Features and Capabilities of the First Materials Science Research Rack

    Science.gov (United States)

    Pettigrew, P. J.; Lehoczky, S. L.; Cobb, S. D.; Holloway, T.; Kitchens, L.

    2003-01-01

    The First Materials Science Research Rack (MSRR-1) aboard the International Space Station (ISS) will offer many unique capabilities and design features to facilitate a wide range of materials science investigations. The initial configuration of MSRR-1 will accommodate two independent Experiment Modules (EMs) and provide the capability for simultaneous on-orbit processing. The facility will provide the common subsystems and interfaces required for the operation of experiment hardware and accommodate telescience capabilities. MSRR-1 will utilize an International Standard Payload Rack (ISPR) equipped with an Active Rack Isolation System (ARIS) for vibration isolation of the facility.

  13. Characterization of Morphology using MAMA Software

    Energy Technology Data Exchange (ETDEWEB)

    Gravelle, Julie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-02

    from watching other users into the user guide, I believe that anyone who utilizes the software will be able to quickly understand the best way to analyze their image and use the tools the program offers to achieve useful results.

  14. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Energy Technology Data Exchange (ETDEWEB)

    Ashraf, H.; Bach, K.S.; Hansen, H. [Copenhagen University, Department of Radiology, Gentofte Hospital, Hellerup (Denmark); Hoop, B. de [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Shaker, S.B.; Dirksen, A. [Copenhagen University, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark); Prokop, M. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Radboud University Nijmegen, Department of Radiology, Nijmegen (Netherlands); Pedersen, J.H. [Copenhagen University, Department of Cardiothoracic Surgery RT, Rigshospitalet, Copenhagen (Denmark)

    2010-08-15

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)
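
    The inter-observer comparison reported above hinges on the relative difference between two volume readings of the same nodule, with >25% treated as a large discrepancy. A tiny illustrative sketch of one common way to compute that metric (dividing by the mean of the two readings is an assumption, not necessarily the study's exact definition):

```python
# Illustrative only: relative difference between two volume measurements of the same nodule.
def relative_volume_difference(v1_mm3, v2_mm3):
    """Absolute volume difference as a fraction of the mean of the two readings."""
    mean = (v1_mm3 + v2_mm3) / 2.0
    return abs(v1_mm3 - v2_mm3) / mean

# Two hypothetical readings of the same nodule obtained with different segmentation algorithms
print(relative_volume_difference(520.0, 700.0) > 0.25)  # True: flagged as a >25% discrepancy
```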

  15. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    International Nuclear Information System (INIS)

    Ashraf, H.; Bach, K.S.; Hansen, H.; Hoop, B. de; Shaker, S.B.; Dirksen, A.; Prokop, M.; Pedersen, J.H.

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  16. Special Offers

    CERN Multimedia

    Association du personnel

    2011-01-01

    Walibi Rhône-Alpes is open until 31 October. Reduced prices for children and adults at this French attraction park in Les Avenières. For more information about all these offers, please consult our web site: http://association.web.cern.ch/association/en/OtherActivities/Offers.html

  17. Enhancements to the Sentinel Fireball Network Video Software

    Science.gov (United States)

    Watson, Wayne

    2009-05-01

    The Sentinel Fireball Network that supports meteor imaging of bright meteors (fireballs) has been in existence for over ten years. Nearly five years ago it moved from gathering meteor data with a camera and VCR video tape to a fisheye lens attached to a hardware device, the Sentinel box, which allowed meteor data to be recorded on a PC operating under real-time Linux. In 2006, that software, sentuser, was made available on Apple, Linux, and Window operating systems using the Python computer language. It provides basic video and management functionality and a small amount of analytic software capability. This paper describes the new and attractive future features of the software, and, additionally, it reviews some of the research and networks from the past and present using video equipment to collect and analyze fireball data that have applicability to sentuser.

  18. Generic Software Architecture for Launchers

    Science.gov (United States)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

    The definition and reuse of a generic software architecture for launchers is not common, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega over the last decades); the real-time constraints (reactivity and determinism needs) are very demanding; and low levels of versatility are required (often implying an ad hoc development for each launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On Board Control Procedure, etc.). While some of these reasons remain valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of a generic software architecture which could be envisaged for future launchers, based on the previously described principles and supported by model-driven engineering and automatic code generation.

  19. How to use Open Source Software for Manage a Library System

    OpenAIRE

    Sumithchandra, Pandula

    2009-01-01

    Open source is an approach to the design, development, and distribution of software, offering practical accessibility to a software's source code. Some consider open source as one of various possible design approaches, while others consider it a critical strategic element of their operations. Before open source became widely adopted, developers and producers used a variety of phrases to describe the concept; the term open source gained popularity with the rise of the Internet, which provided ...

  20. CTMCONTROL: Addressing the MC/DC Objective for Safety-Critical Automotive Software

    OpenAIRE

    Mjeda, Anila; Hinchey, Mike

    2013-01-01

    International audience; We propose a method tailored to the requirements of safety-critical embedded automotive software, named CTMCONTROL. CTMCONTROL has a particular focus on the specification-based control logic of the system under test and offers improvements in testing coverage metrics over a classic method which is routinely used in industry. The proposed method targets the Modified Condition/Decision Coverage (MC/DC) objective for automotive safety-critical software. CTMCONTROL is va...

  1. Software package evaluation for the TJ-II Data Acquisition System

    International Nuclear Information System (INIS)

    Cremy, C.; Sanchez, E.; Portas, A.; Vega, J.

    1996-01-01

    The TJ-II Data Acquisition System (DAS) has to provide a user interface which allows the setup of sampling channels, discharge signal visualization and reduced data processing, all at run time. In addition, the DAS will provide a high-level software capability for signal analysis, processing and data visualization, either at run time or off line. A set of software packages including Builder Xcessory, X-Designer, ILOG Builder, Toolmaster, AVS 5, AVS/Express, PV-WAVE and Iris Explorer has been evaluated by the Data Acquisition Group of the Fusion Division. The software evaluation, summarized in this paper, resulted in a global solution being found which meets all of the DAS requirements. (Author)

  2. Special Offers

    CERN Multimedia

    Association du personnel

    2011-01-01

    Are you a member of the Staff Association? Did you know that as a member you can benefit from the following special offers: BCGE (Banque Cantonale de Genève): personalized banking solutions with preferential conditions. TPG: reduced rates on annual transport passes for active and retired staff. Aquaparc: reduced ticket prices for children and adults at this Swiss waterpark in Le Bouveret. Walibi: reduced prices for children and adults at this French attraction park in Les Avenières. FNAC: 5% reduction on FNAC vouchers. For more information about all these offers, please consult our web site: http://association.web.cern.ch/association/en/OtherActivities/Offers.html

  3. Control software of a variably polarizing undulator (APPLE type) for SX beamline in the SPring-8

    Energy Technology Data Exchange (ETDEWEB)

    Hiramatsu, Yoichi [Kansai Research Establishment, Japan Atomic Energy Research Institute, Mikazuki, Hyogo (Japan); Shimada, Taihei; Miyahara, Yoshikazu

    1999-12-01

    This paper describes the control software of a variably polarizing undulator (APPLE type) that was installed at the SX beamline (cell number 23) in the SPring-8 storage ring in February, 1998. This undulator produces polarized radiation in the soft X-ray energy range by changing the gap distance between two pairs of permanent magnet arrays (gap movement). The main characteristic of the undulator is its capability to generate right and left circular polarization alternately with a period of 2 sec (0.5 Hz) by high-speed phase-shifting (periodic phase movement). The developed software makes a fast correction of the closed orbit distortion (COD) of the electron beam by exciting steering magnets at a time interval of 24 msec (42 Hz) during the movement of the magnet arrays. Also, the software is capable of putting these magnet arrays into a constant periodic phase movement with an error of less than 0.1% for the 2-sec period. The software was developed in accordance with the SPring-8 standard for software development. (author)

  4. Towards a capability approach to careers: Applying Amartya Sen's thinking

    OpenAIRE

    Robertson, Peter.

    2015-01-01

    Amartya Sen’s capability approach characterizes an individual’s well-being in terms of what they are able to be, and what they are able to do. This framework for thinking has many commonalities with the core ideas in career guidance. Sen’s approach is abstract and not in itself a complete or explanatory theory, but a case can be made that the capability approach has something to offer career theory when combined with a life-career developmental approach. It may also suggest ways of working th...

  5. Status of REBUS fuel management software development for RERTR applications

    International Nuclear Information System (INIS)

    Olson, Arne P.

    2000-01-01

    The REBUS-5 burnup code has evolved substantially in order to meet the needs of the ANL RERTR Program. This paper presents a summary of the past changes and improvements in the capabilities of this software, and also identifies future plans. (author)

  6. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    International Nuclear Information System (INIS)

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M and S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactor and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years, however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting into place formal licenses and contributor agreements for NEAMS software which doesn't already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate would be associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  7. PT-SAFE: a software tool for development and annunciation of medical audible alarms.

    Science.gov (United States)

    Bennett, Christopher L; McNeer, Richard R

    2012-03-01

    Recent reports by The Joint Commission as well as the Anesthesia Patient Safety Foundation have indicated that medical audible alarm effectiveness needs to be improved. Several recent studies have explored various approaches to improving audible alarms, motivating the authors to develop real-time software capable of comparing such alarms. We sought to devise software that would allow for the development of a variety of audible alarm designs and that could also integrate into existing operating room equipment configurations. The software is meant to be used as a tool for alarm researchers to quickly evaluate novel alarm designs. A software tool was developed for the purpose of creating and annunciating audible alarms. The alarms consisted of annunciators that were mapped to vital sign data received from a patient monitor. An object-oriented approach to software design was used to create a tool that is flexible and modular at run-time, can annunciate wave-files from disk, and can be programmed with MATLAB by the user to create custom alarm algorithms. The software was tested in a simulated operating room to measure technical performance and to validate the time-to-annunciation against existing equipment alarms. The software tool showed efficacy in a simulated operating room environment by providing alarm annunciation in response to physiologic and ventilator signals generated by a human patient simulator, on average 6.2 seconds faster than existing equipment alarms. Performance analysis showed that the software was capable of supporting up to 15 audible alarms on a mid-grade laptop computer before audio dropouts occurred. These results suggest that this software tool provides a foundation for rapidly staging multiple audible alarm sets from the laboratory to a simulation environment for the purpose of evaluating novel alarm designs, thus producing valuable findings for medical audible alarm standardization.
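
    As a rough illustration of the mapping described above (annunciators driven by vital-sign data), the following sketch uses invented names, thresholds and data structures; it is not PT-SAFE's actual MATLAB interface:

```python
# Hypothetical sketch: map incoming vital-sign samples to audible-alarm annunciations.
from dataclasses import dataclass

@dataclass
class Annunciator:
    name: str            # e.g. a wave-file or synthesized tone to play
    predicate: callable  # returns True when the alarm condition holds

def check_alarms(vitals, annunciators):
    """Return the names of all annunciators whose condition is met by this sample."""
    return [a.name for a in annunciators if a.predicate(vitals)]

alarms = [
    Annunciator("spo2_low.wav", lambda v: v["spo2"] < 90),
    Annunciator("hr_high.wav",  lambda v: v["heart_rate"] > 120),
    Annunciator("apnea.wav",    lambda v: v["resp_rate"] == 0),
]

sample = {"spo2": 87, "heart_rate": 128, "resp_rate": 14}   # simulated monitor output
print(check_alarms(sample, alarms))  # ['spo2_low.wav', 'hr_high.wav']
```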

  8. Innovative gas offers

    International Nuclear Information System (INIS)

    Sala, O.; Mela, P.; Chatelain, F.

    2007-01-01

    New energy offers are progressively made available as the opening of gas market to competition becomes broader. How are organized the combined offers: gas, electricity, renewable energies and energy services? What are the marketing strategies implemented? Three participants at this round table present their offer and answer these questions. (J.S.)

  9. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suitable for solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe its straightforward evolution, driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.) [de]

  10. Sound pressure level tools design used in occupational health by means of Labview software

    Directory of Open Access Journals (Sweden)

    Farhad Forouharmajd

    2015-01-01

    Conclusion: LabVIEW's sound-related programming capabilities include sound measurement, frequency analysis, and sound control; in effect, the software acts as a sound level meter and sound analyzer. Given these features, the software can be used to analyze and process sound and vibration as a monitoring system.
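
    A software sound level meter of the kind described ultimately evaluates the standard dB SPL formula, SPL = 20·log10(p_rms / 20 µPa). A minimal sketch of that calculation (standard acoustics, not the authors' LabVIEW code):

```python
# Compute the sound pressure level (dB SPL re 20 µPa) of a block of pressure samples.
import math

P_REF = 20e-6  # reference sound pressure in pascals

def sound_pressure_level(samples_pa):
    """Return the sound pressure level in dB SPL for a block of pressure samples (Pa)."""
    rms = math.sqrt(sum(p * p for p in samples_pa) / len(samples_pa))
    return 20.0 * math.log10(rms / P_REF)

# A 1 Pa RMS sine wave (1 kHz sampled at 48 kHz) corresponds to roughly 94 dB SPL
sine = [math.sqrt(2) * math.sin(2 * math.pi * 1000 * n / 48000) for n in range(4800)]
print(round(sound_pressure_level(sine), 1))  # ~94.0
```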

  11. Offer

    CERN Multimedia

    Staff Association

    2016-01-01

    CERN was selected and participated in the ranking "Best Employers" organized by the magazine Bilan. To thank CERN for its collaboration, the magazine offers a reduction to the subscription fee for all employed members of personnel. 25% off the annual subscription: CHF 149.25 instead of CHF 199.— The subscription includes the magazine delivered to your home for a year, every other Wednesday, as well as special editions and access to the e-paper. To benefit from this offer, simply fill out the form provided for this purpose. To get the form, please contact the secretariat of the Staff Association (Staff.Association@cern.ch).

  12. SOFTWARE PROCESS ASSESSMENT AND IMPROVEMENT USING MULTICRITERIA DECISION AIDING - CONSTRUCTIVIST

    Directory of Open Access Journals (Sweden)

    Leonardo Ensslin

    2012-12-01

    Full Text Available Software process improvement and software process assessment have received special attention since the 1980s. Some models have been created, but these models rest on a normative approach, where the decision-maker’s participation in a software organization is limited to understanding which process is more relevant to each organization. The proposal of this work is to present the MCDA-C as a constructivist methodology for software process improvement and assessment. The methodology makes it possible to visualize the criteria that must be taken into account according to the decision-makers’ values in the process improvement actions, making it possible to rank actions in the light of specific organizational needs. This process helped the manager of the company studied to focus on and prioritize process improvement actions. This paper offers an empirical understanding of the application of performance evaluation to software process improvement and identifies complementary tools to the normative models presented today.

  13. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thus to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination and perform the analysis on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages lie in completeness and complexity
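
    The index-based comparison of SSA technique combinations described above can be pictured as a weighted multi-criteria score. The sketch below is illustrative only; the weights and per-index scores are invented, not the study's data:

```python
# Weighted multi-criteria comparison of software safety analysis (SSA) combinations.
INDEXES = ["dynamic capability", "completeness", "achievability", "detail",
           "signal/noise ratio", "complexity", "implementation cost"]

def score(combination_scores, weights):
    """Weighted sum of per-index scores (higher is better) for one SSA combination."""
    return sum(weights[i] * combination_scores[i] for i in INDEXES)

weights = {i: 1.0 for i in INDEXES}   # equal weighting as a neutral starting point
candidates = {
    "PHA+FMEA+FTA+Markov": {"dynamic capability": 2, "completeness": 4, "achievability": 2,
                            "detail": 4, "signal/noise ratio": 3, "complexity": 2,
                            "implementation cost": 3},
    "DFM+simulation":      {"dynamic capability": 5, "completeness": 3, "achievability": 4,
                            "detail": 4, "signal/noise ratio": 4, "complexity": 2,
                            "implementation cost": 2},
}
best = max(candidates, key=lambda name: score(candidates[name], weights))
print(best)
```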

  14. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to

  15. Competing Values in Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Nielsen, Peter Axel

    2013-01-01

    Purpose The purpose of the article is to investigate the impact of organizational culture on software process improvement (SPI). Is cultural congruence between an organization and an adopted process model required? How can the level of congruence between an organizational culture and the values...... and assumptions underlying an adopted process model be assessed? How can cultural incongruence be managed to facilitate success of software process improvement? Design/methodology/approach The competing values framework and its associated assessment instrument are used in a case study to establish......-step process, SPI managers establish and compare culture profiles and decide how to address identified problems. To that end the text analysis technique is offered as a web service that allows for analysis of all text-based process models and standards, and of internal process documentation. Originality...

  16. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of the ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, the DAS is fully maintained and updated for new OS and library releases. Nonetheless, once a mission goes into the "legacy" phase, funds are very limited and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: PIA for ISO (1995), SAS for XMM-Newton (1999), HIPE for Herschel (2009), and EXIA for EXOSAT (1983). The following goals have guided the architecture: support for all operations, post-operations and archive/legacy phases; support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon AWS); support for expert users, requiring full capabilities; and provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt in this project.

  17. Software/hardware distributed processing network supporting the Ada environment

    Science.gov (United States)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC for processing, VHSIC ASICs for high speed, reliable, inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  18. The development of capability measures in health economics: opportunities, challenges and progress.

    Science.gov (United States)

    Coast, Joanna; Kinghorn, Philip; Mitchell, Paul

    2015-04-01

    Recent years have seen increased engagement amongst health economists with the capability approach developed by Amartya Sen and others. This paper focuses on the capability approach in relation to the evaluative space used for analysis within health economics. It considers the opportunities that the capability approach offers in extending this space, but also the methodological challenges associated with moving from the theoretical concepts to practical empirical applications. The paper then examines three 'families' of measures, Oxford Capability instruments (OxCap), Adult Social Care Outcome Toolkit (ASCOT) and ICEpop CAPability (ICECAP), in terms of the methodological choices made in each case. The paper concludes by discussing some of the broader issues involved in making use of the capability approach in health economics. It also suggests that continued exploration of the impact of different methodological choices will be important in moving forward.

  19. A Study on the Quantitative Assessment Method of Software Requirement Documents Using Software Engineering Measures and Bayesian Belief Networks

    International Nuclear Information System (INIS)

    Eom, Heung Seop; Kang, Hyun Gook; Park, Ki Hong; Kwon, Kee Choon; Chang, Seung Cheol

    2005-01-01

    some important software engineering measures with respect to their capability at predicting software reliability and we could utilize the results of the study in improving our methodology

  20. The Cementitious Barriers Partnership (CBP) Software Toolbox Capabilities In Assessing The Degradation Of Cementitious Barriers

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. P. [Savannah River Site (SRS), Aiken, SC (United States); Burns, H. H. [Savannah River Site (SRS), Aiken, SC (United States); Langton, C. [Savannah River Site (SRS), Aiken, SC (United States); Smith, F. G. III [Savannah River Site (SRS), Aiken, SC (United States); Brown, K. G. [Vanderbilt University, Nashville, TN (United States); Kosson, D. S. [Vanderbilt University, Nashville, TN (United States); Garrabrants, A. C. [Vanderbilt University, Nashville, TN (United States); Sarkar, S. [Vanderbilt University, Nashville, TN (United States); van der Sloot, H. [Hans van der Sloot Consultancy (The Netherlands); Meeussen, J. C.L. [Nuclear Research and Consultancy Group, Petten (The Netherlands); Samson, E. [SIMCO Technologies Inc. , 1400, boul. du Parc - Technologique , Suite 203, Quebec (Canada); Mallick, P. [United States Department of Energy, 1000 Independence Ave. SW , Washington, DC (United States); Suttora, L. [United States Department of Energy, 1000 Independence Ave. SW , Washington, DC (United States); Esh, D. W. [U .S. Nuclear Regulatory Commission , Washington, DC (United States); Fuhrmann, M. J. [U .S. Nuclear Regulatory Commission , Washington, DC (United States); Philip, J. [U .S. Nuclear Regulatory Commission , Washington, DC (United States)

    2013-01-11

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste and Nuclear Materials Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to 100 years and longer for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox has produced tangible benefits to the DOE Performance Assessment (PA) community. A review of prior DOE PAs has provided a list of potential opportunities for improving cementitious barrier performance predictions through the use of the CBP software tools. These opportunities include: 1) impact of atmospheric exposure to concrete and grout before closure, such as accelerated slag and Tc-99 oxidation, 2) prediction of changes in Kd/mobility as a function of time that result from changing pH and redox conditions, 3) concrete degradation from rebar corrosion due to carbonation, 4) early age cracking from drying and/or thermal shrinkage and 5) degradation due to sulfate attack. The CBP has already had opportunity to provide near-term, tangible support to ongoing DOE-EM PAs such as the Savannah River Saltstone Disposal Facility (SDF) by providing a sulfate attack analysis that predicts the extent and damage that sulfate ingress will have on the concrete vaults over extended time (i.e., > 1000 years). This analysis is one of the many technical opportunities in cementitious barrier performance that can be addressed by the DOE-EM sponsored CBP software

  1. Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

    CERN Document Server

    The ENVISION Collaboration

    2014-01-01

    Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

  2. A Roadmap for NEAMS Capability Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Bernholdt, David E [ORNL

    2011-11-01

    The vision of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program is to bring truly predictive modeling and simulation (M&S) capabilities to the nuclear engineering community in order to enable a new approach to the design and analysis of nuclear energy systems. From its inception, the NEAMS program has always envisioned a broad user base for its software and scientific products, including researchers within the DOE complex, nuclear industry technology developers and vendors, and operators. However activities to date have focused almost exclusively on interactions with NEAMS sponsors, who are also near-term users of NEAMS technologies. The task of the NEAMS Capability Transfer (CT) program element for FY2011 is to develop a comprehensive plan to support the program's needs for user outreach and technology transfer. In order to obtain community input to this plan, a 'NEAMS Capability Transfer Roadmapping Workshop' was held 4-5 April 2011 in Chattanooga, TN, and is summarized in this report. The 30 workshop participants represented the NEAMS program, the DOE and industrial user communities, and several outside programs. The workshop included a series of presentations providing an overview of the NEAMS program and presentations on the user outreach and technology transfer experiences of (1) The Advanced Simulation and Computing (ASC) program, (2) The Standardized Computer Analysis for Licensing Evaluation (SCALE) project, and (3) The Consortium for Advanced Simulation of Light Water Reactors (CASL), followed by discussion sessions. Based on the workshop and other discussions throughout the year, we make a number of recommendations of key areas for the NEAMS program to develop the user outreach and technology transfer activities: (1) Engage not only DOE, but also industrial users sooner and more often; (2) Engage with the Nuclear Regulatory Commission to facilitate their understanding and acceptance of NEAMS approach to predictive M&S; (3

  3. Advanced Query and Data Mining Capabilities for MaROS

    Science.gov (United States)

    Wang, Paul; Wallick, Michael N.; Allard, Daniel A.; Gladden, Roy E.; Hy, Franklin H.

    2013-01-01

    The Mars Relay Operational Service (MaROS) comprises a number of tools to coordinate, plan, and visualize various aspects of the Mars Relay network. These tools operate at several levels of the software architecture, including a Web-based user interface, a back-end "ReSTlet" built in Java, and databases that store the data as they are received from the network. As part of MaROS, the innovators have developed and implemented a feature set that operates on several levels of the software architecture. This new feature is an advanced querying capability, available through either the Web-based user interface or a back-end REST interface, to access all of the data gathered from the network. This software is not meant to replace the REST interface, but to augment and expand the range of available data. The current REST interface provides specific data that are used by the MaROS Web application to display and visualize the information; however, the information returned from the REST interface has typically been pre-processed to return only a subset of the entire information within the repository, particularly the information that is of interest to the GUI (graphical user interface). The new, advanced query and data mining capabilities allow users to retrieve the raw data and/or to perform their own data processing. The query language used to access the repository is a restricted subset of the structured query language (SQL) that can be built safely from the Web user interface, or entered as freeform SQL by a user. The results are returned in a CSV (Comma Separated Values) format for easy exporting to third-party tools and applications that can be used for data mining or user-defined visualization and interpretation. This is the first time that a service is capable of providing access to all cross-project relay data from a single Web resource. Because MaROS contains the data for a variety of missions from the Mars network, which span both NASA and ESA, the software also establishes an access control list (ACL) on each data record
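    The record does not spell out how the restricted SQL subset is enforced; the sketch below is only a hypothetical illustration of the idea (validate that a submitted query is a single read-only SELECT on a known table, then return the result as CSV). The table name relay_passes, the sqlite3 backend, and all function names are assumptions for illustration, not part of MaROS.

    # Hypothetical sketch (not the MaROS implementation): accept only a very
    # restricted SQL subset -- a single SELECT on a known table -- and return
    # the result as CSV for downstream data-mining tools.
    import csv
    import io
    import re
    import sqlite3

    ALLOWED_TABLES = {"relay_passes"}  # assumed table name, for illustration only

    def is_safe_query(sql: str) -> bool:
        """Reject anything other than a single SELECT on an allowed table."""
        stripped = sql.strip().rstrip(";")
        if ";" in stripped:                      # no stacked statements
            return False
        match = re.match(r"(?is)^select\s.+\sfrom\s+(\w+)", stripped)
        return bool(match) and match.group(1).lower() in ALLOWED_TABLES

    def query_to_csv(conn: sqlite3.Connection, sql: str) -> str:
        if not is_safe_query(sql):
            raise ValueError("query outside the permitted SQL subset")
        cursor = conn.execute(sql)
        buffer = io.StringIO()
        writer = csv.writer(buffer)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())
        return buffer.getvalue()

    # Example use (hypothetical database):
    #   conn = sqlite3.connect("relay.db")
    #   print(query_to_csv(conn, "SELECT * FROM relay_passes"))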

  4. Intercultural Competence in International Software R&D Cooperation. Toward a Conceptual Framework

    DEFF Research Database (Denmark)

    Skaates, Maria Anne

    2001-01-01

    As part of a research project on cooperation between software development subcontractors from small countries and foreign customers, the dynamics of intercultural competence are being examined. This paper builds a conceptual bridge by developing a definition of organizational intercultural....... It is envisioned that the presented novel framework could be helpful to software developing subcontractors from small national states who already use the competence terminology in discussions of their firms' capabilities and strategies....

  5. Special Offers

    CERN Multimedia

    Association du personnel

    2011-01-01

    Are you a member of the Staff Association? Did you know that as a member you can benefit from the following special offers: BCGE (Banque Cantonale de Genève): personalized banking solutions with preferential conditions. TPG: reduced rates on annual transport passes for active and retired staff. Aquaparc: reduced ticket prices for children and adults at this Swiss waterpark in Le Bouveret. Walibi: reduced prices for children and adults at this French attraction park in Les Avenières. FNAC: 5% reduction on FNAC vouchers. For more information about all these offers, please consult our web site: http://association.web.cern.ch/association/en/OtherActivities/Offers.html

  6. Special Offers

    CERN Multimedia

    Staff Association

    2011-01-01

    Are you a member of the Staff Association? Did you know that as a member you can benefit from the following special offers: BCGE (Banque Cantonale de Genève): personalized banking solutions with preferential conditions. TPG: reduced rates on annual transport passes for all active and retired staff. Aquaparc: reduced ticket prices for children and adults at this Swiss waterpark in Le Bouveret. Walibi: reduced prices for children and adults at this French attraction park in Les Avenières. FNAC: 5% reduction on FNAC vouchers. For more information about all these offers, please consult our web site: http://association.web.cern.ch/association/en/OtherActivities/Offers.html

  7. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and maintenance. We proposed that some software engineering principles can be incorporated into the introductory-level of the computer science curriculum. Our vision is to give community college students a broader exposure to the software development lifecycle. For those students who plan to transfer to a baccalaureate program subsequent to their community college education, our vision is to prepare them sufficiently to move seamlessly into mainstream computer science and software engineering degrees. For those students who plan to move from the community college to a programming career, our vision is to equip them with the foundational knowledge and skills required by the software industry. To accomplish our goals, we developed curriculum modules for teaching seven of the software engineering knowledge areas within current computer science introductory-level courses. Each module was designed to be self-supported with suggested learning objectives, teaching outline, software tool support, teaching activities, and other material to assist the instructor in using it.

  8. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required, given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  9. A Platform for the Development and the Validation of HW IP Components Starting from Reference Software Specifications

    Directory of Open Access Journals (Sweden)

    2009-02-01

    Full Text Available Signal processing algorithms become more and more efficient as a result of the development of new standards. This is particularly true in the field of video compression. However, with each improvement in efficiency and functionality, the complexity of the algorithms also increases. Textual specifications, which in the past were the original form of specification, have been replaced by reference software, which has become the starting point of any design flow leading to implementation. Therefore, designing an embedded application has become equivalent to porting generic software onto a possibly heterogeneous embedded platform. This operation is becoming more and more difficult because of the increased algorithm complexity and the wide range of architectural solutions. This paper describes a new platform aiming at supporting a step-by-step mapping of reference software (i.e., generic and nonoptimized software) into software and hardware implementations. The platform provides a seamless interface between the software and hardware environments, with profiling capabilities for the analysis of data transfers between hardware and software. Such profiling capabilities help the designer to achieve different implementations aiming at specific objectives, such as the optimization of hardware processing resources or of the memory architectures, or the minimization of data transfers to reach low-power designs.

  10. Book Review: The Software IP Detective's Handbook: Measurement, Comparison, and Infringement Detections

    Directory of Open Access Journals (Sweden)

    Diane Barrett

    2012-03-01

    Full Text Available Zeidman, B. (2011). The Software IP Detective's Handbook: Measurement, Comparison, and Infringement Detection. Boston, MA: Pearson Education, Inc. 480 pages, ISBN-10: 0137035330; ISBN-13: 978-0137035335, US$49.99. Reviewed by Diane Barrett, American Military University. Do not let the book title fool you into thinking that the book is only for those looking to detect software infringement. It is a comprehensive look at software intellectual property. The book covers a wide range of topics and has something to offer for just about everyone, from lawyers to programmers. (see PDF for full review)

  11. Trends in Literacy Software Publication and Marketing: Multicultural Themes.

    Science.gov (United States)

    Balajthy, Ernest

    This article provides data and discussion of multicultural theme-related issues arising from analysis of a detailed database of commercial software products targeted to reading and literacy education. The database consisted of 1152 titles, representing the offerings of 104 publishers and distributors. Of the titles, 62 were identified as having…

  12. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those.

  13. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software was developed using LabVIEW for use with a StellarNet spectrometer system that communicates with the computer over USB. LabVIEW has capabilities in hardware interfacing, graphical user interfaces and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time and then processes them for analysis. Several measurements and analyses of solar radiation have been performed. Solar radiation consists mainly of infrared, visible and ultraviolet light. From solar radiation spectrum data, information on weather and plant suitability can be gathered and analyzed. Furthermore, optimization of the utilization of, and safety precautions for, solar radiation can be planned. Using this software, further research and development in the utilization and safety of solar radiation can be explored. (author)
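    The record does not give the analysis details; the following minimal sketch only illustrates one typical processing step for spectra of this kind (integrating measured spectral irradiance over ultraviolet, visible and infrared bands). The band limits, units and function names are assumptions for illustration, not taken from the software described above.

    # Minimal sketch (assumptions: wavelengths in nm, spectral irradiance in
    # W/m^2/nm, already read from the spectrometer).  Integrates the spectrum
    # over assumed UV, visible and IR bands with the trapezoidal rule.
    import numpy as np

    def band_irradiance(wavelength_nm, irradiance, low, high):
        wavelength_nm = np.asarray(wavelength_nm)
        irradiance = np.asarray(irradiance)
        mask = (wavelength_nm >= low) & (wavelength_nm < high)
        return float(np.trapz(irradiance[mask], wavelength_nm[mask]))

    def summarize_spectrum(wavelength_nm, irradiance):
        # Band limits are illustrative placeholders, not the program's values.
        return {
            "UV (280-400 nm)":      band_irradiance(wavelength_nm, irradiance, 280, 400),
            "Visible (400-700 nm)": band_irradiance(wavelength_nm, irradiance, 400, 700),
            "IR (700-1100 nm)":     band_irradiance(wavelength_nm, irradiance, 700, 1100),
        }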

  14. 24 CFR 970.11 - Procedures for the offer of sale to established eligible organizations.

    Science.gov (United States)

    2010-04-01

    24 CFR 970.11 (Housing and Urban Development, edition of 2010-04-01): Procedures for the offer of sale to established eligible organizations. The indexed excerpt lists required elements of an offer, including "... capabilities; (6) A plan for financing the purchase of the property and a firm financial commitment as stated..."

  15. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  16. Three-dimensional imaging technology offers promise in medicine.

    Science.gov (United States)

    Karako, Kenji; Wu, Qiong; Gao, Jianjun

    2014-04-01

    Medical imaging plays an increasingly important role in the diagnosis and treatment of disease. Currently, medical equipment mainly has two-dimensional (2D) imaging systems. Although this conventional imaging largely satisfies clinical requirements, it cannot depict pathologic changes in 3 dimensions. The development of three-dimensional (3D) imaging technology has encouraged advances in medical imaging. Three-dimensional imaging technology offers doctors much more information on a pathology than 2D imaging, thus significantly improving diagnostic capability and the quality of treatment. Moreover, the combination of 3D imaging with augmented reality significantly improves surgical navigation process. The advantages of 3D imaging technology have made it an important component of technological progress in the field of medical imaging.

  17. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    Science.gov (United States)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in it is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult
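    NDAS itself is written in LabVIEW on top of the Actor Framework; the short Python sketch below is only an illustration of the general actor pattern that the abstract credits for modularity and code reuse (independent components that each own a message queue and react to messages), with all class names and messages invented for the example.

    # Illustrative actor-style sketch (not the NDAS/LabVIEW code): each actor
    # owns an inbox and processes messages independently of the others.
    import queue
    import threading

    class Actor:
        def __init__(self):
            self.inbox = queue.Queue()
            threading.Thread(target=self._run, daemon=True).start()

        def _run(self):
            while True:
                message, payload = self.inbox.get()
                if message == "stop":
                    break
                self.handle(message, payload)

        def handle(self, message, payload):
            raise NotImplementedError

    class RecorderActor(Actor):
        """Hypothetical actor that records incoming samples to a list."""
        def __init__(self):
            self.samples = []
            super().__init__()

        def handle(self, message, payload):
            if message == "sample":
                self.samples.append(payload)

    # Example use: recorder = RecorderActor(); recorder.inbox.put(("sample", 3.14))
    # Shut down with: recorder.inbox.put(("stop", None))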

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  19. DERIVING 3D POINT CLOUDS FROM TERRESTRIAL PHOTOGRAPHS - COMPARISON OF DIFFERENT SENSORS AND SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Niederheiser

    2016-06-01

    Full Text Available Terrestrial photogrammetry nowadays offers a reasonably cheap, intuitive and effective approach to 3D modelling. However, the important choice of which sensor and which software to use is not straightforward and needs consideration, as the choice will affect the resulting 3D point cloud and its derivatives. We compare five different sensors as well as four different state-of-the-art software packages for a single application, the modelling of a vegetated rock face. The five sensors represent different resolutions, sensor sizes and price segments of cameras. The software packages used are: (1) Agisoft PhotoScan Pro (1.16), (2) Pix4D (2.0.89), (3) a combination of Visual SFM (V0.5.22) and SURE (1.2.0.286), and (4) MicMac (1.0). We took photos of a vegetated rock face from identical positions with all sensors. Then we compared the results of the different software packages regarding the ease of the workflow, visual appeal, similarity and quality of the point cloud. While PhotoScan and Pix4D offer the most user-friendly workflows, they are also “black-box” programmes giving only little insight into their processing. Unsatisfying results may only be changed by modifying settings within a module. The combined workflow of Visual SFM, SURE and CloudCompare is just as simple but requires more user interaction. MicMac turned out to be the most challenging software, as it is less user-friendly. However, MicMac offers the most possibilities to influence the processing workflow. The resulting point clouds of PhotoScan and MicMac are the most appealing.

  20. Essential Features for a Scholarly Journal Content Management and Peer Review Software

    Directory of Open Access Journals (Sweden)

    Fatima Sheikh Shoaie

    2010-03-01

    Full Text Available   The present study investigates the software used by scientific journals for content management and peer review, in order to identify the essential features. These software packages are analyzed and presented in tabular format. A questionnaire was prepared and submitted to a panel composed of 15 referees, editors-in-chief, software designers and researchers. The essential features for software managing the review process were divided into three groups, with populations of 10-15, 5-10 and 0-5 respectively. The majority of peer review process software features, in the view of the panelists, fell into the group of features with a population of 10-15. Finally, it should be said that the features represented by the first group must be taken into account when designing or purchasing peer review software. The second tier of features (with a population of 5-10) is recommended depending on the journal's status and capabilities. The third tier of features was altogether discounted due to low population.

  1. Making Software What Really Works, and Why We Believe It

    CERN Document Server

    Oram, Andy

    2010-01-01

    Many claims are made about how certain tools, technologies, and practices improve software development. But which claims are verifiable, and which are merely wishful thinking? In this book, leading thinkers such as Steve McConnell, Barry Boehm, and Barbara Kitchenham offer essays that uncover the truth and unmask myths commonly held among the software development community. Their insights may surprise you. Are some programmers really ten times more productive than others?Does writing tests first help you develop better code faster?Can code metrics predict the number of bugs in a piece of soft

  2. Modeling and optimizing periodically inspected software rejuvenation policy based on geometric sequences

    International Nuclear Information System (INIS)

    Meng, Haining; Liu, Jianjun; Hei, Xinhong

    2015-01-01

    Software aging is characterized by an increasing failure rate, progressive performance degradation and even a sudden crash in a long-running software system. Software rejuvenation is an effective method to counteract software aging. A periodically inspected rejuvenation policy for software systems is studied. The consecutive inspection intervals are assumed to form a decreasing geometric sequence, and based on the inspection times of the software system and its failure features, software rejuvenation or system recovery is performed. The system availability function and cost rate function are obtained, and the optimal inspection time and rejuvenation interval are both derived to maximize system availability and minimize the cost rate. Then, boundary conditions of the optimal rejuvenation policy are deduced. Finally, a numerical experiment shows the effectiveness of the proposed policy. Further, compared with the existing software rejuvenation policy, the new policy has higher system availability. - Highlights: • A periodically inspected rejuvenation policy for software systems is studied. • A decreasing geometric sequence is used to denote the consecutive inspection intervals. • The optimal inspection times and rejuvenation interval are found. • The new policy is capable of reducing average cost and improving system availability
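    As a minimal sketch of the schedule described above (not the paper's model or notation), the inspection epochs can be generated from a first interval t1 and a common ratio q < 1; the parameter names and example values are assumptions for illustration.

    # Sketch of the inspection schedule: consecutive inspection intervals form a
    # decreasing geometric sequence t1, t1*q, t1*q^2, ...  (t1 and q are
    # illustrative parameters, not the paper's calibrated values).
    def inspection_times(t1: float, q: float, n: int) -> list[float]:
        """Return the first n inspection epochs for a decreasing geometric schedule."""
        assert 0.0 < q < 1.0, "ratio must give decreasing intervals"
        times, interval, t = [], t1, 0.0
        for _ in range(n):
            t += interval
            times.append(round(t, 3))
            interval *= q
        return times

    # Example: inspection_times(100.0, 0.8, 5) -> [100.0, 180.0, 244.0, 295.2, 336.16]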

  3. Testing Software Development Project Productivity Model

    Science.gov (United States)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and a theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control

  4. AgesGalore-A software program for evaluating spatially resolved luminescence data

    International Nuclear Information System (INIS)

    Greilich, S.; Harney, H.-L.; Woda, C.; Wagner, G.A.

    2006-01-01

    Low-light luminescence is usually recorded by photomultiplier tubes (PMTs) yielding integrated photon-number data. Highly sensitive CCD (charged coupled device) detectors allow for the spatially resolved recording of luminescence. The resulting two-dimensional images require suitable software for data processing. We present a recently developed software program specially designed for equivalent-dose evaluation in the framework of optically stimulated luminescence (OSL) dating. The software is capable of appropriate CCD data handling, parameter estimation using a Bayesian approach, and the pixel-wise fitting of functions for time and dose dependencies to the luminescence signal. The results of the fitting procedure and the equivalent-dose evaluation can be presented and analyzed both as spatial and as frequency distributions
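    As an illustrative sketch of the pixel-wise fitting idea (not the AgesGalore implementation), a saturating-exponential dose response can be fitted independently to each pixel of a stack of CCD images recorded at known regeneration doses; the functional form, parameter names and starting values below are assumptions for the example.

    # Illustrative pixel-wise fit: fit L(D) = L_max * (1 - exp(-D/D0)) to each
    # pixel of a stack of CCD images taken at known regeneration doses.
    import numpy as np
    from scipy.optimize import curve_fit

    def dose_response(dose, l_max, d0):
        return l_max * (1.0 - np.exp(-dose / d0))

    def fit_pixels(doses, image_stack):
        """image_stack shape: (n_doses, rows, cols); returns per-pixel (l_max, d0)."""
        n, rows, cols = image_stack.shape
        params = np.full((rows, cols, 2), np.nan)
        for r in range(rows):
            for c in range(cols):
                signal = image_stack[:, r, c]
                try:
                    popt, _ = curve_fit(dose_response, doses, signal,
                                        p0=(signal.max(), np.median(doses)))
                    params[r, c] = popt
                except (RuntimeError, ValueError):   # fit failed for this pixel
                    pass
        return params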

  5. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  6. CernVM - a virtual software appliance for LHC applications

    International Nuclear Information System (INIS)

    Buncic, P; Sanchez, C Aguado; Blomer, J; Franco, L; Mato, P; Harutyunian, A; Yao, Y

    2010-01-01

    CernVM is a Virtual Software Appliance capable of running physics applications from the LHC experiments at CERN. It aims to provide a complete and portable environment for developing and running LHC data analysis on any end-user computer (laptop, desktop) as well as on the Grid, independently of Operating System platform (Linux, Windows, MacOS). The experiment application software and its specific dependencies are built independently from CernVM and delivered to the appliance just in time by means of a CernVM File System (CVMFS) specifically designed for efficient software distribution. The procedures for building, installing and validating software releases remain under the control and responsibility of each user community. We provide a mechanism to publish pre-built and configured experiment software releases to a central distribution point, from where they find their way to the running CernVM instances via a hierarchy of proxy servers or content delivery networks. In this paper, we present the current state of the CernVM project, compare the performance of CVMFS to that of traditional network file systems like AFS, and discuss possible scenarios that could further improve its performance and scalability.

  7. Software for physics of tau lepton decay in LHC experiments

    CERN Document Server

    Przedzinski, Tomasz

    2010-01-01

    Software development in high energy physics experiments offers unique experience with a rapidly changing environment and a variety of different standards and frameworks to which the software must be adapted. As such, regular methods of software development are hard to use, as they do not take into account how greatly some of these changes influence the whole structure. The following thesis summarizes the development of the TAUOLA C++ Interface, introducing tau decays to the new event record standard. The documentation of the program has already been published, which is why it is not recalled here again. We focus on the development cycle and methodology used in the project, starting from the definition of the expectations through planning and designing the abstract model and concluding with the implementation. In the last part of the paper we present the installation of the software within different experiments surrounding the Large Hadron Collider and the problems that emerged during this process.

  8. Agentes de software móviles

    Directory of Open Access Journals (Sweden)

    Crisman Martínez Barrera

    2001-10-01

    Full Text Available Mobile agents are intelligent software programs that pursue an objective, drawing on developments supported by Artificial Intelligence techniques, and are intended to facilitate the interoperability of systems. This article defines the disciplines, platforms and tools necessary for the development of mobile agents, their principal characteristics and their predominant architectures; a final evaluation and future perspectives are also offered.

  9. Overview of the software for the Telemation/Sandia unattended video surveillance system

    International Nuclear Information System (INIS)

    Merillat, P.D.

    1979-10-01

    A microprocessor has been used to provide the major control functions in the Telemation/Sandia unattended video surveillance system. The software in the microprocessor provides control of the various hardware components and provides the capability of interactive communications with the operator. This document, in conjunction with the commented source listing, defines the philosophy and function of the software. It is assumed that the reader is familiar with the RCA 1802 COSMAC microprocessor and has a reasonable computer science background

  10. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    Science.gov (United States)

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  11. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  12. Student perceptions of drill-and-practice mathematics software in primary education

    Science.gov (United States)

    Kuiper, Els; de Pater-Sneep, Martie

    2014-06-01

    Drill-and-practice mathematics software offers teachers a relatively simple way to use technology in the classroom. One of the reasons to use the software may be that it motivates children, working on the computer being more "fun" than doing regular school work. However, students' own perceptions of such software are seldom studied. This article reports on a study on the opinions of Grade 5 and 6 students regarding two mathematics drill-and-practice software packages. In total, 329 students from ten Dutch primary schools took part in the study. The results show that a majority of the students preferred to work in their exercise book, for various reasons. Especially the rigid structure of the software is mentioned as a negative aspect by students. The elaborate arguments students used illustrate the importance of taking their opinions into account already at the primary level. Students' perceptions also show that the idea of ICT as naturally motivating for students may need modification.

  13. A software for computer automated radioactive particle tracking

    International Nuclear Information System (INIS)

    Vieira, Wilson S.; Brandao, Luis E.; Braz, Delson

    2008-01-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking' (CARPT), whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be reached. As a preliminary TRACO-1 evaluation, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain values of photon counting rates at four detector surfaces. These count rates were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are preliminary, their analysis suggests that the tracking of a radioactive source using TRACO-1 can be successful, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)
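    The record does not describe the reconstruction algorithm itself; the sketch below only illustrates the general CARPT idea under a deliberately simplified assumption (count rates falling off with the inverse square of the source-detector distance), with all function names, detector positions and the source-strength parameter invented for the example.

    # Simplified CARPT-style position reconstruction (illustrative only): assume
    # each detector's count rate follows an inverse-square law of the distance to
    # the tracer, then find the position minimising the misfit to measured rates.
    import numpy as np
    from scipy.optimize import least_squares

    def predicted_rates(position, detector_positions, source_strength):
        distances = np.linalg.norm(detector_positions - position, axis=1)
        return source_strength / distances**2

    def locate_particle(measured_rates, detector_positions, source_strength,
                        initial_guess=(0.0, 0.0, 0.0)):
        residuals = lambda p: (predicted_rates(p, detector_positions, source_strength)
                               - np.asarray(measured_rates))
        return least_squares(residuals, initial_guess).x

    # Example with four hypothetical detectors around a vessel:
    #   detectors = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1], [1, 1, 1]], float)
    #   locate_particle(measured, detectors, source_strength=1.0)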

  14. ABCD, an Open Source Software for Modern Libraries

    Directory of Open Access Journals (Sweden)

    Sangeeta Namdev Dhamdhere

    2011-12-01

    Full Text Available Nowadays, librarians are using various kinds of open source software for different purposes such as library automation, digitization, institutional repositories and content management. ABCD, an acronym for Automatisación de Bibliotécas y Centros de Documentación, is one such software package. It caters to almost all the present needs of modern libraries of any size. It offers a solution for library automation with ISBD as well as local formats. It has excellent indexing and retrieval features based on UNESCO's ISIS technology, a web OPAC, and a library portal with an integrated meta-search and content management system to manage online as well as offline digital resources and physical documents and media.

  15. Lean Development with the Morpheus Simulation Software

    Science.gov (United States)

    Brogley, Aaron C.

    2013-01-01

    The Morpheus project is an autonomous robotic testbed currently in development at NASA's Johnson Space Center (JSC) with support from other centers. Its primary objectives are to test new 'green' fuel propulsion systems and to demonstrate the capability of the Autonomous Lander Hazard Avoidance Technology (ALHAT) sensor, provided by the Jet Propulsion Laboratory (JPL) on a lunar landing trajectory. If successful, these technologies and lessons learned from the Morpheus testing cycle may be incorporated into a landing descent vehicle used on the moon, an asteroid, or Mars. In an effort to reduce development costs and cycle time, the project employs lean development engineering practices in its development of flight and simulation software. The Morpheus simulation makes use of existing software packages where possible to reduce the development time. The development and testing of flight software occurs primarily through the frequent test operation of the vehicle and incrementally increasing the scope of the test. With rapid development cycles, risk of loss of the vehicle and loss of the mission are possible, but efficient progress in development would not be possible without that risk.

  16. Selecting Advanced Software Technology in Two Small Manufacturing Enterprises

    Science.gov (United States)

    2004-05-01

    improving workflow to further reduce delivery times, enhance customer service, and obtain a competitive advantage. The company wanted help... environment, stakeholders' needs, ecommerce, shop floor visualization, and collaboration capability. These statements are not significantly different... for the purpose of describing a software environment. This identification does not imply any recommendation or endorsement by NIST, the SEI, CMU, or

  17. The ALICE Software Release Validation cluster

    International Nuclear Information System (INIS)

    Berzano, D; Krzewicki, M

    2015-01-01

    One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automatic tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service: in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample “golden” dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, and with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are externally stored on EOS, and a dedicated CVMFS service is used to provide the software to be validated. We show how the Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any snapshot of the operating system in time: we show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future. (paper)

  18. Software Process Improvement for SMEs using OMM

    OpenAIRE

    Rodríguez, Jessica

    2010-01-01

    Software Process Improvement initiatives have been implemented by many companies in order to achieve quality of products and practices. Many models like CMMI and IDEAL have been adopted as a means to gain competitive advantages over competitors and the trustworthiness of customers. Although these models have produced successful results, the inherent characteristics of SMEs make it difficult and in many cases unfeasible to implement such models, without meaning that those companies are less capable...

  19. Simplifying the Development, Use and Sustainability of HPC Software

    Directory of Open Access Journals (Sweden)

    Jeremy Cohen

    2014-07-01

    Full Text Available Developing software to undertake complex, compute-intensive scientific processes requires a challenging combination of both specialist domain knowledge and software development skills to convert this knowledge into efficient code. As computational platforms become increasingly heterogeneous and newer types of platform such as Infrastructure-as-a-Service (IaaS) cloud computing become more widely accepted for high-performance computing (HPC), scientists require more support from computer scientists and resource providers to develop efficient code that offers long-term sustainability and makes optimal use of the resources available to them. As part of the libhpc stage 1 and 2 projects we are developing a framework to provide a richer means of job specification and efficient execution of complex scientific software on heterogeneous infrastructure. In this updated version of our submission to the WSSSPE13 workshop at SuperComputing 2013 we set out our approach to simplifying access to HPC applications and resources for end-users through the use of flexible and interchangeable software components and associated high-level functional-style operations. We believe this approach can support the sustainability of scientific software and help to widen access to it.

  20. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
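    IFDOTMETER itself is a Java application and the record does not disclose its algorithms; the sketch below is only a generic illustration of the automated workflow described (threshold each image, count puncta-like spots, write one row per image to a spreadsheet). The threshold, minimum-area value, file pattern and library choice are assumptions for the example.

    # Generic batch puncta-counting sketch: threshold each image, label connected
    # spots, and write one CSV row per image for later spreadsheet analysis.
    import csv
    from pathlib import Path

    import numpy as np
    from skimage import io, measure

    def count_puncta(image_path, threshold=0.5, min_area=5):
        image = io.imread(image_path, as_gray=True).astype(float)
        image /= image.max() or 1.0                  # normalise to [0, 1]
        labels = measure.label(image > threshold)    # label connected spots
        regions = [r for r in measure.regionprops(labels) if r.area >= min_area]
        mean_area = float(np.mean([r.area for r in regions])) if regions else 0.0
        return len(regions), mean_area

    def analyze_folder(folder, output_csv="results.csv"):
        with open(output_csv, "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow(["image", "puncta_count", "mean_area_px"])
            for path in sorted(Path(folder).glob("*.tif")):   # assumed file format
                count, mean_area = count_puncta(path)
                writer.writerow([path.name, count, mean_area])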

  1. Offering memorable patient experience through creative, dynamic marketing strategy

    Science.gov (United States)

    Raţiu, M; Purcărea, T

    2008-01-01

    Creative, dynamic strategies are the ones that identify new and better ways of uniquely offering the target customers what they want or need. A business can achieve competitive advantage if it chooses a marketing strategy that sets the business apart from anyone else. Healthcare services companies have to understand that the customer should be placed in the centre of all specific marketing operations. The brand message should reflect the focus on the patient. Healthcare products and services offered must represent exactly the solutions that customers expect. The touchpoints with the patients must be well mastered in order to convince them to accept the proposed solutions. Healthcare service providers must be capable of looking beyond customer behaviour or product and healthcare service acquisition. This will demand proactive and far-reaching changes, including focusing specifically on customer preference, quality, and technological interfaces; rewiring strategy to find new value from existing and unfamiliar sources; disintegrating and radically reassembling operational processes; and restructuring the organization to accommodate new types of work and skill. PMID:20108466

  2. Offering memorable patient experience through creative, dynamic marketing strategy.

    Science.gov (United States)

    Purcărea, Victor Lorín; Raţíu, Monica; Purcărea, Theodor; Davila, Carol

    2008-01-01

    Creative, dynamic strategies are the ones that identify new and better ways of uniquely offering the target customers what they want or need. A business can achieve competitive advantage if it chooses a marketing strategy that sets the business apart from anyone else. Healthcare services companies have to understand that the customer should be placed in the centre of all specific marketing operations. The brand message should reflect the focus on the patient. Healthcare products and services offered must represent exactly the solutions that customers expect. The touchpoints with the patients must be well mastered in order to convince them to accept the proposed solutions. Healthcare service providers must be capable of looking beyond customer behaviour or product and healthcare service acquisition. This will demand proactive and far-reaching changes, including focusing specifically on customer preference, quality, and technological interfaces; rewiring strategy to find new value from existing and unfamiliar sources; disintegrating and radically reassembling operational processes; and restructuring the organization to accommodate new types of work and skill.

  3. Evaluation of the free, open source software WordPress as electronic portfolio system in undergraduate medical education.

    Science.gov (United States)

    Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm

    2016-06-03

    Electronic portfolios (ePortfolios) are used to document and support learning activities. ePortfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit an institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a free, open source software solution. We created an online ePortfolio environment using the blogging software WordPress, based on reported capability features of such software assessed by a qualitative weight and sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training by quantitative and qualitative means, using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities - often documented via mobile devices - like collection of multimedia evidence, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %) and for its potential as an information source for assessment (48 %) and exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training. It is possible to build an advanced ePortfolio system with mobile

  4. THE EFFECT OF SOCIAL CAPITAL AND KNOWLEDGE SHARING ON INNOVATION CAPABILITY

    Directory of Open Access Journals (Sweden)

    Dhyah Harjanti

    2017-09-01

    Full Text Available This research examines the effect of social capital and knowledge sharing on innovation capability among lecturers in universities. Social capital was analyzed using three constructs, namely trust, norms and networks, while knowledge sharing was broken down into two variables, namely knowledge collecting and knowledge donating. Innovation capability was explained at an individual level based on personality, behavioral and output perspectives. The research model and hypotheses were developed from the literature. Data collection was conducted through a survey of lecturers at private universities in Surabaya. The data obtained from the questionnaires were analyzed with Partial Least Squares (PLS) to investigate the research model. The results suggest that social capital significantly influences innovation capability, while high levels of knowledge collecting and knowledge donating can lead to a high level of innovation capability. This study offers a foundation for analyzing the relationships between social capital, the knowledge-sharing process, consisting of knowledge collecting and knowledge donating, and innovation capability.

  5. PENGELOLAAN KNOWLEDGE MANAGEMENT CAPABILITY DALAM MEMEDIASI DUKUNGAN INFORMATION TECHNOLOGY RELATEDNESS TERHADAP KINERJA PERUSAHAAN

    Directory of Open Access Journals (Sweden)

    Luluk Muhimatul Ifada

    2011-06-01

    Full Text Available The study examines whether and how information technology (IT) relatedness influences corporate performance. This study proposes that knowledge management (KM) is a critical organizational capability through which IT influences firm performance. Measurement of IT relatedness and KM capability uses a reflective second-order factor modeling approach for capturing complementarities among the four dimensions of IT relatedness (IT strategy-making processes, IT vendor management processes, IT human resource management processes and IT infrastructure) and among the three dimensions of KM capability (product KM capability, customer KM capability, and managerial KM capability). A survey was conducted among 93 branch managers of banks in Central Java. Structural Equation Modeling (SEM) was used to analyze the data with the software program SmartPLS (Partial Least Squares). The findings provide support for the hypotheses of the study. IT relatedness of business units enhances the cross-unit KM capability of the corporation. The KM capability creates and exploits cross-unit synergies from the product, customer, and managerial knowledge resources of the corporation. These synergies increase corporate performance. IT relatedness of business units positively influences corporate performance. IT relatedness also has significant indirect effects on corporate performance through the mediation of KM capability.

  6. New software for improving performance in wind farm operations

    Energy Technology Data Exchange (ETDEWEB)

    Collins, Mark [Ekho for Wind (Canada)

    2011-07-01

    The performance of wind farms depends on multiple field and business systems. This makes operational planning difficult because so many data are held in separate systems, data are duplicated, and it is impossible to gather all relevant data together in one place. The aim of this paper is to present a new software package, Ekho for Wind, which helps improve performance in wind farm operations by providing features such as high-level views, performance analysis, downtime tracking, quality data management and forecast generation. This new software provides operational intelligence which offers incentives for continuous improvement. Ekho for Wind can bring such benefits as maximization of generation, increased lifetime of assets, minimization of costs and increased profitability. This presentation introduced a new software package for improving the performance of wind farms and the lifetime of assets, resulting in significant payback.

  7. NASA's Space Launch System: A New Capability for Science and Exploration

    Science.gov (United States)

    Crumbly, Christopher M.; May, Todd A.; Robinson, Kimberly F.

    2014-01-01

    The National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center (MSFC) is directing efforts to build the Space Launch System (SLS), a heavy-lift rocket that will launch the Orion Multi-Purpose Crew Vehicle (MPCV) and other high-priority payloads into deep space. Its evolvable architecture will allow NASA to begin with human missions beyond the Moon and then go on to transport astronauts or robots to distant places such as asteroids and Mars. Developed with the goals of safety, affordability, and sustainability in mind, SLS will start with 10 percent more thrust than the Saturn V rocket that launched astronauts to the Moon 40 years ago. From there it will evolve into the most powerful launch vehicle ever flown, via an upgrade approach that will provide building blocks for future space exploration. This paper will explain how NASA will execute this development within flat budgetary guidelines by using existing engine assets and heritage technology, from the initial 70 metric ton (t) lift capability through a block upgrade approach to an evolved 130-t capability, and will detail the progress that has already been made toward a first launch in 2017. This paper will also explore the requirements needed for human missions to deep-space destinations and for game-changing robotic science missions, and the capability of SLS to meet those requirements and enable those missions, along with the evolution strategy that will increase that capability. The International Space Exploration Coordination Group, representing 12 of the world's space agencies, has worked together to create the Global Exploration Roadmap, which outlines paths towards a human landing on Mars, beginning with capability-demonstrating missions to the Moon or an asteroid. The Roadmap and corresponding NASA research outline the requirements for reference missions for all three destinations. The SLS will offer a robust way to transport international crews and the air, water, food, and

  8. Organizational Response to the Introduction of New Computer Software Technology

    Science.gov (United States)

    1991-07-01

    AutoCAD user with AutoLISP, a programming language included in the package. (Some CADD packages come with these features and others as part of the ... center of gravity and other mass properties; the AutoCAD user may create such capabilities in the software through programming with AutoLISP. Iteration

  9. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for application to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed on an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, first an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. Applied to the ATIP software code, which had been integrated and had passed a very rigorous system test procedure, the software FMEA analysis proved able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.
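
    The failure-mode-template idea described above can be illustrated with a small sketch. The block types, failure modes, and program below are hypothetical placeholders, not the ATIP function blocks or the authors' actual template; the sketch only shows how a template keyed by FBD block type expands a program into candidate FMEA rows.

```python
# Minimal sketch (not the authors' tool): enumerating FMEA rows from a
# failure-mode template keyed by FBD function-block type. Block names,
# failure modes, and effects below are illustrative placeholders only.
from dataclasses import dataclass

# Hypothetical template: generic failure modes per function-block type.
FAILURE_MODE_TEMPLATE = {
    "GE_COMPARE": ["output stuck TRUE", "output stuck FALSE", "setpoint corrupted"],
    "ON_DELAY_TIMER": ["timer never expires", "timer expires early"],
    "MOVE": ["stale value propagated", "wrong destination written"],
}

@dataclass
class FmeaRow:
    block: str          # instance name in the FBD program
    block_type: str     # function-block type
    failure_mode: str   # mode taken from the template
    local_effect: str   # to be filled in by the analyst

def enumerate_fmea(blocks):
    """Expand each FBD block into one FMEA row per templated failure mode."""
    rows = []
    for name, block_type in blocks:
        for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unspecified failure"]):
            rows.append(FmeaRow(name, block_type, mode, local_effect="TBD"))
    return rows

if __name__ == "__main__":
    program = [("trip_setpoint_check", "GE_COMPARE"), ("heartbeat_watchdog", "ON_DELAY_TIMER")]
    for row in enumerate_fmea(program):
        print(f"{row.block:22s} {row.block_type:15s} {row.failure_mode}")
```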

  10. Does the NASA Constellation Architecture Offer Opportunities to Achieve Multiple Additional Goals in Space?

    Science.gov (United States)

    Thronson, Harley; Lester, Daniel

    2008-01-01

    Every major NASA human spaceflight program in the last four decades has been modified to achieve goals in space not incorporated within the original design goals: the Apollo Applications Program, Skylab, Space Shuttle, and International Space Station. Several groups in the U.S. have been identifying major future science goals, the science facilities necessary to investigate them, as well as possible roles for augmented versions of elements of NASA's Constellation program. Specifically, teams in the astronomy community have been developing concepts for very capable missions to follow the James Webb Space Telescope that could take advantage of - or require - free-space operations by astronauts and/or robots. Take as one example the Single-Aperture Far-InfraRed (SAFIR) telescope, with a 10+ m aperture, proposed for operation in the 2020 timeframe. According to current NASA plans, the Ares V launch vehicle (or a variant) will be available about the same time, as will the capability to transport astronauts to the vicinity of the Moon via the Orion Crew Exploration Vehicle and associated systems. [As the lunar surface offers no advantages - and major disadvantages - for most major optical systems, the expensive system for landing and operating on the lunar surface is not required.] Although, as currently conceived, SAFIR and other astronomical missions will operate at the Sun-Earth L2 location, it appears trivial to travel for servicing to the more accessible Earth-Moon L1,2 locations. Moreover, as the recent Orbital Express and Automated Transfer Vehicle missions have demonstrated, future robotic systems should offer capabilities that would (remotely) extend human presence far beyond the vicinity of the Earth.

  11. New strategies of sensitivity analysis capabilities in continuous-energy Monte Carlo code RMC

    International Nuclear Information System (INIS)

    Qiu, Yishu; Liang, Jingang; Wang, Kan; Yu, Jiankai

    2015-01-01

    Highlights: • Data decomposition techniques are proposed for memory reduction. • New strategies are put forward and implemented in RMC code to improve efficiency and accuracy for sensitivity calculations. • A capability to compute region-specific sensitivity coefficients is developed in RMC code. - Abstract: The iterated fission probability (IFP) method has been demonstrated to be an accurate alternative for estimating the adjoint-weighted parameters in continuous-energy Monte Carlo forward calculations. However, the memory requirements of this method are huge, especially when a large number of sensitivity coefficients are desired. Therefore, data decomposition techniques are proposed in this work. Two parallel strategies based on the neutron production rate (NPR) estimator and the fission neutron population (FNP) estimator for adjoint fluxes, as well as a more efficient algorithm which has multiple overlapping blocks (MOB) in a cycle, are investigated and implemented in the continuous-energy Reactor Monte Carlo code RMC for sensitivity analysis. Furthermore, a region-specific sensitivity analysis capability is developed in RMC. These new strategies, algorithms and capabilities are verified against analytic solutions of a multi-group infinite-medium problem and against results from other software packages including MCNP6, TSUNAMI-1D and multi-group TSUNAMI-3D. While the results generated by the NPR and FNP strategies agree within 0.1% of the analytic sensitivity coefficients, the MOB strategy surprisingly produces sensitivity coefficients exactly equal to the analytic ones. Meanwhile, the results generated by the three strategies in RMC are in agreement with those produced by other codes within a few percent. Moreover, the MOB strategy performs the most efficient sensitivity coefficient calculations (offering as much as an order of magnitude gain in FoMs over MCNP6), followed by the NPR and FNP strategies, and then MCNP6. The results also reveal that these
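
    The efficiency comparison in this abstract is expressed through the standard Monte Carlo figure of merit. A minimal sketch, with made-up numbers (not results from RMC, MCNP6, or TSUNAMI), of how such FoM comparisons are computed:

```python
# Minimal sketch of the standard Monte Carlo figure of merit,
# FoM = 1 / (R^2 * T), used in the abstract to compare strategies.
# The numbers below are made-up placeholders, not results from any code.
def figure_of_merit(rel_std_dev: float, cpu_time_min: float) -> float:
    """R is the relative standard deviation of a tally; T is run time in minutes."""
    return 1.0 / (rel_std_dev ** 2 * cpu_time_min)

strategies = {
    "MOB (hypothetical)": (0.004, 120.0),
    "NPR (hypothetical)": (0.006, 150.0),
    "reference code (hypothetical)": (0.010, 300.0),
}

for name, (r, t) in strategies.items():
    print(f"{name:30s} FoM = {figure_of_merit(r, t):8.1f}")
```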

  12. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    Science.gov (United States)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must include a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) comprised of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  13. DEVELOPING A WIRELESS SENSOR NETWORK WITH RFID CAPABILITIES

    Directory of Open Access Journals (Sweden)

    Dan LICA

    2016-05-01

    Full Text Available The concept underlying pervasive communications systems is the Internet of Objects, defined as a global network of smart objects, resulting in the "attachment" of communication, data acquisition and data processing functions to the objects we interact with daily. The Internet of Objects will be based on the 4th generation of Internet systems. In such systems, each intelligent networked device must offer its users a number of services, which constitute a development of the current model of Web services. Such services will result from the aggregation of micro-services offered by each pervasive device connected to the Internet. Aggregating these micro-services requires that each node in the network be able to present/expose its capabilities (so-called "advertising") in a standard way, so that client applications can integrate them into complex services. Therefore there is a need to introduce a coding scheme to identify the nodes in the network, enabling the identification of their capabilities. An alternative for implementing such a system of encoding/identifying the characteristics of network nodes is active RFID technology. The aim of this chapter is to develop a method to ensure the convergence of wireless sensor networks with active RFID systems within a pervasive hybrid network based on the IEEE 802.15.4 standard, which satisfies the requirements of the Internet of Objects.

  14. Modeling of a 3DTV service in the software-defined networking architecture

    Science.gov (United States)

    Wilczewski, Grzegorz

    2014-11-01

    In this article a newly developed concept for modeling a multimedia service offering stereoscopic motion imagery is presented. The proposed model is based on the Software-Defined Networking (SDN) architecture. A definition of a 3D television service spanning the SDN concept is identified, exposing the basic characteristics of a 3DTV service in a modern network organization. Furthermore, exemplary functionalities of the proposed 3DTV model are depicted. It is indicated that modeling a 3DTV service in the Software-Defined Networking architecture leads to a multiplicity of improvements, especially in the flexibility of a service supporting heterogeneous end-user devices.

  15. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that provide a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master-Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
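
    As an illustration of the conical-scan idea described above (not the ACR flight code), the sketch below estimates a pointing correction as the power-weighted centroid of samples taken around a small circle about the nominal boresight; the scan radius, sample spacing, and signal model are invented for the example.

```python
# Illustrative conical-scan pointing estimate (a sketch, not the ACR code):
# sample received power around a small circle about the boresight; the
# power-weighted centroid of the scan points toward the true target.
import math

def conscan_offset(scan_radius_mdeg, samples):
    """samples: list of (scan_angle_rad, received_power). Returns (d_az, d_el)
    pointing corrections in millidegrees (small-offset approximation)."""
    total = sum(p for _, p in samples)
    x = sum(p * math.cos(a) for a, p in samples) / total
    y = sum(p * math.sin(a) for a, p in samples) / total
    return scan_radius_mdeg * x, scan_radius_mdeg * y

# Fake scan: power peaks near a 45-degree scan angle, so the estimated
# correction points in that direction.
fake_scan = [(math.radians(a), 1.0 + 0.2 * math.cos(math.radians(a - 45)))
             for a in range(0, 360, 30)]
print(conscan_offset(5.0, fake_scan))
```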

  16. Advances in software development for intelligent interfaces for alarm and emergency management consoles

    International Nuclear Information System (INIS)

    Moseley, M.R.; Olson, C.E.

    1986-01-01

    Recent advances in technology allow features like voice synthesis, voice and speech recognition, image understanding, and intelligent data base management to be incorporated in computer driven alarm and emergency management information systems. New software development environments make it possible to do rapid prototyping of custom applications. Three examples using these technologies are discussed. (1) Maximum use is made of high-speed graphics and voice synthesis to implement a state-of-the-art alarm processing and display system with features that make the operator-machine interface efficient and accurate. Although very functional, this system is not portable or flexible; the software would have to be substantially rewritten for other applications. (2) An application generator which has the capability of ''building'' a specific alarm processing and display application in a matter of a few hours, using the site definition developed in the security planning phase to produce the custom application. This package is based on a standardized choice of hardware, within which it is capable of building a system to order, automatically constructing graphics, data tables, alarm prioritization rules, and interfaces to peripherals. (3) A software tool, the User Interface Management System (UIMS), is described which permits rapid prototyping of human-machine interfaces for a variety of applications including emergency management, alarm display and process information display. The object-oriented software of the UIMS achieves rapid prototyping of a new interface by standardizing to a class library of software objects instead of hardware objects

  17. Trends in Human-Computer Interaction to Support Future Intelligence Analysis Capabilities

    Science.gov (United States)

    2011-06-01

    Oblong Industries Inc. (Oblong, 2011). In addition to the camera-based gesture interaction (Figure 4), this system offers a management capability... [figure/table residue: a list of emerging display technologies, including EyeTap, Lumus Eyewear LOE, FogScreen, HP LiM PC, Microvision PEK and SHOWWX pico projectors, head-mounted displays, and the Chinese Holo Screen]

  18. The mythical man-month essays on software engineering

    CERN Document Server

    Brooks, Frederick Phillips

    1995-01-01

    Few books on software project management have been as influential and timeless as The Mythical Man-Month. With a blend of software engineering facts and thought-provoking opinions, Fred Brooks offers insight for anyone managing complex projects. These essays draw from his experience as project manager for the IBM System/360 computer family and then for OS/360, its massive software system. Now, 20 years after the initial publication of his book, Brooks has revisited his original ideas and added new thoughts and advice, both for readers already familiar with his work and for readers discovering it for the first time. The added chapters contain (1) a crisp condensation of all the propositions asserted in the original book, including Brooks' central argument in The Mythical Man-Month: that large programming projects suffer management problems different from small ones due to the division of labor; that the conceptual integrity of the product is therefore critical; and that it is difficult but possible to achi...

  19. Evolution of the ATLAS Software Framework towards Concurrency

    CERN Document Server

    Jones, Roger; The ATLAS collaboration; Leggett, Charles; Wynne, Benjamin

    2015-01-01

    The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000, and the software and the physics code have been written using a single-threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather r...

  20. Virtual immunology: software for teaching basic immunology.

    Science.gov (United States)

    Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio

    2013-01-01

    As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available free of charge in Portuguese and English, which can be used by teachers and students in physiology, immunology, and cellular biology classes. We discuss the development of the initial two modules: "Organs and Lymphoid Tissues" and "Inflammation" and the use of interactive activities to provide microscopic and macroscopic understanding in immunology. Students, both graduate and undergraduate, were questioned along with university level professors about the quality of the software and intuitiveness of use, facility of navigation, and aesthetic organization using a Likert scale. An overwhelmingly satisfactory result was obtained with both students and immunology teachers. Programs such as "Virtual Immunology" are offering more interactive, multimedia approaches to complex scientific principles that increase student motivation, interest, and comprehension. © 2013 by The International Union of Biochemistry and Molecular Biology.

  1. Software-based annunciator replacement: a tale of two projects

    International Nuclear Information System (INIS)

    Simmons, G.T.

    2015-01-01

    Annunciator upgrade projects are often included as parts of operating plant life extension projects as the systems are old and replacement parts are difficult to source. This paper contains case studies of the software-based annunciator replacement projects at the Westinghouse SNUPPS training simulator in Pennsylvania and the Axpo Beznau nuclear power plant in Switzerland. Software-based annunciator systems can offer a number of feature enhancements including improved readability and operator awareness, easy configuration, alarm suppression features, and alarm management at operator workstations. This paper provides an overview of each project and discusses advantages, challenges, and lessons learned from both annunciator-replacement projects. (author)

  2. Software-based annunciator replacement: a tale of two projects

    Energy Technology Data Exchange (ETDEWEB)

    Simmons, G.T., E-mail: simmongt@westinghouse.com [Westinghouse Electric Company LLC, Cranberry Township, PA (United States)

    2015-07-01

    Annunciator upgrade projects are often included as parts of operating plant life extension projects as the systems are old and replacement parts are difficult to source. This paper contains case studies of the software-based annunciator replacement projects at the Westinghouse SNUPPS training simulator in Pennsylvania and the Axpo Beznau nuclear power plant in Switzerland. Software-based annunciator systems can offer a number of feature enhancements including improved readability and operator awareness, easy configuration, alarm suppression features, and alarm management at operator workstations. This paper provides an overview of each project and discusses advantages, challenges, and lessons learned from both annunciator-replacement projects. (author)

  3. Managing the Testing Process Practical Tools and Techniques for Managing Hardware and Software Testing

    CERN Document Server

    Black, Rex

    2011-01-01

    New edition of one of the most influential books on managing software and hardware testing In this new edition of his top-selling book, Rex Black walks you through the steps necessary to manage rigorous testing programs of hardware and software. The preeminent expert in his field, Mr. Black draws upon years of experience as president of both the International and American Software Testing Qualifications boards to offer this extensive resource of all the standards, methods, and tools you'll need. The book covers core testing concepts and thoroughly examines the best test management practices

  4. Software/firmware design specification for 10-MWe solar-thermal central-receiver pilot plant

    Energy Technology Data Exchange (ETDEWEB)

    Ladewig, T.D.

    1981-03-01

    The software and firmware employed for the operation of the Barstow Solar Pilot Plant are completely described. The systems allow operator control of up to 2048 heliostats, and include the capability of operator-commanded control, graphic displays, status displays, alarm generation, system redundancy, and interfaces to the Operational Control System, the Data Acquisition System, and the Beam Characterization System. The requirements are decomposed into eleven software modules for execution in the Heliostat Array Controller computer, one firmware module for execution in the Heliostat Field Controller microprocessor, and one firmware module for execution in the Heliostat Controller microprocessor. The design of the modules to satisfy requirements, the interfaces between the computers, the software system structure, and the computers in which the software and firmware will execute are detailed. The testing sequence for validation of the software/firmware is described. (LEW)

  5. Conserving analyst attention units: use of multi-agent software and CEP methods to assist information analysis

    Science.gov (United States)

    Rimland, Jeffrey; McNeese, Michael; Hall, David

    2013-05-01

    Although the capability of computer-based artificial intelligence techniques for decision-making and situational awareness has seen notable improvement over the last several decades, the current state-of-the-art still falls short of creating computer systems capable of autonomously making complex decisions and judgments in many domains where data is nuanced and accountability is high. However, there is a great deal of potential for hybrid systems in which software applications augment human capabilities by focusing the analyst's attention on relevant information elements based on both a priori knowledge of the analyst's goals and the processing/correlation of a series of data streams too numerous and heterogeneous for the analyst to digest without assistance. Researchers at Penn State University are exploring ways in which an information framework influenced by Klein's Recognition-Primed Decision (RPD) model, Endsley's model of situational awareness, and the Joint Directors of Laboratories (JDL) data fusion process model can be implemented through a novel combination of Complex Event Processing (CEP) and Multi-Agent Software (MAS). Though originally designed for stock market and financial applications, the high-performance, data-driven nature of CEP techniques provides a natural complement to the proven capabilities of MAS systems for modeling naturalistic decision-making, performing process adjudication, and optimizing networked processing and cognition via the use of "mobile agents." This paper addresses the challenges and opportunities of such a framework for augmenting human observational capability as well as enabling the ability to perform collaborative context-aware reasoning in both human teams and hybrid human / software agent teams.
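
    A toy sketch of the kind of complex-event rule such a CEP layer might evaluate (not the Penn State framework itself): an alert is raised only when two related low-level events arrive within a time window, so the analyst's attention is drawn to correlated activity rather than to every raw report. The event names and window length are illustrative assumptions.

```python
# Toy complex-event correlation rule (illustrative only): alert when a
# "second" event follows a "first" event within a sliding time window.
from collections import deque

class CorrelationRule:
    def __init__(self, first_type, second_type, window_s):
        self.first_type, self.second_type, self.window_s = first_type, second_type, window_s
        self._pending = deque()  # timestamps of recent first_type events

    def on_event(self, event_type, timestamp):
        # Drop antecedent events that have fallen out of the window.
        while self._pending and timestamp - self._pending[0] > self.window_s:
            self._pending.popleft()
        if event_type == self.first_type:
            self._pending.append(timestamp)
        elif event_type == self.second_type and self._pending:
            return f"ALERT: {self.first_type} followed by {self.second_type} within {self.window_s}s"
        return None

rule = CorrelationRule("sensor_contact", "radio_burst", window_s=60)
for etype, t in [("sensor_contact", 0), ("noise", 10), ("radio_burst", 45)]:
    alert = rule.on_event(etype, t)
    if alert:
        print(alert)
```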

  6. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  7. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used world wide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables

  8. Algebraic software analysis and embedded simulation of a driving robot

    NARCIS (Netherlands)

    Merkx, L.L.F.; Duringhof, H.M.; Cuijpers, P.J.L.

    2007-01-01

    At TNO Automotive the Generic Driving Actuator (GDA) is developed. The GDA is a device capable of driving a vehicle fully automatically using the same interface as a human driver does. In this paper, the design of the GDA is discussed. The software and hardware of the GDA and its effect on vehicle

  9. Advanced Software V&V for Civil Aviation and Autonomy

    Science.gov (United States)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-computing platform (e.g., advanced graphical processing units or multi-core processors), computationally-intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, like in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus help also reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  10. Improving the Product Documentation Process of a Small Software Company

    Science.gov (United States)

    Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula

    Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies’ financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bugtracking system that was customized to be used as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products and striving for SPI.

  11. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem of a wind turbine generator design project is the design of the right blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speed that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about distribution of forces along the blade span, for different operational conditions. (author)
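
    The quantity such a tool computes for each wind/rotor speed combination can be illustrated, in much-simplified form, by the actuator-disc relation P = ½ρAC_p(λ)v³ with tip-speed ratio λ = ΩR/v. The real tool derives performance from blade geometry and airfoil data (blade-element theory); the C_p curve, rotor radius, and speeds below are placeholders only.

```python
# Simplified illustration of rotor power versus wind and rotor speed via a
# power-coefficient curve Cp(lambda). The Cp curve below is a made-up
# placeholder, not the blade-element model used by the actual software.
import math

RHO = 1.225          # air density, kg/m^3
RADIUS = 2.0         # rotor radius, m (low-power turbine, assumed)

def cp(tip_speed_ratio):
    """Placeholder power-coefficient curve peaking near lambda = 7."""
    return max(0.0, 0.45 - 0.01 * (tip_speed_ratio - 7.0) ** 2)

def rotor_power(wind_speed, rotor_rpm):
    """Mechanical power in watts for one (wind speed, rotor speed) pair."""
    omega = rotor_rpm * 2.0 * math.pi / 60.0
    lam = omega * RADIUS / wind_speed
    area = math.pi * RADIUS ** 2
    return 0.5 * RHO * area * cp(lam) * wind_speed ** 3

for v in (6.0, 8.0, 10.0):
    print(f"{v:4.1f} m/s @ 200 rpm -> {rotor_power(v, 200.0):7.1f} W")
```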

  12. A General Water Resources Regulation Software System in China

    Science.gov (United States)

    LEI, X.

    2017-12-01

    To avoid repeated development of core modules for normal and emergency water resources regulation, and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common demands for water resources regulation and emergency management. It provides a customizable and extensible software framework, open to secondary development, for the three-level platform "MWR-Basin-Province". Meanwhile, this general software system can realize business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve the decision-making ability of national water resources regulation. There are four main modules involved in the general software system: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-develop water resources regulation decision-making systems; 2) a complete set of model-base and model-computing software released in the form of Cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, as well as a model management system to calibrate and configure model parameters; 4) a database which satisfies the business functions and functional requirements of general water resources regulation software and can provide technical support for building basin or regional water resources regulation models.

  13. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  14. Software Productivity of Field Experiments Using the Mobile Agents Open Architecture with Workflow Interoperability

    Science.gov (United States)

    Clancey, William J.; Lowry, Michael R.; Nado, Robert Allen; Sierhuis, Maarten

    2011-01-01

    We analyzed a series of ten systematically developed surface exploration systems that integrated a variety of hardware and software components. Design, development, and testing data suggest that incremental buildup of an exploration system for long-duration capabilities is facilitated by an open architecture with appropriate-level APIs, specifically designed to facilitate integration of new components. This improves software productivity by reducing changes required for reconfiguring an existing system.

  15. Improving Security at Work with Software that Uses OpenMP

    Directory of Open Access Journals (Sweden)

    P. S. Polishuk

    2010-03-01

    Full Text Available A model of the offender is considered, together with a list of the major types of threats whose realization is made possible by software that uses OpenMP. A method is offered for verifying software that uses OpenMP for the presence of vulnerabilities associated with multi-threaded execution. We give the basic algorithms and the system architecture that implement the proposed method. Results of testing the method on various programs, including those containing malicious code, are given, together with an assessment of the possibilities of applying the method in different computing environments.

  16. Fun and software exploring pleasure, paradox and pain in computing

    CERN Document Server

    Goriunova, Olga

    2014-01-01

    Fun and Software offers the untold story of fun as constitutive of the culture and aesthetics of computing. Fun in computing is a mode of thinking, making and experiencing. It invokes and convolutes the question of rationalism and logical reason, addresses the sensibilities and experience of computation and attests to its creative drives. By exploring topics as diverse as the pleasure and pain of the programmer, geek wit, affects of play and coding as a bodily pursuit of the unique in recursive structures, Fun and Software helps construct a different point of entry to the understanding of soft

  17. Using MDA for integration of heterogeneous components in software supply chains

    NARCIS (Netherlands)

    Hartmann, Johan Herman; Keren, Mila; Matsinger, Aart; Rubin, Julia; Trew, Tim; Yatzkar-Haham, Tali

    2013-01-01

    Software product lines are increasingly built using components from specialized suppliers. A company that is in the middle of a supply chain has to integrate components from its suppliers and offer (partially configured) products to its customers. To satisfy both the variability required by each

  18. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    Science.gov (United States)

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .

  19. Report of AAPM Task Group 162: Software for planar image quality metrology.

    Science.gov (United States)

    Samei, Ehsan; Ikejimba, Lynda C; Harrawood, Brian P; Rong, John; Cunningham, Ian A; Flynn, Michael J

    2018-02-01

    The AAPM Task Group 162 aimed to provide a standardized approach for the assessment of image quality in planar imaging systems. This report offers a description of the approach and the details of the resultant software bundle for measuring detective quantum efficiency (DQE), its basis components, and its derivatives. The methodology and the associated software include the characterization of the noise power spectrum (NPS) from planar images acquired under specific acquisition conditions, the modulation transfer function (MTF) using an edge test object, the DQE, and effective DQE (eDQE). First, a methodological framework is provided to highlight the theoretical basis of the work. Then, a step-by-step guide is included to assist in proper execution of each component of the code. Lastly, an evaluation of the method is included to validate its accuracy against model-based and experimental data. The code was built using a Macintosh OSX operating system. The software package contains all the source codes to permit an experienced user to build the suite on a Linux or other *nix type system. The package further includes manuals and sample images and scripts to demonstrate use of the software for new users. The results of the code are in close alignment with theoretical expectations and published results of experimental data. The methodology and the software package offered in AAPM TG162 can be used as a baseline for characterization of inherent image quality attributes of planar imaging systems. © 2017 American Association of Physicists in Medicine.
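
    The relation tying the measured quantities together is the standard planar-detector definition DQE(f) = MTF(f)² / (q · NNPS(f)), with NNPS the normalized noise power spectrum and q the incident photon fluence per unit area. A minimal sketch using placeholder arrays (not TG162 data or the report's actual code):

```python
# Minimal sketch of the standard relation DQE(f) = MTF(f)^2 / (q * NNPS(f)).
# The MTF, NNPS, and fluence values below are placeholders for illustration.
import numpy as np

def dqe(mtf, nnps, fluence_per_mm2):
    """All arrays sampled on the same spatial-frequency axis (cycles/mm)."""
    return mtf ** 2 / (fluence_per_mm2 * nnps)

freq = np.linspace(0.05, 3.0, 60)        # spatial frequency, cycles/mm
mtf = np.exp(-freq / 2.5)                # placeholder MTF
nnps = 6e-6 * (1.0 + 0.1 * freq)         # placeholder NNPS, mm^2
q = 2.5e5                                # placeholder photon fluence, photons/mm^2

print(np.round(dqe(mtf, nnps, q)[:5], 3))   # low-frequency DQE values
```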

  20. WSC-07: Evolving the Web Services Challenge

    NARCIS (Netherlands)

    Blake, M. Brian; Cheung, William K.W.; Jaeger, Michael C.; Wombacher, Andreas

    Service-oriented architecture (SOA) is an evolving architectural paradigm where businesses can expose their capabilities as modular, network-accessible software services. By decomposing capabilities into modular services, organizations can share their offerings at multiple levels of granularity

  1. Computer-assisted operational management of power plants in the field of tension between standard and individual software; IT-unterstuetzte Betriebsfuehrung von Kraftwerken. Im Spannungsfeld von Standard- und Individual-Software

    Energy Technology Data Exchange (ETDEWEB)

    Hippmann, Norbert [RWE Power AG, Essen (Germany). Sparte Steinkohle-/Gas-Kraftwerke

    2010-07-01

    Process routines in the operational management of power plants - particularly maintenance - are now largely planned, controlled and documented with the help of IT. Depending on corporate policy, IT support for routines is currently realised either with commercially available standard ERP software or with dedicated applications that have been specially developed for a given company. Whereas standard software has certain technical benefits (homogeneous databases, data integrity, standard user interface, no software interfaces, standard maintenance and service), customised applications have the undisputed advantage of offering the best possible mapping of company-specific process routines. By exploiting the full spectrum of IT enhancement options of its SAP system, RWE Power has largely combined the respective benefits of both standard and customised software, while also realising high-end user requirements that go beyond the mere standard. (orig.)

  2. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Science.gov (United States)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
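
    A toy illustration of the probabilistic-reasoning step (not the ISWHM implementation, which compiles a Bayesian network into an arithmetic circuit for real-time use): the posterior probability of a software fault given two health monitors, computed by direct enumeration over a three-node network. The monitor names and probabilities are invented.

```python
# Toy Bayesian diagnosis sketch (illustrative only): posterior probability of
# a software fault given two health monitors, by brute-force enumeration.
P_FAULT = 0.01
# P(monitor reads "bad" | fault state) for two hypothetical monitors.
P_TIMING_BAD = {True: 0.90, False: 0.05}    # deadline-miss monitor
P_RANGE_BAD = {True: 0.80, False: 0.02}     # out-of-range output monitor

def posterior_fault(timing_bad: bool, range_bad: bool) -> float:
    def joint(fault: bool) -> float:
        p_t = P_TIMING_BAD[fault] if timing_bad else 1 - P_TIMING_BAD[fault]
        p_r = P_RANGE_BAD[fault] if range_bad else 1 - P_RANGE_BAD[fault]
        prior = P_FAULT if fault else 1 - P_FAULT
        return prior * p_t * p_r
    num = joint(True)
    return num / (num + joint(False))

print(f"P(fault | both monitors bad) = {posterior_fault(True, True):.3f}")
print(f"P(fault | only timing bad)   = {posterior_fault(True, False):.3f}")
```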

  3. Airborne gamma spectrometry - towards integration of European operational capability

    International Nuclear Information System (INIS)

    Toivonen, H.

    2003-01-01

    Full text: A nuclear threat can take several forms. The fallout from nuclear weapons or from an accident in a nuclear power reactor may contaminate a large area (>>100,000 km²) whereas the dispersion of single sources, either accidentally or deliberately (dirty bomb), contaminates a much smaller area, perhaps only a few thousand square kilometres or less. Airborne gamma spectrometry (AGS) plays an important role in providing detailed information on the dispersion of radioactive materials. AGS using a fixed-wing or a rotary-wing aircraft is at its best in fallout mapping and in searching for orphan sources. Plume tracking could be a third application but is very complex, and there is a risk of vehicle contamination, which would deteriorate mission capability in the later phases of an accident. Because of obvious advantages, unmanned aerial vehicles could be used to monitor the release rate at the site of an accident and perhaps the plume itself. The aim of the present paper is to discuss ways to utilize existing European airborne monitoring capabilities for multilateral assistance in an accident and to give some thought to how an integrated system could be developed to take into account various national measuring strategies. In a large-scale accident, it is to be expected that the European countries use their radiological resources to map their own territory. It is realistic to think of assistance by transferring equipment and staff to another country only in accident scenarios where a country or countries with essential AGS capability would not have been affected by the fallout. Various AGS survey results can be fused only if a common platform for data exchange is available. Formats and protocols have been developed for special cases (ECCOMAGS, Nuclear Fission Safety, 4th and 5th Framework Programmes) but there are no universal solutions applicable to different situations and instruments. The hardware and software among the European AGS teams are tailor

  4. A Java-based electronic healthcare record software for beta-thalassaemia.

    Science.gov (United States)

    Deftereos, S; Lambrinoudakis, C; Andriopoulos, P; Farmakis, D; Aessopos, A

    2001-01-01

    Beta-thalassaemia is a hereditary disease, the prevalence of which is high in persons of Mediterranean, African, and Southeast Asian ancestry. In Greece it constitutes an important public health problem. Beta-thalassaemia necessitates continuous and complicated health care procedures such as daily chelation; biweekly transfusions; and periodic cardiology, endocrinology, and hepatology evaluations. Typically, different care items are offered in different, often-distant, health care units, which leads to increased patient mobility. This is especially true in rural areas. Medical records of patients suffering from beta-thalassaemia are inevitably complex and grow in size very fast. They are currently paper-based, scattered over all units involved in the care process. This hinders communication of information between health care professionals and makes processing of the medical records difficult, thus impeding medical research. Our objective is to provide an electronic means for recording, communicating, and processing all data produced in the context of the care process of patients suffering from beta-thalassaemia. We have developed - and we present in this paper - Java-based Electronic Healthcare Record (EHCR) software, called JAnaemia. JAnaemia is a general-purpose EHCR application, which can be customized for use in all medical specialties. Customization for beta-thalassaemia has been performed in collaboration with 4 Greek hospitals. To be capable of coping with patient record diversity, JAnaemia has been based on the EHCR architecture proposed in the ENV 13606:1999 standard, published by the CEN/TC251 committee. Compliance with the CEN architecture also ensures that several additional requirements are fulfilled in relation to clinical comprehensiveness; to record sharing and communication; and to ethical, medico-legal, and computational issues. Special care has been taken to provide a user-friendly, form-based interface for data entry and processing. The

  5. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software...... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud...... Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet the challenges of rapid demand for computing and storage resources. There has been significant success in building cloud-enabled applications for many disciplines ranging from...

  6. Offers

    CERN Document Server

    Staff Association

    2015-01-01

    New offer for our members. The CERN Staff Association has recently concluded a framework agreement with AXA Insurance Ltd, General-Guisan-Strasse 40, 8401 Winterthur. This contract allows you to benefit from preferential tariffs and conditions for the following insurance products: motor vehicle insurance for passenger cars and motorcycles in the STRADA product line: 10% discount; household insurance (personal liability and household contents) in the BOX product line: 10% discount; travel insurance: 10% discount; buildings: 10% discount; legal protection: 10% discount. AXA is number one on the Swiss insurance market. The product range encompasses all non-life insurance such as insurance of persons, property, civil liability, vehicles, credit and travel, as well as innovative and comprehensive solutions in the field of occupational benefits insurance for individuals and businesses. Finally, the affiliate AXA-ARAG (legal expenses insurance) completes the offer. Armed with your Staff Association CERN card, you can always get the off...

  7. Thermal-hydraulic software development for nuclear waste transportation cask design and analysis

    International Nuclear Information System (INIS)

    Brown, N.N.; Burns, S.P.; Gianoulakis, S.E.; Klein, D.E.

    1991-01-01

    This paper describes the development of a state-of-the-art thermal-hydraulic software package intended for spent fuel and high-level nuclear waste transportation cask design and analysis. The objectives of this software development effort are threefold: (1) to take advantage of advancements in computer hardware and software to provide a more efficient user interface, (2) to provide a tool for reducing inefficient conservatism in spent fuel and high-level waste shipping cask design by including convection as well as conduction and radiation heat transfer modeling capabilities, and (3) to provide a thermal-hydraulic analysis package which is developed under a rigorous quality assurance program established at Sandia National Laboratories. 20 refs., 5 figs., 2 tabs

  8. The Cementitious Barriers Partnership (CBP) Software Toolbox Capabilities in Assessing the Degradation of Cementitious Barriers - 13487

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G.P.; Burns, H.H.; Langton, C.; Smith, F.G. III [Savannah River National Laboratory, Savannah River Site, Aiken SC 29808 (United States); Brown, K.G.; Kosson, D.S.; Garrabrants, A.C.; Sarkar, S. [Vanderbilt University, Nashville, TN (United States); Van der Sloot, H. [Hans Van der Sloot Consultancy (Netherlands); Meeussen, J.C.L. [Nuclear Research and Consultancy Group, Petten (Netherlands); Samson, E. [SIMCO Technologies Inc., 1400, boul. du Parc-Technologique, Suite 203, Quebec (Canada); Mallick, P.; Suttora, L. [United States Department of Energy, 1000 Independence Ave. SW, Washington, DC (United States); Esh, D.W.; Fuhrmann, M.J.; Philip, J. [U.S. Nuclear Regulatory Commission, Washington, DC (United States)

    2013-07-01

    The Cementitious Barriers Partnership (CBP) Project is a multi-disciplinary, multi-institutional collaboration supported by the U.S. Department of Energy (US DOE) Office of Tank Waste and Nuclear Materials Management. The CBP program has developed a set of integrated tools (based on state-of-the-art models and leaching test methods) that help improve understanding and predictions of the long-term structural, hydraulic and chemical performance of cementitious barriers used in nuclear applications. Tools selected for and developed under this program have been used to evaluate and predict the behavior of cementitious barriers used in near-surface engineered waste disposal systems for periods of performance up to 100 years and longer for operating facilities and longer than 1000 years for waste disposal. The CBP Software Toolbox has produced tangible benefits to the DOE Performance Assessment (PA) community. A review of prior DOE PAs has provided a list of potential opportunities for improving cementitious barrier performance predictions through the use of the CBP software tools. These opportunities include: 1) impact of atmospheric exposure to concrete and grout before closure, such as accelerated slag and Tc-99 oxidation, 2) prediction of changes in K{sub d}/mobility as a function of time that result from changing pH and redox conditions, 3) concrete degradation from rebar corrosion due to carbonation, 4) early age cracking from drying and/or thermal shrinkage and 5) degradation due to sulfate attack. The CBP has already had opportunity to provide near-term, tangible support to ongoing DOE-EM PAs such as the Savannah River Saltstone Disposal Facility (SDF) by providing a sulfate attack analysis that predicts the extent and damage that sulfate ingress will have on the concrete vaults over extended time (i.e., > 1000 years). This analysis is one of the many technical opportunities in cementitious barrier performance that can be addressed by the DOE-EM sponsored CBP

  9. Enabling Flexible and Continuous Capability Invocation in Mobile Prosumer Environments

    Science.gov (United States)

    Alcarria, Ramon; Robles, Tomas; Morales, Augusto; López-de-Ipiña, Diego; Aguilera, Unai

    2012-01-01

    Mobile prosumer environments require the communication with heterogeneous devices during the execution of mobile services. These environments integrate sensors, actuators and smart devices, whose availability continuously changes. The aim of this paper is to design a reference architecture for implementing a model for continuous service execution and access to capabilities, i.e., the functionalities provided by these devices. The defined architecture follows a set of software engineering patterns and includes some communication paradigms to cope with the heterogeneity of sensors, actuators, controllers and other devices in the environment. In addition, we stress the importance of the flexibility in capability invocation by allowing the communication middleware to select the access technology and change the communication paradigm when dealing with smart devices, and by describing and evaluating two algorithms for resource access management. PMID:23012526

  10. An Analytical Framework for Miles and Snow Typology and Dynamic Capabilities

    Directory of Open Access Journals (Sweden)

    Tomas Sparano Martins

    2014-03-01

    Full Text Available The literature on dynamic capabilities is confusing, full of overlapping definitions and contradictions. The theoretical and practical importance of developing and applying dynamic capabilities to sustain competitive advantage in complex external environments is central in studies about strategy nowadays. In this paper, we offer a definition of dynamic capabilities under two aspects: first, it refers to the shifting character of the environment; second, it emphasizes the key role of strategic management in appropriately adapting, integrating, and re-configuring internal and external organizational skills, resources, and functional competences towards a changing environment. This paper aims to clarify the concept of dynamic capabilities and to propose an analytical framework that connects this “new” concept to a well-known and recognized generic strategic model (Miles and Snow, 1978) and to the concepts of sustainable competitive advantage and evolutionary fit. DOI:10.5585/riae.v13i1.1934

  11. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    Science.gov (United States)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  12. INDUSTRIAL ROBOT ARM SIMULATION SOFTWARE DEVELOPMENT USING JAVA-3D AND MATLAB SIMULINK PROGRAMMING LANGUAGE

    OpenAIRE

    Wirabhuana, Arya

    2011-01-01

    Robot arm simulation software is commonly developed using structured programming languages, third-party languages, or artificial intelligence programming languages. Each of these three techniques has its strengths and weaknesses, depending on constraints such as the robot type, the complexity of the operation to be simulated, operator skills, and computer capability. This paper discusses Robot Arms Simulation Software (RSS) development...
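
    Independently of the development technique chosen, the core of any such simulator is a kinematic model of the arm. The following is a minimal forward-kinematics sketch for a planar two-link arm with made-up link lengths; it illustrates the kind of computation these simulators perform and is not code from the paper.

        import math

        def forward_kinematics_2dof(l1: float, l2: float, theta1: float, theta2: float):
            """End-effector (x, y) of a planar two-link arm; angles in radians."""
            x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
            y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
            return x, y

        # Sweep the elbow joint while holding the shoulder at 30 degrees.
        for deg in range(0, 181, 45):
            x, y = forward_kinematics_2dof(1.0, 0.8, math.radians(30), math.radians(deg))
            print(f"elbow {deg:3d} deg -> x={x:.3f}, y={y:.3f}")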

  13. Application Software for the Cabinet Operator Module of the Reactor Protection System

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Jung, Hae-Won; Lee, Sung-Jin; Koo, Young-Ho; Kim, Seong-Tae; Kwak, Tae-Kil; Jin, Kyo-Hong

    2006-01-01

    A reactor protection system (RPS) generates the reactor trip signal and the engineered safety features (ESF) actuation signal when the monitored plant processes reach predefined limits. A Korean project group, KNICS (Korean Nuclear I and C System), is developing a new digitalized RPS and the Cabinet Operator Module (COM) of the RPS, which is used by equipment operators for RPS integrity testing and monitoring. A flat panel display (FPD) with touch screen capability is provided as the main user interface for the RPS. This paper presents the application software developed for the COM FPD. Equipment operators can monitor the status of the RPS and carry out various tests to verify system functions by means of the application software. A qualified hardware and software development environment is used to develop the application software

  14. Techniques to maximize software reliability in radiation fields

    International Nuclear Information System (INIS)

    Eichhorn, G.; Piercey, R.B.

    1986-01-01

    Microprocessor system failures due to memory corruption by single event upsets (SEUs) and/or latch-up in RAM or ROM memory are common in environments with high radiation flux. Traditional methods to harden microcomputer systems against SEUs and memory latch-up have usually involved expensive large-scale hardware redundancy. Such systems offer higher reliability, but they tend to be more complex and non-standard. At the Space Astronomy Laboratory the authors have developed general programming techniques for producing software that is resistant to such memory failures. These techniques, which may be applied to standard off-the-shelf hardware as well as custom designs, include an implementation of the Maximally Redundant Software (MRS) model, error detection algorithms, and memory verification and management
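
    As a generic illustration of the redundancy idea (a triple-copy majority vote, not the authors' MRS model or their specific detection algorithms), the sketch below shows how a value stored in three copies can survive, and be scrubbed after, a simulated single-event upset.

        from collections import Counter

        class RedundantCell:
            """Keeps three copies of a value and majority-votes on every read."""
            def __init__(self, value: int) -> None:
                self.copies = [value, value, value]

            def write(self, value: int) -> None:
                self.copies = [value, value, value]

            def read(self) -> int:
                value, votes = Counter(self.copies).most_common(1)[0]
                if votes < len(self.copies):
                    # One copy disagreed: rewrite all copies with the voted value.
                    self.copies = [value] * len(self.copies)
                return value

        cell = RedundantCell(0x5A)
        cell.copies[1] ^= 0x04          # simulate an SEU flipping one bit in one copy
        assert cell.read() == 0x5A      # the vote masks the upset and repairs the copy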

  15. Structural Capability of an Organization toward Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    Scholars in the field of strategic management have developed two major approaches to attaining competitive advantage: one based on environmental opportunities, and another based on the internal capabilities of an organization. Some investigations in the last two decades have indicated that advantages relying on the internal capabilities of organizations may determine their competitive position better than environmental opportunities do. Studies of firm characteristics show that one of the internal capabilities that most strongly leads organizations to competitive advantage is the innovation capability. Innovation capability is associated with other organizational capabilities, and many organizations have focused on the need to identify innovation capabilities. This research focuses on recognition of the structural aspect...

  16. Software-defined reconfigurable microwave photonics processor.

    Science.gov (United States)

    Pérez, Daniel; Gasulla, Ivana; Capmany, José

    2015-06-01

    We propose, for the first time to our knowledge, a software-defined reconfigurable microwave photonics signal processor architecture that can be integrated on a chip and is capable of performing all the main functionalities by suitable programming of its control signals. The basic configuration is presented, and a thorough end-to-end design model is derived that accounts for the performance of the overall processor, taking into consideration the impact and interdependencies of both its photonic and RF parts. We demonstrate the model's versatility by applying it to several relevant application examples.

  17. Software diagnostic aids on Aladdin

    International Nuclear Information System (INIS)

    Eisert, D.E.; Stott, J.P.

    1990-01-01

    The upgrade of the Aladdin control system included many enhancements in the software. Some of the new diagnostic aids provided are: intelligent temporary logging of all readbacks for at least the previous 12 hours; permanent logging of specified readbacks into disk files; temporary logging of beam positions at an operator-specified interval; an alarm system for all devices, including range limits for analogue readbacks that should remain fixed and window limits which automatically track readbacks that should decay or increase monotonically; automated checklists to verify the devices are within the specified operating range for a particular phase of operation; and latched digital signals to capture momentary changes. The software has been designed to alert the operator when something is wrong, without generating a flood of unimportant messages, and to make it possible to observe and record readbacks over a range of time scales. The latter capability is essential for tracking down marginal components and correlating observed problems with possible causes. The algorithms used for these diagnostic aids, and how well they perform their desired tasks, are described in this paper. (orig.)
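
    The window-limit behaviour described above can be illustrated with a toy sketch (not the Aladdin code; the function name, window size and readback values are invented): the window re-centres on each in-range sample, so an expected monotonic decay passes quietly while a sudden excursion raises an alarm.

        def window_limit_alarm(readbacks, window, expect="decay"):
            """Return (index, value) pairs that fall outside a tracking window."""
            alarms = []
            centre = readbacks[0]
            for i, value in enumerate(readbacks):
                if not (centre - window <= value <= centre + window):
                    alarms.append((i, value))
                elif expect != "decay" or value <= centre:
                    centre = value  # follow the expected drift of the readback
            return alarms

        # Beam current decaying from 200 mA; the jump at index 4 is flagged.
        print(window_limit_alarm([200.0, 198.5, 197.2, 195.9, 240.0, 194.7], window=5.0))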

  18. Zoneminder as ‘Software as a Service’ and Load Balancing of Video Surveillance Requests

    DEFF Research Database (Denmark)

    Deshmukh, Aaradhana A.; Mihovska, Albena D.; Prasad, Ramjee

    2012-01-01

    Cloud computing is evolving as a key computing platform for sharing resources that include infrastructures, software, applications, and business processes. Virtualization is a core technology for enabling cloud resource sharing. Software as a Service (SaaS) on the cloud platform provides software application vendors a Web-based delivery model to serve a large number of clients with a multi-tenancy based infrastructure and application sharing architecture, so as to benefit greatly from the economy of scale. The emergence of the Software-as-a-Service (SaaS) business model has attracted great attention from both researchers and practitioners. SaaS vendors deliver on-demand information processing services to users, and thus offer computing utility rather than the standalone software itself. This paper proposes a deployment of an open source video surveillance application named Zoneminder...

  19. Conceptualization and software development of a simulation environment for probabilistic safety assessment of radioactive waste repositories

    Energy Technology Data Exchange (ETDEWEB)

    Ghofrani, Javad

    2016-05-26

    Uncertainty and sensitivity analysis of complex simulation models are prominent issues, both in scientific research and education. ReSUS (Repository Simulation, Uncertainty propagation and Sensitivity analysis) is an integrated platform to perform such analysis with numerical models that simulate the THMC (Thermal Hydraulical Mechanical and Chemical) coupled processes via different programs, in particular in the context of safety assessments for radioactive waste repositories. This thesis presents the idea behind the software platform ReSUS and its working mechanisms. Apart from the idea and the working mechanisms, the thesis describes applications related to the safety assessment of radioactive waste disposal systems. In this thesis, previous simulation tools (including the preceding version of ReSUS) are analyzed in order to provide a comprehensive view of the state of the art. In comparison to this state, a more sophisticated software tool is developed here, which provides features which are not offered by previous simulation tools. To achieve this objective, the software platform ReSUS provides a framework for handling probabilistic data uncertainties using deterministic external simulation tools, thus enhancing uncertainty and sensitivity analysis. This platform performs probabilistic simulations of various models, in particular THMC coupled processes, using stand-alone deterministic simulation software tools. The complete software development process of the ReSUS Platform is discussed in this thesis. ReSUS components are developed as libraries, which are capable of being linked to other code implementations. In addition, ASCII template files are used as means for uncertainty propagation into the input files of deterministic simulation tools. The embedded input sampler and analysis tools allow for sensitivity analysis in several kinds of simulation designs. The novelty of the ReSUS platform consists in the flexibility to assign external stand-alone software
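
    A minimal sketch of the template-file mechanism described above, assuming a hypothetical placeholder syntax and parameter names (this is not ReSUS itself): sampled parameter values are substituted into an ASCII input template to produce one input deck per probabilistic run of an external deterministic code.

        import random
        import string

        TEMPLATE = string.Template(
            "permeability = $permeability\n"
            "porosity     = $porosity\n"
        )

        def sample_parameters(rng: random.Random) -> dict:
            # Illustrative distributions only; a real study takes them from the PA model.
            return {
                "permeability": f"{rng.lognormvariate(-30.0, 1.0):.3e}",
                "porosity": f"{rng.uniform(0.05, 0.25):.3f}",
            }

        rng = random.Random(42)
        for run in range(3):
            input_deck = TEMPLATE.substitute(sample_parameters(rng))
            print(f"--- run_{run:03d}.in ---\n{input_deck}")
            # In practice the deck would be written to disk and the external
            # simulator launched on it (e.g. via subprocess), with its outputs
            # collected afterwards for uncertainty and sensitivity analysis.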

  20. Conceptualization and software development of a simulation environment for probabilistic safety assessment of radioactive waste repositories

    International Nuclear Information System (INIS)

    Ghofrani, Javad

    2016-01-01

    Uncertainty and sensitivity analysis of complex simulation models are prominent issues, both in scientific research and education. ReSUS (Repository Simulation, Uncertainty propagation and Sensitivity analysis) is an integrated platform to perform such analysis with numerical models that simulate the THMC (Thermal Hydraulical Mechanical and Chemical) coupled processes via different programs, in particular in the context of safety assessments for radioactive waste repositories. This thesis presents the idea behind the software platform ReSUS and its working mechanisms. Apart from the idea and the working mechanisms, the thesis describes applications related to the safety assessment of radioactive waste disposal systems. In this thesis, previous simulation tools (including the preceding version of ReSUS) are analyzed in order to provide a comprehensive view of the state of the art. In comparison to this state, a more sophisticated software tool is developed here, which provides features which are not offered by previous simulation tools. To achieve this objective, the software platform ReSUS provides a framework for handling probabilistic data uncertainties using deterministic external simulation tools, thus enhancing uncertainty and sensitivity analysis. This platform performs probabilistic simulations of various models, in particular THMC coupled processes, using stand-alone deterministic simulation software tools. The complete software development process of the ReSUS Platform is discussed in this thesis. ReSUS components are developed as libraries, which are capable of being linked to other code implementations. In addition, ASCII template files are used as means for uncertainty propagation into the input files of deterministic simulation tools. The embedded input sampler and analysis tools allow for sensitivity analysis in several kinds of simulation designs. The novelty of the ReSUS platform consists in the flexibility to assign external stand-alone software