WorldWideScience

Sample records for surgiplan computer-planning software

  1. Analysis of secondary coxarthrosis by three dimensional computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hemmi, Osamu [Keio Univ., Tokyo (Japan). School of Medicine]

    1997-11-01

    The majority of coxarthrosis in Japan is due to congenital dislocation of the hip and acetabular dysplasia. Until now, coxarthrosis has been analyzed chiefly on the basis of anterior-posterior radiographs. By using three-dimensional (3D) CT, it was possible to analyze the morphological features of secondary coxarthrosis more accurately, and by using new computer graphics software, it was possible to display the contact area in the hip joint and observe changes associated with progression of the stages of the disease. There were 34 subjects (68 joints), all of whom were women. The CT data were read into a workstation, and 3D reconstruction was achieved with hip surgery simulation software (SurgiPlan). Pelvic inclination, acetabular anteversion, seven parameters indicating the investment of the femoral head, and two indicating the position of the hip joint in the pelvis were measured. The results showed that secondary coxarthrosis is characterized not only by lateral malposition of the hip joint according to the pelvic coordinates, but by anterior malposition as well. Many other measurements provided 3D information on the acetabular dysplasia. Many of them were correlated with the CE angle on plain radiographs. Furthermore, a strong correlation was not found between anterior and posterior acetabular coverage of the femoral head. In addition, SurgiPlan's distance mapping function enabled 3D observation of the pattern of progression of arthrosis based on the pattern of progression of joint space narrowing. (author)
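
    The abstract notes that many of the 3D measurements correlate with the center-edge (CE) angle measured on plain radiographs. As a rough illustration of that reference measurement (not of SurgiPlan's own algorithms), the sketch below computes Wiberg's CE angle from two landmark coordinates on an anterior-posterior radiograph; the landmark values are invented for the example.

        import math

        def ce_angle(head_center, lateral_acetabular_edge):
            """Wiberg's center-edge (CE) angle: the angle between a vertical line
            through the femoral head center and the line from the head center to
            the lateral edge of the acetabulum. Coordinates are
            (lateral_offset, cranial_offset) in millimetres; a negative lateral
            offset (edge medial to the head center) yields a negative,
            dysplastic angle."""
            dx = lateral_acetabular_edge[0] - head_center[0]
            dy = lateral_acetabular_edge[1] - head_center[1]
            return math.degrees(math.atan2(dx, dy))

        # Hypothetical landmark coordinates, for illustration only.
        print(round(ce_angle((0.0, 0.0), (12.0, 30.0)), 1))  # ~21.8 degrees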

  2. Analysis of secondary coxarthrosis by three dimensional computed tomography

    International Nuclear Information System (INIS)

    Hemmi, Osamu

    1997-01-01

    The majority of coxarthrosis in Japan is due to congenital dislocation of the hip and acetabular dysplasia. Until now, coxarthrosis has been analyzed chiefly on the basis of anterior-posterior radiographs. By using three-dimensional (3D) CT, it was possible to analyze the morphological features of secondary coxarthrosis more accurately, and by using new computer graphics software, it was possible to display the contact area in the hip joint and observe changes associated with progression of the stages of the disease. There were 34 subjects (68 joints), all of whom were women. The CT data were read into a workstation, and 3D reconstruction was achieved with hip surgery simulation software (SurgiPlan). Pelvic inclination, acetabular anteversion, seven parameters indicating the investment of the femoral head, and two indicating the position of the hip joint in the pelvis were measured. The results showed that secondary coxarthrosis is characterized not only by lateral malposition of the hip joint according to the pelvic coordinates, but by anterior malposition as well. Many other measurements provided 3D information on the acetabular dysplasia. Many of them were correlated with the CE angle on plain radiographs. Furthermore, a strong correlation was not found between anterior and posterior acetabular coverage of the femoral head. In addition, SurgiPlan's distance mapping function enabled 3D observation of the pattern of progression of arthrosis based on the pattern of progression of joint space narrowing. (author)

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  4. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality-affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  5. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality-affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  6. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  7. Computer software configuration management plan for the Honeywell modular automation system

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a computer software configuration management plan for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). The system will be used to control new thermal stabilization furnaces, a vertical denitrator calciner, and a pyrolysis furnace.

  8. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management's and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  9. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Graf, F.A. Jr.

    1995-02-27

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. The software also controls the Treated Effluent Disposal System's pumping stations and monitors waste generator flows in that system as well as in the Phase Two Effluent Collection System.

  10. Computer software configuration management plan for 200 East/West Liquid Effluent Facilities

    International Nuclear Information System (INIS)

    Graf, F.A. Jr.

    1995-01-01

    This computer software configuration management plan covers the control of the software for the monitor and control system that operates the Effluent Treatment Facility and its associated truck load-in station, as well as some key aspects of the Liquid Effluent Retention Facility, which stores condensate to be processed. The software also controls the Treated Effluent Disposal System's pumping stations and monitors waste generator flows in that system as well as in the Phase Two Effluent Collection System.

  11. The manual of a computer software 'FBR Plant Planning Design Prototype System'

    International Nuclear Information System (INIS)

    2003-10-01

    This is the manual for the computer software 'FBR Plant Planning Design Prototype System', which enables users to conduct case studies of FBR design concepts derived from 'MONJU'. The calculations proceed simply as the user clicks the displayed buttons, so a step-by-step explanation is not considered necessary. The following pages therefore introduce only the particular features of this software, i.e., the interactive screens, the functions of the buttons and the consequences of clicking them, and the quitting procedure. (author)

  12. Upgrade Software and Computing

    CERN Document Server

    The LHCb Collaboration, CERN

    2018-01-01

    This document reports the Research and Development activities that are carried out in the software and computing domains in view of the upgrade of the LHCb experiment. The implementation of a full software trigger implies major changes in the core software framework, in the event data model, and in the reconstruction algorithms. The increase of the data volumes for both real and simulated datasets requires a corresponding scaling of the distributed computing infrastructure. An implementation plan in both domains is presented, together with a risk assessment analysis.

  13. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  14. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  15. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations, and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  16. Light Duty Utility Arm computer software configuration management plan

    International Nuclear Information System (INIS)

    Philipp, B.L.

    1998-01-01

    This plan describes the configuration management for the Light Duty Utility Arm robotic manipulator arm control software. It identifies the requirements, associated documents, and the software control methodology. The Light Duty Utility Arm (LDUA) System is a multi-axis robotic manipulator arm and deployment vehicle, used to perform surveillance and characterization operations in support of remediation of defense nuclear wastes currently stored in the Hanford Underground Storage Tanks (USTs) through the available 30.5 cm (12 in.) risers. This plan describes the configuration management of the LDUA software.

  17. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 3

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.

    1991-11-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference that documents the plans of each organization for office automation, identifies appropriate planners and other contact people in those organizations, and encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan.

  18. Survey of ANL organization plans for word processors, personal computers, workstations, and associated software. Revision 4

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R.; Rockwell, V.S.

    1992-08-01

    The Computing and Telecommunications Division (CTD) has compiled this Survey of ANL Organization Plans for Word Processors, Personal Computers, Workstations, and Associated Software (ANL/TM, Revision 4) to provide DOE and Argonne with a record of recent growth in the acquisition and use of personal computers, microcomputers, and word processors at ANL. Laboratory planners, service providers, and people involved in office automation may find the Survey useful. It is for internal use only, and any unauthorized use is prohibited. Readers of the Survey should use it as a reference document that (1) documents the plans of each organization for office automation, (2) identifies appropriate planners and other contact people in those organizations, and (3) encourages the sharing of this information among those people making plans for organizations and decisions about office automation. The Survey supplements information in both the ANL Statement of Site Strategy for Computing Workstations (ANL/TM 458) and the ANL Site Response for the DOE Information Technology Resources Long-Range Plan (ANL/TM 466).

  19. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  20. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  1. FY95 software project management plan: TMACS, CASS computer systems

    International Nuclear Information System (INIS)

    Spurling, D.G.

    1994-01-01

    The FY95 Work Plan for TMACS and CASS Software Projects describes the activities planned for the current fiscal year. This plan replaces WHC-SD-WM-SDP-008. The TMACS project schedule is included in the TWRS Integrated Schedule.

  2. Software quality assurance plan for PORFLOW-3D

    International Nuclear Information System (INIS)

    Maheras, S.J.

    1993-03-01

    This plan describes the steps taken by the Idaho National Engineering Laboratory Subsurface and Environmental Modeling Unit personnel to implement software quality assurance procedures for the PORFLOW-3D computer code. PORFLOW-3D was used to conduct radiological performance assessments at the Savannah River Site. Software quality assurance procedures for PORFLOW-3D include software acquisition, installation, testing, operation, maintenance, and retirement. Configuration control and quality assurance procedures are also included or referenced in this plan.

  3. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    International Nuclear Information System (INIS)

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, providing the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  4. RELAP-7 Software Verification and Validation Plan

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support

    2014-09-25

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  5. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System.

  6. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and maintenance. We proposed that some software engineering principles can be incorporated into the introductory level of the computer science curriculum. Our vision is to give community college students a broader exposure to the software development lifecycle. For those students who plan to transfer to a baccalaureate program subsequent to their community college education, our vision is to prepare them sufficiently to move seamlessly into mainstream computer science and software engineering degrees. For those students who plan to move from the community college to a programming career, our vision is to equip them with the foundational knowledge and skills required by the software industry. To accomplish our goals, we developed curriculum modules for teaching seven of the software engineering knowledge areas within current computer science introductory-level courses. Each module was designed to be self-supported with suggested learning objectives, teaching outline, software tool support, teaching activities, and other material to assist the instructor in using it.

  7. Software Development Plan for DESCARTES and CIDER

    International Nuclear Information System (INIS)

    Eslinger, P.W.

    1992-01-01

    This Software Development Plan (SDP) outlines all software activities required to obtain functional environmental accumulation and individual dose codes for the Hanford Environmental Dose Reconstruction (HEDR) project. The modeling activities addressed use the output of the air transport code HATCHET to compute radionuclide concentrations in environmental pathways and continue on through calculations of dose for individuals. The HEDR Project has a deliverable in the June 1993 time frame to be able to start computing doses to individuals from nuclear-related activities on the Hanford Site during and following World War II. The CIDER code will compute doses and their uncertainties for individuals living in the contaminated environment computed by DESCARTES. The projected size of the code is 3000 lines.
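
    The abstract describes a two-stage pipeline: DESCARTES converts air-transport output into environmental radionuclide concentrations, and CIDER turns those concentrations into individual doses. A minimal sketch of the final dose step under standard inhalation-dosimetry assumptions is shown below; the nuclides, concentrations, and dose conversion factors are illustrative placeholders, not HEDR data.

        # Hypothetical annual-average air concentrations (Bq/m^3), standing in for
        # DESCARTES-style output, and inhalation dose conversion factors (Sv/Bq).
        air_concentration = {"I-131": 0.8, "Ru-103": 0.2}
        dose_conversion = {"I-131": 7.4e-9, "Ru-103": 3.0e-9}
        breathing_rate = 8400.0  # m^3 per year, assumed adult value

        # Individual inhalation dose = sum over nuclides of
        # concentration * breathing rate * dose conversion factor.
        dose_sv = sum(air_concentration[n] * breathing_rate * dose_conversion[n]
                      for n in air_concentration)
        print(f"annual inhalation dose: {dose_sv:.2e} Sv")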

  8. Acceptance Test Plan for ANSYS Software

    International Nuclear Information System (INIS)

    CREA, B.A.

    2000-01-01

    This plan governs the acceptance testing of the ANSYS software (Full Mechanical Release 5.5) for use on Project Hanford Management Contract (PHMC) computer systems (either UNIX or Microsoft Windows/NT). There are two phases to the acceptance testing covered by this test plan: program execution in accordance with the guidance provided in the installation manuals, and ensuring that the results of the execution are consistent with the expected physical behavior of the system being modeled.

  9. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM R XS (TXS) system is classified 1E, as defined in the Inst. of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  10. An AmI-Based Software Architecture Enabling Evolutionary Computation in Blended Commerce: The Shopping Plan Application

    Directory of Open Access Journals (Sweden)

    Giuseppe D’Aniello

    2015-01-01

    This work describes an approach to synergistically exploit ambient intelligence technologies, mobile devices, and evolutionary computation in order to support blended commerce or ubiquitous commerce scenarios. The work proposes a software architecture consisting of three main components: linked data for e-commerce, cloud-based services, and mobile apps. The three components implement a scenario where a shopping mall is presented as an intelligent environment in which customers use the NFC capabilities of their smartphones in order to handle e-coupons produced, suggested, and consumed by that environment. The main function of the intelligent environment is to help customers define shopping plans, which minimize the overall shopping cost by looking for best prices, discounts, and coupons. The paper proposes a genetic algorithm to find suboptimal solutions for the shopping plan problem in a highly dynamic context, where the final cost of a product for an individual customer is dependent on his previous purchases. In particular, the work provides details on the Shopping Plan software prototype and some experimentation results showing the overall performance of the genetic algorithm.
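
    The record describes a genetic algorithm that searches for a low-cost shopping plan when the effective price of each product depends on the shop, discounts, and coupons. The following is a minimal, self-contained sketch of that idea; the price table, travel-cost term, and GA parameters are assumptions for illustration, not the algorithm actually used in the Shopping Plan prototype.

        import random

        # Hypothetical price table: price[item][shop] is assumed to already
        # include any discount or coupon the environment would apply.
        PRICES = {
            "milk":   {"shopA": 1.10, "shopB": 0.95, "shopC": 1.20},
            "bread":  {"shopA": 2.30, "shopB": 2.50, "shopC": 2.10},
            "coffee": {"shopA": 6.80, "shopB": 6.40, "shopC": 7.00},
        }
        ITEMS = list(PRICES)
        SHOPS = ["shopA", "shopB", "shopC"]
        TRAVEL_COST = 0.50  # assumed fixed cost per distinct shop visited

        def cost(plan):
            """Total cost of a plan: item prices plus travel cost per shop used."""
            total = sum(PRICES[item][shop] for item, shop in zip(ITEMS, plan))
            return total + TRAVEL_COST * len(set(plan))

        def genetic_shopping_plan(generations=200, pop_size=30, mutation=0.1):
            pop = [[random.choice(SHOPS) for _ in ITEMS] for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=cost)                      # lower cost = fitter
                parents = pop[: pop_size // 2]          # truncation selection
                children = []
                while len(children) < pop_size - len(parents):
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, len(ITEMS))
                    child = a[:cut] + b[cut:]           # one-point crossover
                    if random.random() < mutation:      # random-reset mutation
                        child[random.randrange(len(ITEMS))] = random.choice(SHOPS)
                    children.append(child)
                pop = parents + children
            return min(pop, key=cost)

        best = genetic_shopping_plan()
        print(dict(zip(ITEMS, best)), round(cost(best), 2))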

  11. Tank monitor and control system (TMACS) software configuration management plan

    International Nuclear Information System (INIS)

    GLASSCOCK, J.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) describes the methodology for control of computer software developed and supported by the Systems Development and Integration (SD and I) organization of Lockheed Martin Services, Inc. (LMSI) for the Tank Monitor and Control System (TMACS). This plan controls changes to the software and configuration files used by TMACS. The controlled software includes the Gensym software package, Gensym knowledge base files developed for TMACS, C-language programs used by TMACS, the operating system on the production machine, language compilers, and all Windows NT commands and functions which affect the operating environment. The configuration files controlled include the files downloaded to the Acromag and Westronic field instruments.

  12. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  13. The SSCL framework software plans

    International Nuclear Information System (INIS)

    Frederiksen, S.

    1993-12-01

    In about ten years the Superconducting Super Collider Laboratory (SSCL) will be producing 40 TeV proton-proton interactions. The size and scale of the effort demand new approaches to design and develop software used by the experimental collaborations. The Physics Research Division Computing Department (PRCD) of the SSCL is developing (in collaboration with the Solenoidal Detector Collaboration (SDC) and Gamma, Electron and Muon (GEM) collaborations) a support system which will be used to build and run the collaboration software. It will be used for simulating the events needed for detector development and for the analysis of these complicated events. The plans and status of this program will be discussed.

  14. Windows Calorimeter Control (WinCal) program computer software configuration management plan

    International Nuclear Information System (INIS)

    1997-01-01

    This document describes the system configuration management activities performed in support of the Windows Calorimeter Control (WinCal) system, in accordance with Site procedures based on Institute of Electrical and Electronic Engineers (IEEE) Standard 828-1990, Standard for Software Configuration Management Plans (IEEE 1990) and IEEE Standard 1042-1987, Guide to Software Configuration Management (IEEE 1987).

  15. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of the computer hardware and software for digital control computers (DCC). Specific design and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program, involving complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  16. MCNP™ Software Quality Assurance Plan

    International Nuclear Information System (INIS)

    Abhold, H.M.; Hendricks, J.S.

    1996-04-01

    MCNP is a computer code that models the interaction of radiation with matter. MCNP is developed and maintained by the Transport Methods Group (XTM) of the Los Alamos National Laboratory (LANL). This plan describes the Software Quality Assurance (SQA) program applied to the code. The SQA program is consistent with the requirements of IEEE-730.1 and the guiding principles of ISO 9000.

  17. Saltwell PIC Skid Programmable Logic Controller (PLC) Software Configuration Management Plan

    International Nuclear Information System (INIS)

    KOCH, M.R.

    1999-01-01

    This document provides the procedures and guidelines necessary for computer software configuration management activities during the operation and maintenance phases of the Saltwell PIC Skids, as required by LMH-PRO-309/Rev. 0, Computer Software Quality Assurance, Section 2.6, Software Configuration Management. The software configuration management plan (SCMP) integrates technical and administrative controls to establish and maintain technical consistency among requirements, physical configuration, and documentation for the Saltwell PIC Skid Programmable Logic Controller (PLC) software during the Hanford application, operations, and maintenance. This SCMP establishes the Saltwell PIC Skid PLC Software Baseline, tracks status changes to that baseline, and ensures that the software meets design and operational requirements and is tested in accordance with its design basis.

  18. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program.
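
    A DRR is formed by casting rays from a virtual X-ray source through the planning CT volume and accumulating the attenuation each ray encounters. The sketch below shows the core summation for a simplified parallel-beam geometry with unit voxel spacing; it illustrates the general technique only and is not the algorithm implemented in the tool described above.

        import numpy as np

        def drr_parallel(ct_volume, mu_water=0.02):
            """Very simplified parallel-beam DRR along the volume's z axis.

            ct_volume: 3D array of Hounsfield units, shape (nz, ny, nx).
            Voxel spacing is taken as 1 mm for simplicity. Returns a 2D image
            proportional to the transmitted intensity exp(-line integral of mu)."""
            # Convert Hounsfield units to linear attenuation coefficients (1/mm).
            mu = np.clip(mu_water * (1.0 + ct_volume / 1000.0), 0.0, None)
            # For parallel rays along z, the line integral is a sum over slices.
            line_integral = mu.sum(axis=0)          # shape (ny, nx)
            return np.exp(-line_integral)

        # Hypothetical test volume: water-equivalent background (0 HU)
        # with a bone-like cube inside.
        vol = np.zeros((64, 64, 64))
        vol[20:40, 24:40, 24:40] = 800.0            # Hounsfield units
        image = drr_parallel(vol)
        print(image.shape, float(image.min()), float(image.max()))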

  19. Evaluation procedure of software safety plan for digital I and C of KNGR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kwon, Ki Choon; Kim, Jang Yeol; Cheon, Se Woo

    2000-05-01

    The development, use, and regulation of computer systems in nuclear reactor instrumentation and control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean next generation reactor (KNGR) software safety verification and validation (SSVV) task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the code and standards layer and the design methodology and documents layer for the software important to safety in nuclear power plants. Recently, safety planning for safety-critical software systems has come to be recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by the regulatory and standardization organizations. The requirements for software important to safety of nuclear reactors are described in such positions and standards, for example, the new standard review plan (SRP), the IEC 880 supplements, IEEE standard 1228-1994, IEEE standard 7-4.3.2-1993, and IAEA safety series No. 50-SG-D3 and D8. We present the guidance for evaluating the safety plan of the software in the KNGR protection systems. The guideline consists of the regulatory requirements for software safety in chapter 2, the evaluation checklist of the software safety plan in chapter 3, and the evaluation results of the KNGR software safety plan in chapter 4.

  20. Evaluation procedure of software safety plan for digital I and C of KNGR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kwon, Ki Choon; Kim, Jang Yeol; Cheon, Se Woo

    2000-05-01

    The development, use, and regulation of computer systems in nuclear reactor instrumentation and control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean next generation reactor (KNGR) software safety verification and validation (SSVV) task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the code and standards layer and the design methodology and documents layer for the software important to safety in nuclear power plants. Recently, safety planning for safety-critical software systems has come to be recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by the regulatory and standardization organizations. The requirements for software important to safety of nuclear reactors are described in such positions and standards, for example, the new standard review plan (SRP), the IEC 880 supplements, IEEE standard 1228-1994, IEEE standard 7-4.3.2-1993, and IAEA safety series No. 50-SG-D3 and D8. We present the guidance for evaluating the safety plan of the software in the KNGR protection systems. The guideline consists of the regulatory requirements for software safety in chapter 2, the evaluation checklist of the software safety plan in chapter 3, and the evaluation results of the KNGR software safety plan in chapter 4.

  1. Tank monitor and control system (TMACS) software configuration management plan; TOPICAL

    International Nuclear Information System (INIS)

    GLASSCOCK, J.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) describes the methodology for control of computer software developed and supported by the Systems Development and Integration (SD and I) organization of Lockheed Martin Services, Inc. (LMSI) for the Tank Monitor and Control System (TMACS). This plan controls changes to the software and configuration files used by TMACS. The controlled software includes the Gensym software package, Gensym knowledge base files developed for TMACS, C-language programs used by TMACS, the operating system on the production machine, language compilers, and all Windows NT commands and functions which affect the operating environment. The configuration files controlled include the files downloaded to the Acromag and Westronic field instruments.

  2. Configuration management plan for the GENII software

    International Nuclear Information System (INIS)

    Rittmann, P.D.

    1994-01-01

    The GENII program calculates doses from radionuclides released into the environment for a variety of possible exposure scenarios. The user prepares an input data file with the necessary modelling assumptions and parameters. The program reads the user's input file, computes the necessary doses and stores these results in an output file. The output file also contains a listing of the user's input and gives the title lines from the data libraries which are accessed in the course of the calculations. The purpose of this document is to provide users of the GENII software with the configuration controls which are planned for use by WHC in accordance with WHC-CM-3-10. The controls are solely for WHC employees. Non-WHC individuals are not excluded, but no promise is made or implied that they will be informed of errors or revisions to the software. The configuration controls cover the GENII software, the GENII user's guide, the list of GENII users at WHC, and the backup copies. Revisions to the software must be approved prior to distribution in accordance with this configuration management plan.

  3. Some recent developments in treatment planning software and methodology for BNCT

    International Nuclear Information System (INIS)

    Nigg, D.W.; Wheeler, F.J.; Wessol, D.E.; Wemple, C.A.; Babcock, R.; Capala, J.

    1996-01-01

    Over the past several years, the Idaho National Engineering Laboratory (INEL) has led the development of a unique, internationally-recognized set of software modules (BNCT-rtpe) for computational dosimetry and treatment planning for Boron Neutron Capture Therapy (BNCT). The computational capability represented by this software is essential to the proper administration of all forms of radiotherapy for cancer. Such software addresses the need to perform pretreatment computation and optimization of the radiation dose distribution in the target volume. This permits the achievement of the optimal therapeutic ratio (tumor dose relative to critical normal tissue dose) for each individual patient via a systematic procedure for specifying the appropriate irradiation parameters to be employed for a given treatment. These parameters include angle of therapy beam incidence, beam aperture and shape, and beam intensity as a function of position across the beam front. The INEL software is used for treatment planning in the current series of human glioma trials at Brookhaven National Laboratory (BNL) and has also been licensed for research and developmental purposes to several other BNCT research centers in the US and in Europe.
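
    The quantity being optimized here is the therapeutic ratio, i.e. the tumor dose relative to the dose received by critical normal tissue. The following is a minimal sketch of how a planning loop might rank candidate beam angles by that ratio; the dose values and angles are invented placeholders, not output of the INEL software.

        def therapeutic_ratio(tumor_dose, normal_tissue_dose):
            """Tumor dose relative to the critical normal-tissue dose."""
            return tumor_dose / normal_tissue_dose

        # Assumed output of a dose engine for three candidate beam angles
        # (doses in arbitrary units, for illustration only).
        candidate_plans = {
            0:  {"tumor": 42.0, "normal": 11.5},
            30: {"tumor": 45.0, "normal": 10.2},
            60: {"tumor": 40.5, "normal": 12.8},
        }

        best_angle = max(
            candidate_plans,
            key=lambda a: therapeutic_ratio(candidate_plans[a]["tumor"],
                                            candidate_plans[a]["normal"]),
        )
        print("best beam angle:", best_angle, "degrees")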

  4. Some recent developments in treatment planning software and methodology for BNCT

    International Nuclear Information System (INIS)

    Nigg, D.W.; Wheeler, F.J.; Wessol, D.E.

    1996-01-01

    Over the past several years, the Idaho National Engineering Laboratory (INEL) has led the development of a unique, internationally-recognized set of software modules (BNCT-rtpe) for computational dosimetry and treatment planning for Boron Neutron Capture Therapy (BNCT). The computational capability represented by this software is essential to the proper administration of all forms of radiotherapy for cancer. Such software addresses the need to perform pretreatment computation and optimization of the radiation dose distribution in the target volume. This permits the achievement of the optimal therapeutic ratio (tumor dose relative to critical normal tissue dose) for each individual patient via a systematic procedure for specifying the appropriate irradiation parameters to be employed for a given treatment. These parameters include angle of therapy beam incidence, beam aperture and shape, and beam intensity as a function of position across the beam front. The INEL software is used for treatment planning in the current series of human glioma trials at Brookhaven National Laboratory (BNL) and has also been licensed for research and developmental purposes to several other BNCT research centers in the US and in Europe.

  5. Computer assisted SCFE osteotomy planning

    International Nuclear Information System (INIS)

    Drapikowski, Pawel; Tyrakowski, Marcin; Czubak, Jaroslaw; Czwojdzinski, Adam

    2008-01-01

    Slipped capital femoral epiphysis (SCFE) is a common pediatric orthopedic disorder that requires surgical correction. Preoperative planning of a proximal femoral osteotomy is essential in cases of SCFE. This planning is usually done using 2D radiographs, but 3D data can be acquired with CT and analyzed with 3D visualization software. SCFEanalyzer is a computer program developed for preoperative planning of proximal femoral osteotomy to correct SCFE. Computed tomography scans were performed on human bone specimens, one pelvis and two femoral bones (right and left); volume data of a patient were also used. The CT data were used to test the abilities of the SCFEanalyzer software, which utilizes 3D virtual models of anatomic structures constructed from CT image data. Separation of anatomical bone structures is done by means of 'cutting' the 3D surface model of the pelvis. The software enables qualitative and quantitative spatial analysis of chosen parameters analogous to those done on the basis of plain radiographs. SCFEanalyzer makes it possible to evaluate the function of the hip joint by calculating the range of motion, depending on the shape of bone structures, based on an oriented bounding box object representation. Pelvic and hip CT scans from a patient with SCFE were subjected to femoral geometry analysis and hip joint function assessment. These were done to plan and simulate osteotomy of the proximal femur. Analogous qualitative and quantitative evaluations were performed after the virtual surgery to determine the potential treatment effects. The use of computer assistance in preoperative planning enables us to increase objectivity and repeatability, and to compare the results of different types of osteotomy on the proximal femur, and thus to choose the optimal operation in each individual case. (orig.)
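
    The range-of-motion calculation described above relies on bounding-box representations of the bone surfaces. The sketch below illustrates the general idea with a deliberately simplified model: femoral surface points are rotated about the hip center in one-degree steps of flexion until any point enters an axis-aligned box standing in for the pelvis. The record describes oriented bounding boxes; the axis-aligned box and all geometry values here are invented for illustration only.

        import numpy as np

        def rotation_about_x(deg):
            """Rotation matrix for flexion modelled as rotation about the x axis."""
            r = np.radians(deg)
            c, s = np.cos(r), np.sin(r)
            return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

        def flexion_range(femur_points, box_min, box_max, hip_center,
                          step_deg=1.0, max_deg=150.0):
            """Increase flexion until any femoral point enters the pelvis box;
            return the last collision-free flexion angle in degrees."""
            rel = femur_points - hip_center
            angle = 0.0
            while angle <= max_deg:
                moved = rel @ rotation_about_x(angle).T + hip_center
                inside = np.all((moved >= box_min) & (moved <= box_max), axis=1)
                if inside.any():
                    return angle - step_deg
                angle += step_deg
            return max_deg

        # Hypothetical geometry in millimetres: a few femoral surface points
        # and a box representing the anterior pelvis/acetabular rim.
        femur = np.array([[10.0, 40.0, -5.0], [12.0, 42.0, -3.0], [8.0, 38.0, -6.0]])
        rom = flexion_range(femur, np.array([0.0, 30.0, 20.0]),
                            np.array([30.0, 60.0, 60.0]), np.array([0.0, 0.0, 0.0]))
        print("collision-free flexion:", rom, "degrees")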

  6. Computer assisted SCFE osteotomy planning

    Energy Technology Data Exchange (ETDEWEB)

    Drapikowski, Pawel [Poznan University of Technology, Institute of Control and Information Engineering, Poznan (Poland); Tyrakowski, Marcin; Czubak, Jaroslaw; Czwojdzinski, Adam [Postgraduate Medical Education Center, Department of Orthopaedics, Warsaw (Poland)

    2008-11-15

    Slipped capital femoral epiphysis (SCFE) is a common pediatric orthopedic disorder that requires surgical correction. Preoperative planning of a proximal femoral osteotomy is essential in cases of SCFE. This planning is usually done using 2D radiographs, but 3D data can be acquired with CT and analyzed with 3D visualization software. SCFEanalyzer is a computer program developed for preoperative planning of proximal femoral osteotomy to correct SCFE. Computed tomography scans were performed on human bone specimens, one pelvis and two femoral bones (right and left); volume data of a patient were also used. The CT data were used to test the abilities of the SCFEanalyzer software, which utilizes 3D virtual models of anatomic structures constructed from CT image data. Separation of anatomical bone structures is done by means of 'cutting' the 3D surface model of the pelvis. The software enables qualitative and quantitative spatial analysis of chosen parameters analogous to those done on the basis of plain radiographs. SCFEanalyzer makes it possible to evaluate the function of the hip joint by calculating the range of motion, depending on the shape of bone structures, based on an oriented bounding box object representation. Pelvic and hip CT scans from a patient with SCFE were subjected to femoral geometry analysis and hip joint function assessment. These were done to plan and simulate osteotomy of the proximal femur. Analogous qualitative and quantitative evaluations were performed after the virtual surgery to determine the potential treatment effects. The use of computer assistance in preoperative planning enables us to increase objectivity and repeatability, and to compare the results of different types of osteotomy on the proximal femur, and thus to choose the optimal operation in each individual case. (orig.)

  7. CMS software and computing for LHC Run 2

    CERN Document Server

    INSPIRE-00067576

    2016-11-09

    The CMS offline software and computing system has successfully met the challenge of LHC Run 2. In this presentation, we will discuss how the entire system was improved in anticipation of increased trigger output rate, increased rate of pileup interactions and the evolution of computing technology. The primary goals behind these changes were to increase the flexibility of computing facilities wherever possible, to increase our operational efficiency, and to decrease the computing resources needed to accomplish the primary offline computing workflows. These changes have resulted in a new approach to distributed computing in CMS for Run 2 and for the future as the LHC luminosity should continue to increase. We will discuss changes and plans to our data federation, which was one of the key changes towards a more flexible computing model for Run 2. Our software framework and algorithms also underwent significant changes. We will summarize our experience with a new multi-threaded framework as deployed on ou...

  8. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Test Plan

    Energy Technology Data Exchange (ETDEWEB)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product used in conjunction with the supercritical fluid extraction Fourier transform infrared spectrometer (SFE-FTIR) system. This product converts the raw spectral data to useful area numbers. SigmaPlot will be used in conjunction with procedure ZA-565-301, "Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection." This test plan will be performed in conjunction with or prior to HNF-6936, "HA-53 Supercritical Fluid Extraction System Acceptance Test Plan", to perform analyses for water. The test will ensure that the software can be installed properly and will manipulate the analytical data correctly.

  9. New ATLAS Software & Computing Organization

    CERN Multimedia

    Barberis, D

    Following the election by the ATLAS Collaboration Board of Dario Barberis (Genoa University/INFN) as Computing Coordinator and David Quarrie (LBNL) as Software Project Leader, it was considered necessary to modify the organization of the ATLAS Software & Computing ("S&C") project. The new organization is based upon the following principles: separation of the responsibilities for computing management from those of software development, with the appointment of a Computing Coordinator and a Software Project Leader who are both members of the Executive Board; hierarchical structure of responsibilities and reporting lines; coordination at all levels between TDAQ, S&C and Physics working groups; integration of the subdetector software development groups with the central S&C organization. A schematic diagram of the new organization can be seen in Fig.1. Figure 1: new ATLAS Software & Computing organization. Two Management Boards will help the Computing Coordinator and the Software Project...

  10. Computer Software Reviews.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  11. 48 CFR 12.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Computer software. 12.212... software. (a) Commercial computer software or commercial computer software documentation shall be acquired... required to— (1) Furnish technical information related to commercial computer software or commercial...

  12. Software Engineering Improvement Activities/Plan

    Science.gov (United States)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  13. 48 CFR 227.7203-10 - Contractor identification and marking of computer software or computer software documentation to...

    Science.gov (United States)

    2010-10-01

    ... operation of the software to display a restrictive rights legend or other license notice; and (2) Requires a... and marking of computer software or computer software documentation to be furnished with restrictive... Rights in Computer Software and Computer Software Documentation 227.7203-10 Contractor identification and...

  14. AWARE-P: a collaborative, system-based IAM planning software

    OpenAIRE

    Coelho, S. T.; Vitorino, D.

    2011-01-01

    The AWARE-P project aims to promote the application of integrated and risk-based approaches to the rehabilitation of urban water supply and wastewater drainage systems. Central to the project is the development of a software platform based on a set of computational components, which assist in the analyses and decision support involved in the planning process for sustainable infrastructural asset management. The AWARE-P software system brings together onto a common platform the inf...

  15. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ...) Restricted rights in computer software, limited rights in technical data, or government purpose license... necessary to perfect a license or licenses in the deliverable software or documentation of the appropriate... the license rights obtained. (e) Identification and delivery of computer software and computer...

  16. Project W-211, initial tank retrieval systems, retrieval control system software configuration management plan

    International Nuclear Information System (INIS)

    RIECK, C.A.

    1999-01-01

    This Software Configuration Management Plan (SCMP) provides the instructions for change control of the W-211 Project, Retrieval Control System (RCS) software after initial approval/release but prior to the transfer of custody to the waste tank operations contractor. This plan applies to the W-211 system software developed by the project, consisting of the computer human-machine interface (HMI) and programmable logic controller (PLC) software source and executable code, for production use by the waste tank operations contractor. The plan encompasses that portion of the W-211 RCS software represented on project-specific AUTOCAD drawings that are released as part of the C-1 definitive design package (these drawings are identified on the drawing list associated with each C-1 package), and the associated software code. Implementation of the plan is required for formal acceptance testing and production release. The software configuration management plan does not apply to reports and data generated by the software except where specifically identified. Control of information produced by the software once it has been transferred for operation is the responsibility of the receiving organization

  17. Saltwell Leak Detector Station Programmable Logic Controller (PLC) Software Configuration Management Plan (SCMP)

    International Nuclear Information System (INIS)

    WHITE, K.A.

    2000-01-01

    This document provides the procedures and guidelines necessary for computer software configuration management activities during the operation and maintenance phases of the Saltwell Leak Detector Stations as required by HNF-PRO-309/Rev.1, Computer Software Quality Assurance, Section 2.4, Software Configuration Management. The software configuration management plan (SCMP) integrates technical and administrative controls to establish and maintain technical consistency among requirements, physical configuration, and documentation for the Saltwell Leak Detector Station Programmable Logic Controller (PLC) software during the Hanford application, operations and maintenance. This SCMP establishes the Saltwell Leak Detector Station PLC Software Baseline, status changes to that baseline, and ensures that the software meets design and operational requirements and is tested in accordance with its design basis

  18. Software Configuration Management Plan for the K West Basin Integrated Water Treatment System (IWTS) - Project A.9

    International Nuclear Information System (INIS)

    GREEN, J.W.

    2000-01-01

    This document provides a configuration control plan for the software associated with the operation and control of the Integrated Water Treatment System (IWTS). It establishes requirements for ensuring configuration item identification, configuration control, configuration status accounting, defect reporting and resolution of computer software. It is written to comply with HNF-SD-SNF-CM-001, Spent Nuclear Fuel Configuration Management Plan (Forehand 1998) and HNF-PRO-309 Computer Software Quality Assurance Requirements, and applicable sections of administrative procedure CM-6-037-00, SNF Project Process Automation Software and Equipment

  19. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  20. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  1. SAPHIRE 8 Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The developmental activity of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability of the 90s and the onset of a risk-informed regulation era.

  2. 48 CFR 227.7203-14 - Conformity, acceptance, and warranty of computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Conformity, acceptance... Software Documentation 227.7203-14 Conformity, acceptance, and warranty of computer software and computer...) Conformity and acceptance. Solicitations and contracts requiring the delivery of computer software shall...

  3. Computer games and software engineering

    CERN Document Server

    Cooper, Kendra M L

    2015-01-01

    Computer games represent a significant software application domain for innovative research in software engineering techniques and technologies. Game developers, whether focusing on entertainment-market opportunities or game-based applications in non-entertainment domains, thus share a common interest with software engineers and developers on how to best engineer game software. Featuring contributions from leading experts in software engineering, the book provides a comprehensive introduction to computer game software development that includes its history as well as emerging research on the inte

  4. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up–to–date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementation of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non–linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK require that each new model’s likelihood must be programmed specifically for that model. They wishfully think that future software might allow the user to combine

  5. SWiFT Software Quality Assurance Plan.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Jonathan Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-01-01

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP).

  6. 48 CFR 212.212 - Computer software.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Computer software. 212.212... Acquisition of Commercial Items 212.212 Computer software. (1) Departments and agencies shall identify and... technology development), opportunities for the use of commercial computer software and other non...

  7. Modular preoperative planning software for computer-aided oral implantology and the application of a novel stereolithographic template: a pilot study.

    Science.gov (United States)

    Chen, Xiaojun; Yuan, Jianbing; Wang, Chengtao; Huang, Yuanliang; Kang, Lu

    2010-09-01

    In the field of oral implantology, there is a trend toward computer-aided implant surgery, especially the application of computerized tomography (CT)-derived surgical templates. However, because of a relatively unsatisfactory match between the templates and receptor sites, conventional surgical templates may not be accurate enough for severely resorbed edentulous cases during the procedure of transferring the preoperative plan to the actual surgery. The purpose of this study is to introduce a novel bone-tooth-combined-supported surgical guide, which is designed by utilizing special modular software and fabricated via the stereolithography technique using both laser scanning and CT imaging, thus improving fit accuracy and reliability. A modular preoperative planning software was developed for computer-aided oral implantology. With the introduction of dynamic link libraries and some well-known free, open-source software libraries such as the Visualization Toolkit (Kitware, Inc., New York, USA) and the Insight Toolkit (Kitware, Inc.), a plug-in, evolvable software architecture was established, allowing for expandability, accessibility, and maintainability in our system. To provide a link between the preoperative plan and the actual surgery, a novel bone-tooth-combined-supported surgical template was fabricated, utilizing laser scanning, image registration, and rapid prototyping. Clinical studies were conducted on four partially edentulous cases to make a comparison with conventional bone-supported templates. The fixation was more stable than with tooth-supported templates because laser scanning technology obtained detailed dentition information, which brought about a unique topography between the match surface of the templates and the adjacent teeth. The average distance deviations at the coronal and apical points of the implant were 0.66 mm (range: 0.3-1.2) and 0.86 mm (range: 0.4-1.2), and the average angle deviation was 1.84 degrees (range: 0.6-2.8 degrees). This pilot
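    The record above describes a plug-in planning architecture built on the Visualization Toolkit and Insight Toolkit, both of which are scriptable from Python. As a minimal illustrative sketch only (not the authors' system), the following loads a hypothetical segmented jaw surface exported to STL and renders it, the kind of low-level building block such a plug-in would wrap; the file name is an assumption.

        # Minimal VTK rendering sketch (illustrative only, not the authors' planning system).
        # Assumes a segmented surface model has been exported to "jaw_model.stl" (hypothetical).
        import vtk

        reader = vtk.vtkSTLReader()
        reader.SetFileName("jaw_model.stl")  # hypothetical file name

        mapper = vtk.vtkPolyDataMapper()
        mapper.SetInputConnection(reader.GetOutputPort())

        actor = vtk.vtkActor()
        actor.SetMapper(mapper)

        renderer = vtk.vtkRenderer()
        renderer.AddActor(actor)

        window = vtk.vtkRenderWindow()
        window.AddRenderer(renderer)

        interactor = vtk.vtkRenderWindowInteractor()
        interactor.SetRenderWindow(window)

        window.Render()
        interactor.Start()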

  8. Idea Notebook: Wilderness Food Planning in the Computer Age.

    Science.gov (United States)

    Drury, Jack K.

    1986-01-01

    Explains the use of a computer as a planning and teaching tool in wilderness trip food planning. Details use of master food list and spreadsheet software such as VisiCalc to provide shopping lists for food purchasing, cost analysis, and diet analysis. (NEC)

  9. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    Science.gov (United States)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  10. Computing Division two-year operational plan, FY 1981-1982

    International Nuclear Information System (INIS)

    Euald, R.H.; Worlton, W.J.; McCormick, M.

    1981-02-01

    This report is a comprehensive planning guide for the Computing Division of the Los Alamos National Laboratory for fiscal years 1981 and 1982. Subjects discussed include critical issues, programmatic requirements, hardware plans, software projects, direct user services, research projects, and projections of future developments

  11. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  12. Computer guided pre-operative planning and dental implant placement

    Directory of Open Access Journals (Sweden)

    Dušan Grošelj

    2007-05-01

    Full Text Available Background: Implants in dentistry are, besides fixed, removable and maxillofacial prosthodontics, one of the reliable options for functional and aesthetic rehabilitation of edentulism. Surgical and prosthodontic implant complications are often the unintended consequence of incorrect diagnosis, planning, or placement. In this article we present a technique using a highly advanced software program along with a rapid prototyping technology named stereolithography. Planning software for implant placement basically requires a high-quality computed tomographic scan of one or both jaws for accurate preoperative diagnostics and a 3D preoperative plan. An individual drill guide is then designed and generated based on both the CT images and the preoperative planning. The patient-specific drill guide transfers the virtual planning to the patient’s mouth at the time of surgery. Conclusions: The advantages of computer-guided implantology are better-prepared surgery with visualisation of critical anatomic structures, assessment of available bone and data about bone quality, increased confidence for the surgeon, decreased operative time, less frequent use of bone grafts, higher quality of collaboration between specialists and the prosthetic lab, and better communication with patients. Radiographic examination of the operation field for computer-guided planning of implant placement is, despite its high cost, justified as the most important source of information on the areas to be implanted.

  13. 49 CFR 236.18 - Software management control plan.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Software management control plan. 236.18 Section... Instructions: All Systems General § 236.18 Software management control plan. (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...

  14. The ANS mathematics and computation software standards

    Energy Technology Data Exchange (ETDEWEB)

    Smetana, A. O. [Savannah River National Laboratory, Washington Savannah River Company, Aiken, SC 29808 (United States)

    2006-07-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  15. The ANS mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A. O.

    2006-01-01

    The Mathematics and Computations Div. of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains three ANSI/ANS software standards. These standards are: Portability of Scientific and Engineering Software, ANS-10.2; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Accommodating User Needs in Scientific and Engineering Computer Software Development, ANS-10.5. A fourth Standard, Documentation of Computer Software, ANS-10.3, is available as a historical Standard. (authors)

  16. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students’ music learning, the authors proposed a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and the state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures. Against this background, the introduction of computer music software into music learning is a new trial that can not only cultivate students’ initiative in music learning, but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  17. Computer software configuration management

    International Nuclear Information System (INIS)

    Pelletier, G.

    1987-08-01

    This report reviews the basic elements of software configuration management (SCM) as defined by military and industry standards. Several software configuration management standards are evaluated given the requirements of the nuclear industry. A survey is included of available automated tools for supporting SCM activities. Some information is given on the experience of establishing and using SCM plans of other organizations that manage critical software. The report concludes with recommendations of practices that would be most appropriate for the nuclear power industry in Canada

  18. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Science.gov (United States)

    2013-08-02

    .... ML12354A524. 3. Revision 1 of RG 1.170, ``Test Documentation for Digital Computer Software used in Safety... is in ADAMS at Accession No. ML12354A531. 4. Revision 1 of RG 1.171, ``Software Unit Testing for... Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION...

  19. Software quality assurance plan for viscometer

    International Nuclear Information System (INIS)

    Gimera, M.

    1994-01-01

    The in situ viscometer is a portable instrument designed to raise and lower a sphere (rheometer ball) through layers of tank waste material while recording ball position, velocity, and cable tension. In the field, the viscometer attaches to a decontamination spool piece which in turn is designed to attach to any 4-inch, 150-pound flange (typical of many available tank risers). The motion of the ball and collection of data is controlled by instrumentation and control equipment housed in a separate remote control console. This document covers the product, Viscometer Data Acquisition Software. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain rheology data from Tank SY-101

  20. Computational Dosimetry and Treatment Planning Considerations for Neutron Capture Therapy

    International Nuclear Information System (INIS)

    Nigg, David Waler

    2003-01-01

    Specialized treatment planning software systems are generally required for neutron capture therapy (NCT) research and clinical applications. The standard simplifying approximations that work well for treatment planning computations in the case of many other modalities are usually not appropriate for application to neutron transport. One generally must obtain an explicit three-dimensional numerical solution of the governing transport equation, with energy-dependent neutron scattering completely taken into account. Treatment planning systems that have been successfully introduced for NCT applications over the past 15 years rely on the Monte Carlo stochastic simulation method for the necessary computations, primarily because of the geometric complexity of human anatomy. However, historically, there has also been interest in the application of deterministic methods, and there have been some practical developments in this area. Most recently, interest has turned toward the creation of treatment planning software that is not limited to any specific therapy modality, with NCT as only one of several applications. A key issue with NCT treatment planning has to do with boron quantification, and whether improved information concerning the spatial biodistribution of boron can be effectively used to improve the treatment planning process. Validation and benchmarking of computations for NCT are also of current developmental interest. Various institutions have their own procedures, but standard validation models are not yet in wide use
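    The record above notes that NCT planning systems rely on Monte Carlo stochastic simulation of neutron transport. Purely as a toy sketch of the underlying sampling idea (far simpler than clinical NCT dosimetry), the following estimates uncollided transmission through a uniform slab by sampling exponential free paths and checks the result against the analytic value; the cross section, thickness, and history count are arbitrary assumptions.

        # Toy Monte Carlo estimate of uncollided transmission through a slab.
        # Illustrative only: mono-energetic, absorption-only, no scattering or geometry,
        # i.e. far simpler than a clinical NCT treatment planning calculation.
        import math
        import random

        sigma_t = 0.5      # assumed total macroscopic cross section (1/cm)
        thickness = 4.0    # assumed slab thickness (cm)
        histories = 100_000

        transmitted = 0
        for _ in range(histories):
            # Distance to first collision follows an exponential distribution.
            path = random.expovariate(sigma_t)
            if path > thickness:
                transmitted += 1

        mc_estimate = transmitted / histories
        analytic = math.exp(-sigma_t * thickness)
        print(f"Monte Carlo: {mc_estimate:.4f}  analytic exp(-sigma*t): {analytic:.4f}")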

  1. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.
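    To make the rule-based reasoning mentioned above concrete, here is a minimal illustrative sketch (not the authors' knowledge-based system) in which a candidate plan is critiqued by a small set of declarative rules; the dose values and tolerance limits are invented for the example.

        # Tiny illustration of rule-based plan critique (not the authors' AI system).
        # Dose values and tolerance limits below are invented for the example.
        from dataclasses import dataclass

        @dataclass
        class Plan:
            target_dose: float          # Gy delivered to the target volume
            cord_dose: float            # Gy delivered to the spinal cord
            prescription: float = 60.0  # assumed prescribed target dose (Gy)
            cord_limit: float = 45.0    # assumed spinal cord tolerance (Gy)

        RULES = [
            (lambda p: p.target_dose < p.prescription,
             "target underdosed relative to prescription"),
            (lambda p: p.cord_dose > p.cord_limit,
             "spinal cord exceeds tolerance dose"),
        ]

        def critique(plan: Plan) -> list[str]:
            """Return the messages of every rule whose condition fires."""
            return [message for condition, message in RULES if condition(plan)]

        print(critique(Plan(target_dose=58.0, cord_dose=47.0)))
        # -> ['target underdosed relative to prescription', 'spinal cord exceeds tolerance dose']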

  2. Software configuration management plan, 241-AY and 241-AZ tank farm MICON automation system

    International Nuclear Information System (INIS)

    Hill, L.F.

    1997-01-01

    This document establishes a Computer Software Configuration Management Plan (CSCM) for controlling software for the MICON Distributed Control System (DCS) located at the 241-AY and 241-AZ Aging Waste Tank Farm facilities in the 200 East Area. The MICON DCS software controls and monitors the instrumentation and equipment associated with plant systems and processes. A CSCM identifies and defines the configuration items in a system (section 3.1), controls the release and change of these items throughout the system life cycle (section 3.2), records and reports the status of configuration items and change requests (section 3.3), and verifies the completeness and correctness of the items (section 3.4). All software development before initial release, or before software is baselined, is considered developmental. This plan does not apply to developmental software. This plan applies to software that has been baselined and released. The MICON software will monitor and control the related instrumentation and equipment of the 241-AY and 241-AZ Tank Farm ventilation systems. Eventually, this software may also assume the monitoring and control of the tank sludge washing equipment and other systems as they are brought on line. This plan applies to the System Cognizant Manager and MICON Cognizant Engineer (who is also referred to herein as the system administrator) responsible for the software/hardware and administration of the MICON system. This document also applies to any other organizations within Tank Farms which are currently active on the system including system cognizant engineers, nuclear operators, technicians, and control room supervisors

  3. Protecting software agents from malicious hosts using quantum computing

    Science.gov (United States)

    Reisner, John; Donkor, Eric

    2000-07-01

    We evaluate how quantum computing can be applied to security problems for software agents. Agent-based computing, which merges technological advances in artificial intelligence and mobile computing, is a rapidly growing domain, especially in applications such as electronic commerce, network management, information retrieval, and mission planning. System security is one of the more eminent research areas in agent-based computing, and the specific problem of protecting a mobile agent from a potentially hostile host is one of the most difficult of these challenges. In this work, we describe our agent model, and discuss the capabilities and limitations of classical solutions to the malicious host problem. Quantum computing may be extremely helpful in addressing the limitations of classical solutions to this problem. This paper highlights some of the areas where quantum computing could be applied to agent security.

  4. Computer aided planning of orthopaedic surgeries: the definition of generic planning steps for bone removal procedures.

    Science.gov (United States)

    Putzer, David; Moctezuma, Jose Luis; Nogler, Michael

    2017-11-01

    An increasing number of orthopaedic surgeons are using computer-aided planning tools for bone removal applications. The aim of the study was to consolidate a set of generic functions to be used for 3D computer-assisted planning or simulation. A limited subset of 30 surgical procedures was analyzed, and the resulting functions were verified against 243 surgical procedures from a surgical atlas. Fourteen generic functions to be used in 3D computer-assisted planning and simulations were extracted. Our results showed that the average procedure comprises 14 ± 10 (SD) steps, drawing on ten different generic planning steps and four generic bone removal steps. In conclusion, the study shows that with a limited number of 14 planning functions it is possible to perform 243 surgical procedures out of Campbell's Operative Orthopedics atlas. The results may be used as a basis for versatile generic intraoperative planning software.
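    As an illustrative sketch of the consolidation idea (the step names below are invented, not the fourteen functions identified in the study), a procedure can be modeled as a sequence drawn from a fixed vocabulary of generic planning steps, with step counts summarized as mean ± SD.

        # Sketch of representing procedures as sequences of generic planning steps
        # and summarizing step counts; the step names and procedures are invented,
        # not the 14 functions identified in the study.
        from statistics import mean, stdev

        GENERIC_STEPS = {"define_axis", "place_cut_plane", "resect_fragment",
                         "reposition_fragment", "place_implant", "verify_rom"}

        procedures = {
            "osteotomy_A": ["define_axis", "place_cut_plane", "resect_fragment",
                            "reposition_fragment", "verify_rom"],
            "osteotomy_B": ["define_axis", "place_cut_plane", "resect_fragment",
                            "place_implant"],
        }

        # Every step used must come from the generic vocabulary.
        for name, steps in procedures.items():
            assert set(steps) <= GENERIC_STEPS, f"unknown step in {name}"

        counts = [len(steps) for steps in procedures.values()]
        print(f"steps per procedure: {mean(counts):.1f} +/- {stdev(counts):.1f} (SD)")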

  5. Collection Of Software For Computer Graphics

    Science.gov (United States)

    Hibbard, Eric A.; Makatura, George

    1990-01-01

    Ames Research Graphics System (ARCGRAPH) collection of software libraries and software utilities assisting researchers in generating, manipulating, and visualizing graphical data. Defines metafile format containing device-independent graphical data. File format used with various computer-graphics-manipulation and -animation software packages at Ames, including SURF (COSMIC Program ARC-12381) and GAS (COSMIC Program ARC-12379). Consists of two-stage "pipeline" used to put out graphical primitives. ARCGRAPH libraries developed on VAX computer running VMS.

  6. Three-Dimensional Liver Surgery Simulation: Computer-Assisted Surgical Planning with Three-Dimensional Simulation Software and Three-Dimensional Printing.

    Science.gov (United States)

    Oshiro, Yukio; Ohkohchi, Nobuhiro

    2017-06-01

    To perform accurate hepatectomy without injury, it is necessary to understand the anatomical relationship among the branches of Glisson's sheath, hepatic veins, and tumor. In Japan, three-dimensional (3D) preoperative simulation for liver surgery is becoming increasingly common, and liver 3D modeling and 3D hepatectomy simulation by 3D analysis software for liver surgery have been covered by universal healthcare insurance since 2012. Herein, we review the history of virtual hepatectomy using computer-assisted surgery (CAS) and our research to date, and we discuss the future prospects of CAS. We have used the SYNAPSE VINCENT medical imaging system (Fujifilm Medical, Tokyo, Japan) for 3D visualization and virtual resection of the liver since 2010. We developed a novel fusion imaging technique combining 3D computed tomography (CT) with magnetic resonance imaging (MRI). The fusion image enables us to easily visualize anatomic relationships among the hepatic arteries, portal veins, bile duct, and tumor in the hepatic hilum. In 2013, we developed an original software, called Liversim, which enables real-time deformation of the liver using physical simulation, and a randomized control trial has recently been conducted to evaluate the use of Liversim and SYNAPSE VINCENT for preoperative simulation and planning. Furthermore, we developed a novel hollow 3D-printed liver model whose surface is covered with frames. This model is useful for safe liver resection, has better visibility, and the production cost is reduced to one-third of a previous model. Preoperative simulation and navigation with CAS in liver resection are expected to help planning and conducting a surgery and surgical education. Thus, a novel CAS system will contribute to not only the performance of reliable hepatectomy but also to surgical education.

  7. Computer systems and software engineering

    Science.gov (United States)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  8. Software components for medical image visualization and surgical planning

    Science.gov (United States)

    Starreveld, Yves P.; Gobbi, David G.; Finnis, Kirk; Peters, Terence M.

    2001-05-01

    Purpose: The development of new applications in medical image visualization and surgical planning requires the completion of many common tasks such as image reading and re-sampling, segmentation, volume rendering, and surface display. Intra-operative use requires an interface to a tracking system and image registration, and the application requires basic, easy to understand user interface components. Rapid changes in computer and end-application hardware, as well as in operating systems and network environments, make it desirable to have a hardware- and operating-system-independent collection of reusable software components that can be assembled rapidly to prototype new applications. Methods: Using the OpenGL based Visualization Toolkit as a base, we have developed a set of components that implement the above mentioned tasks. The components are written in both C++ and Python, but all are accessible from Python, a byte compiled scripting language. The components have been used on the Red Hat Linux, Silicon Graphics Iris, Microsoft Windows, and Apple OS X platforms. Rigorous object-oriented software design methods have been applied to ensure hardware independence and a standard application programming interface (API). There are components to acquire, display, and register images from MRI, MRA, CT, Computed Rotational Angiography (CRA), Digital Subtraction Angiography (DSA), 2D and 3D ultrasound, video and physiological recordings. Interfaces to various tracking systems for intra-operative use have also been implemented. Results: The described components have been implemented and tested. To date they have been used to create image manipulation and viewing tools, a deep brain functional atlas, a 3D ultrasound acquisition and display platform, a prototype minimally invasive robotic coronary artery bypass graft planning system, a tracked neuro-endoscope guidance system and a frame-based stereotaxy neurosurgery planning tool. The frame-based stereotaxy module has been
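    The component approach described above hinges on a standard API behind which concrete backends can vary. A minimal Python sketch of that idea, with invented names rather than the authors' actual classes, might look as follows; a real component would wrap VTK/ITK readers instead of the mock shown here.

        # Minimal sketch of a backend-neutral component API (not the authors' library):
        # callers program against ImageSource, and concrete readers are swappable.
        from abc import ABC, abstractmethod

        class ImageSource(ABC):
            """Standard interface every image-reading component implements."""

            @abstractmethod
            def read(self, path: str):
                """Return image voxel data for the given file path."""

        class MockCTReader(ImageSource):
            # Stand-in backend; a real component would wrap VTK/ITK readers.
            def read(self, path: str):
                return {"path": path, "modality": "CT", "voxels": []}

        def load_study(source: ImageSource, path: str):
            # Application code depends only on the abstract interface.
            return source.read(path)

        print(load_study(MockCTReader(), "study_001.img"))  # hypothetical path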

  9. Challenges to Software/Computing for Experimentation at the LHC

    Science.gov (United States)

    Banerjee, Sunanda

    The demands of future high energy physics experiments towards software and computing have led the experiments to plan the related activities as a full-fledged project and to investigate new methodologies and languages to meet the challenges. The paths taken by the four LHC experiments ALICE, ATLAS, CMS and LHCb are coherently put together in an LHC-wide framework based on Grid technology. The current status and understandings have been broadly outlined.

  10. Software quality assurance plan for GCS

    Science.gov (United States)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  11. Light Duty Utility Arm Software Test Plan

    International Nuclear Information System (INIS)

    Kiebel, G.R.

    1995-01-01

    This plan describes how validation testing of the software will be implemented for the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). The purpose of LDUA software validation testing is to demonstrate and document that the LDUA software meets its software requirements specification

  12. Study on Top-Down Estimation Method of Software Project Planning

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jun-guang; L(U) Ting-jie; ZHAO Yu-mei

    2006-01-01

    This paper studies a new software project planning method using actual project data in order to make software project plans more effective. From the perspective of system theory, our new method regards a software project plan as an associative unit for study. During top-down estimation of a software project, the Program Evaluation and Review Technique (PERT) and the analogy method are combined to estimate its size; effort estimates and specific schedules are then obtained according to the distribution of effort across phases. This allows a set of practical and feasible planning methods to be constructed. Actual data indicate that this set of methods can lead to effective software project planning.
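    A worked sketch of the combined estimation step may help: the classic PERT three-point formula E = (O + 4M + P)/6 with standard deviation (P - O)/6 gives the size, which is then converted to effort and split across phases. All numbers below (analogy-based size estimates, productivity, phase percentages) are assumptions for illustration, not values from the paper.

        # PERT three-point size estimate plus a phase-wise effort split;
        # all numbers (size estimates, productivity, phase percentages) are assumptions.
        optimistic, most_likely, pessimistic = 8.0, 12.0, 20.0   # KLOC from analogy

        size = (optimistic + 4 * most_likely + pessimistic) / 6   # expected size
        sigma = (pessimistic - optimistic) / 6                    # standard deviation

        productivity = 0.4            # person-months per KLOC (assumed)
        total_effort = size * productivity

        phase_split = {"requirements": 0.15, "design": 0.25,
                       "coding": 0.35, "testing": 0.25}
        phase_effort = {phase: total_effort * share
                        for phase, share in phase_split.items()}

        print(f"size = {size:.1f} +/- {sigma:.1f} KLOC, effort = {total_effort:.1f} PM")
        print(phase_effort)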

  13. 14 CFR 415.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Computing systems and software. 415.123... Launch Vehicle From a Non-Federal Launch Site § 415.123 Computing systems and software. (a) An applicant's safety review document must describe all computing systems and software that perform a safety...

  14. Music Learning Based on Computer Software

    OpenAIRE

    Baihui Yan; Qiao Zhou

    2017-01-01

    In order to better develop and improve students’ music learning, the authors proposed the method of music learning based on computer software. It is still a new field to use computer music software to assist teaching. Hereby, we conducted an in-depth analysis on the computer-enabled music learning and the music learning status in secondary schools, obtaining the specific analytical data. Survey data shows that students have many cognitive problems in the current music classroom, and yet teach...

  15. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  16. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  17. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    International Nuclear Information System (INIS)

    Fishler, B.

    2011-01-01

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  18. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer based instrumentation and control functions in nuclear facilities. The utilization of computer based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory System uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants" and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations" for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to insure such errors were not propagated again in the future

  19. Surgical planning of total hip arthroplasty: accuracy of computer-assisted EndoMap software in predicting component size

    International Nuclear Information System (INIS)

    Davila, Jesse A.; Kransdorf, Mark J.; Duffy, Gavan P.

    2006-01-01

    The purpose of our study was to assess the accuracy of computer-assisted templating in the surgical planning of patients undergoing total hip arthroplasty utilizing EndoMap software (Siemens AG, Medical Solutions, Erlangen, Germany). EndoMap software is an electronic program that uses DICOM images to analyze standard anteroposterior radiographs for determination of optimal prosthesis component size. We retrospectively reviewed the preoperative radiographs of 36 patients undergoing uncomplicated primary total hip arthroplasty, utilizing EndoMap software, Version VA20. DICOM anteroposterior radiographs were analyzed using standard manufacturer-supplied electronic templates to determine acetabular and femoral component sizes. No additional clinical information was reviewed. Acetabular and femoral component sizes were assessed by an orthopedic surgeon and two radiologists. Mean estimated component size was compared with component size as documented in operative reports. The mean estimated acetabular component size was 53 mm (range 48-60 mm), 1 mm larger than the mean implanted size of 52 mm (range 48-62 mm). Thirty-one of 36 acetabular component sizes (86%) were accurate within one size. The mean calculated femoral component size was 4 (range 2-7), 1 size smaller than the actual mean component size of 5 (range 2-9). Twenty-six of 36 femoral component sizes (72%) were accurate within one size, and accurate within two sizes in all but four cases (94%). EndoMap software predicted femoral component size well, with 72% within one component size of that used, and 94% within two sizes. Acetabular component size was predicted slightly better, with 86% within one component size and 94% within two component sizes. (orig.)
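    The accuracy figures above are agreement rates within a given number of catalogue sizes. A small illustrative sketch of that metric (with invented size indices, not the study data) follows.

        # Sketch of the "accurate within N sizes" metric reported in the study;
        # the predicted/implanted size indices below are invented examples.
        predicted = [4, 5, 3, 6, 4, 5]
        implanted = [4, 6, 3, 4, 5, 5]

        def within(pred, actual, n_sizes):
            """Fraction of cases where the template is within n_sizes of the implant."""
            hits = sum(abs(p - a) <= n_sizes for p, a in zip(pred, actual))
            return hits / len(pred)

        print(f"within one size:  {within(predicted, implanted, 1):.0%}")
        print(f"within two sizes: {within(predicted, implanted, 2):.0%}")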

  20. Software For Computing Selected Functions

    Science.gov (United States)

    Grant, David C.

    1992-01-01

    Technical memorandum presents collection of software packages in Ada implementing mathematical functions used in science and engineering. Provides programmer with function support in Pascal and FORTRAN, plus support for extended-precision arithmetic and complex arithmetic. Valuable for testing new computers, writing computer code, or developing new computer integrated circuits.

  1. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  2. 77 FR 50727 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Science.gov (United States)

    2012-08-22

    ... enhanced consensus practices for planning software configuration management (SCM) as described in the... testing of structures, systems, and components important to safety throughout the life of the unit. This...

  3. 48 CFR 27.405-3 - Commercial computer software.

    Science.gov (United States)

    2010-10-01

    ... software. 27.405-3 Section 27.405-3 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... Commercial computer software. (a) When contracting other than from GSA's Multiple Award Schedule contracts for the acquisition of commercial computer software, no specific contract clause prescribed in this...

  4. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/ Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.
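    As a rough illustration of the generator's job (reading CSCI subdirectories from a CD image and emitting one transfer script per CSCI), the following Python sketch mimics the idea; the real CFITSG is a K-shell script, and every path, file layout, and generated script line here is a hypothetical assumption.

```python
#!/usr/bin/env python3
"""Illustrative sketch of a per-CSCI transfer-script generator.
Not the actual CFITSG tool; all paths and script contents are hypothetical."""

import os
import stat

CD_ROOT = "/mnt/fsw_cd"        # hypothetical mount point of the flight software CD
PCS_SCRATCH = "/pcs/scratch"   # hypothetical PCS scratch directory
OUT_DIR = "generated_scripts"

os.makedirs(OUT_DIR, exist_ok=True)

if os.path.isdir(CD_ROOT):
    for csci in sorted(os.listdir(CD_ROOT)):
        src = os.path.join(CD_ROOT, csci)
        if not os.path.isdir(src):
            continue
        script_path = os.path.join(OUT_DIR, f"transfer_{csci}.sh")
        with open(script_path, "w") as f:
            f.write("#!/bin/sh\n")
            f.write(f"# Transfer CSCI '{csci}' from the CD to the PCS scratch area\n")
            f.write(f"cp -R {src}/. {PCS_SCRATCH}/{csci}/\n")
            f.write(f'echo "$(date): transferred {csci}" >> transfer.log\n')
        # make the generated script executable
        os.chmod(script_path, os.stat(script_path).st_mode | stat.S_IEXEC)
        print("wrote", script_path)
```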

  5. Active resources concept of computation for enterprise software

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Traditional computational models for enterprise software are still to a great extent centralized. However, the rapid growth of modern computation techniques and frameworks means that contemporary software is becoming more and more distributed. Toward the development of a new, complete, and coherent solution for distributed enterprise software construction, a synthesis of three well-grounded concepts is proposed: the Domain-Driven Design technique of software engineering, the REST architectural style, and the actor model of computation. The result is a new resource-based framework which, after its first cases of use, appears useful and worthy of further research.

  6. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computer systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong because a system is more than software. It comprises people, organizations, processes, hardware, and software. All of these components must be considered in an integrated manner.

  7. Guidance and Control Software Project Data - Volume 1: Planning Documents

    Science.gov (United States)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  8. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  9. Contracting for Computer Software in Standardized Computer Languages

    Science.gov (United States)

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the contracting process for standard language software. Appropriate contract language is suggested for requiring strict compliance with a standard, and an overview of remedies is given for failure to comply.

  10. 48 CFR 212.7003 - Technical data and computer software.

    Science.gov (United States)

    2010-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  11. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  12. Software engineering frameworks for the cloud computing paradigm

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents the latest research on Software Engineering Frameworks for the Cloud Computing Paradigm, drawn from an international selection of researchers and practitioners. The book offers both a discussion of relevant software engineering approaches and practical guidance on enterprise-wide software deployment in the cloud environment, together with real-world case studies. Features: presents the state of the art in software engineering approaches for developing cloud-suitable applications; discusses the impact of the cloud computing paradigm on software engineering; offers guidance an

  13. More than 2 years' experience with computer-aided irradiation planning in clinical routine

    International Nuclear Information System (INIS)

    Heller, H.; Rathje, J.

    1976-01-01

    This is a report on an irradiation planning system which has been used for about 2 years in the department of radiotherapy in the general hospital in Altona. The hardware and software, as well as the mathematical model for the description of the dose distribution, are described. The compromise between the required accuracy of the irradiation plan and the investment in computer-technical effort and computer time is discussed. (orig./LN)

  14. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10, Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements.

  15. Planning for land use and conservation: Assessing GIS-based conservation software for land use planning

    Science.gov (United States)

    Rob Baldwin; Ryan Scherzinger; Don Lipscomb; Miranda Mockrin; Susan Stein

    2014-01-01

    Recent advances in planning and ecological software make it possible to conduct highly technical analyses to prioritize conservation investments and inform local land use planning. We review these tools, termed conservation planning tools, and assess the knowledge of a key set of potential users: the land use planning community. We grouped several conservation software...

  16. Software Systems for High-performance Quantum Computing

    Energy Technology Data Exchange (ETDEWEB)

    Humble, Travis S [ORNL; Britt, Keith A [ORNL

    2016-01-01

    Quantum computing promises new opportunities for solving hard computational problems, but harnessing this novelty requires breakthrough concepts in the design, operation, and application of computing systems. We define some of the challenges facing the development of quantum computing systems as well as software-based approaches that can be used to overcome these challenges. Following a brief overview of the state of the art, we present models for quantum programming and execution, the development of architectures for hybrid high-performance computing systems, and the realization of software stacks for quantum networking. This leads to a discussion of the role that conventional computing plays in the quantum paradigm and how some of the current challenges for exascale computing overlap with those facing quantum computing.

  17. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  18. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  19. Software quality assurance plan for void fraction instrument

    International Nuclear Information System (INIS)

    Gimera, M.

    1994-01-01

    Waste Tank SY-101 has been the focus of extensive characterization work over the past few years. The waste continually generates gases, most notably hydrogen, which are periodically released from the waste. Gas can be trapped in tank waste in three forms: as void gas (bubbles), dissolved gas, or absorbed gas. Void fraction is the volume percentage of a given sample that is comprised of void gas. The void fraction instrument (VFI) acquires the data necessary to calculate void fraction. This document covers the product, Void Fraction Data Acquisition Software. The void fraction software being developed will have the ability to control the void fraction instrument hardware and acquire data necessary to calculate the void fraction in samples. This document provides the software quality assurance plan, verification and validation plan, and configuration management plan for developing the software for the instrumentation that will be used to obtain void fraction data from Tank SY-101
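    As a reminder of the quantity the instrument software ultimately reports, void fraction is simply the trapped gas volume expressed as a percentage of the total sample volume. A trivial sketch with invented numbers:

```python
def void_fraction_percent(gas_volume_ml, total_sample_volume_ml):
    """Void fraction: the percentage of a sample's volume occupied by trapped gas bubbles."""
    return 100.0 * gas_volume_ml / total_sample_volume_ml

# Illustrative numbers only, not VFI measurements.
print(void_fraction_percent(12.5, 250.0))  # -> 5.0 (% void)
```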

  20. Radioisotope thermoelectric generator transportation system subsystem 143 software development plan

    International Nuclear Information System (INIS)

    King, D.A.

    1994-01-01

    This plan describes the activities to be performed and the controls to be applied to the process of specifying, developing, and qualifying the data acquisition software for the Radioisotope Thermoelectric Generator (RTG) Transportation System Subsystem 143 Instrumentation and Data Acquisition System (IDAS). This plan will serve as a software quality assurance plan, a verification and validation (V and V) plan, and a configuration management plan

  1. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-the earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue of a comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis, support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherent human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  2. 48 CFR 52.227-19 - Commercial Computer Software License.

    Science.gov (United States)

    2010-10-01

    ... Software License. 52.227-19 Section 52.227-19 Federal Acquisition Regulations System FEDERAL ACQUISITION... Clauses 52.227-19 Commercial Computer Software License. As prescribed in 27.409(g), insert the following clause: Commercial Computer Software License (DEC 2007) (a) Notwithstanding any contrary provisions...

  3. CMS Software and Computing Ready for Run 2

    CERN Document Server

    Bloom, Kenneth

    2015-01-01

    In Run 1 of the Large Hadron Collider, software and computing was a strategic strength of the Compact Muon Solenoid experiment. The timely processing of data and simulation samples and the excellent performance of the reconstruction algorithms played an important role in the preparation of the full suite of searches used for the observation of the Higgs boson in 2012. In Run 2, the LHC will run at higher intensities and CMS will record data at a higher trigger rate. These new running conditions will provide new challenges for the software and computing systems. Over the two years of Long Shutdown 1, CMS has built upon the successes of Run 1 to improve the software and computing to meet these challenges. In this presentation we will describe the new features in software and computing that will once again put CMS in a position of physics leadership.

  4. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  5. KNGR core protection calculator software verification and validation plan

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Cheon, Se Woo

    2001-05-01

    This document describes the Software Verification and Validation Plan (SVVP) guidance to be used in reviewing the Software Program Manual (SPM) in Korean Next Generation Reactor (KNGR) projects. This document is intended for a verifier or reviewer who is involved in performing software verification and validation task activities in KNGR projects. This document includes the basic philosophy, the performance of the V and V effort, software testing techniques, and the criteria for review and audit of the safety software V and V activity. Based on Standard Review Plan (SRP) Chapter 7, Branch Technical Position (BTP)-14, the major review topics on safety software address three kinds of characteristics when reviewing the SVVP: management characteristics, implementation characteristics, and resources characteristics. Based on the major topics of this document, we have produced a list of evaluation items, provided as a checklist in Appendix A.

  6. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software

    Science.gov (United States)

    Graydon, Patrick J.; Holloway, C. Michael

    2015-01-01

    We need well-founded means of determining whether software is fit for use in safety-critical applications. While software in industries such as aviation has an excellent safety record, the fact that software flaws have contributed to deaths illustrates the need for justifiably high confidence in software. It is often argued that software is fit for safety-critical use because it conforms to a standard for software in safety-critical systems. But little is known about whether such standards `work.' Reliance upon a standard without knowing whether it works is an experiment; without collecting data to assess the standard, this experiment is unplanned. This paper reports on a workshop intended to explore how standards could practicably be assessed. Planning the Unplanned Experiment: Assessing the Efficacy of Standards for Safety Critical Software (AESSCS) was held on 13 May 2014 in conjunction with the European Dependable Computing Conference (EDCC). We summarize and elaborate on the workshop's discussion of the topic, including both the presented positions and the dialogue that ensued.

  7. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    Science.gov (United States)

    Moretti, Loris; Sartori, Luca

    2016-09-01

    In the field of Computer-Aided Drug Discovery and Development (CADDD) the proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution to integrate standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically and without the burden of file formatting for different software, managing the actual computation, keeping track of the activities and graphical rendering of the structural outcomes. To showcase the potential of this approach, the performances of five different docking programs on an HIV-1 protease test set are presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Computer organization and design the hardware/software interface

    CERN Document Server

    Hennessy, John L

    1994-01-01

    Computer Organization and Design: The Hardware/Software Interface presents the interaction between hardware and software at a variety of levels, which offers a framework for understanding the fundamentals of computing. This book focuses on the concepts that are the basis for computers.Organized into nine chapters, this book begins with an overview of the computer revolution. This text then explains the concepts and algorithms used in modern computer arithmetic. Other chapters consider the abstractions and concepts in memory hierarchies by starting with the simplest possible cache. This book di

  9. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is not only based on the technical factors such as capabilities of the programming languages but also the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  10. Open-Source Software in Computational Research: A Case Study

    Directory of Open Access Journals (Sweden)

    Sreekanth Pannala

    2008-04-01

    A case study of open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; the facilitation of peer review of the results of computational research.

  11. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been given to human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  12. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
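    One widely used practice of the kind the authors have in mind, automated unit testing against known solutions, needs nothing beyond the standard library. The following generic sketch (not code from the paper) checks a small numerical routine against an analytic result:

```python
import math
import unittest

def trapezoid(f, a, b, n=1000):
    """Composite trapezoidal rule; a typical small routine found in a research code."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

class TestTrapezoid(unittest.TestCase):
    def test_against_analytic_solution(self):
        # The integral of sin(x) over [0, pi] is exactly 2.
        self.assertAlmostEqual(trapezoid(math.sin, 0.0, math.pi), 2.0, places=5)

if __name__ == "__main__":
    unittest.main()
```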

  13. Teaching cloud computing: a software engineering perspective

    OpenAIRE

    Sommerville, Ian

    2012-01-01

    This short paper discusses the issues of teaching cloud computing from a software engineering rather than a business perspective. It discusses what topics might be covered in a senior course on cloud software engineering.

  14. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  15. Data systems and computer science: Software Engineering Program

    Science.gov (United States)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  16. Archiving Software Systems: Approaches to Preserve Computational Capabilities

    Science.gov (United States)

    King, T. A.

    2014-12-01

    A great deal of effort is made to preserve scientific data, not only because data is knowledge, but also because it is often costly to acquire and is sometimes collected under unique circumstances. Another part of the science enterprise is the development of software to process and analyze the data. Developed software is also a large investment and worthy of preservation. However, the long-term preservation of software presents some challenges. Software often requires a specific technology stack to operate. This can include software, operating system, and hardware dependencies. One past approach to preserving computational capabilities is to maintain ancient hardware long past its typical viability. On an archive horizon of 100 years, this is not feasible. Another approach to preserving computational capabilities is to archive source code. While this can preserve details of the implementation and algorithms, it may not be possible to reproduce the technology stack needed to compile and run the resulting applications. This forward-looking dilemma has a solution. Technology used to create clouds and process big data can also be used to archive and preserve computational capabilities. We explore how basic hardware, virtual machines, containers, and appropriate metadata can be used to preserve computational capabilities and to archive functional software systems. In conjunction with data archives, this provides scientists with both the data and the capability to reproduce the processing and analysis used to generate past scientific results.

  17. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of each workshop presentation and its key figures, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  18. Software And Systems Engineering Risk Management

    Science.gov (United States)

    2010-04-01

    Presentation outline: Software and Systems Engineering Risk Management, by John Walz, VP Technical and Conferences Activities, IEEE Computer Society; Vice-Chair Planning, Software & Systems Engineering Standards Committee, IEEE Computer Society; US TAG to ISO TMB Risk Management Working Group, Systems and Software. The standards timeline covered includes: 2004, COSO Enterprise Risk Management Framework; 2006, ISO/IEC 16085 Risk Management Process; 2008, ISO/IEC 12207 Software Lifecycle Processes; 2009, ISO/IEC ...

  19. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  20. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  1. Software for CATV Design and Frequency Plan Optimization

    OpenAIRE

    Hala, O.

    2007-01-01

    The paper deals with the structure of a software tool used for the design and sub-optimization of frequency plans in CATV networks, describing the tool and its design method. The software's performance is described, and a simple design example of the energy balance of a simplified CATV network is given. The software was created in the Delphi programming environment, and local optimization was performed in Matlab.

  2. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    Science.gov (United States)

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined
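    Two of the quantities named above have compact standard forms: the generalized equivalent uniform dose, gEUD = (mean over voxels of d_i^a)^(1/a), where a is a tissue-specific parameter (commonly negative for targets), and the per-voxel equivalent dose in 2 Gy fractions from the linear-quadratic model. The sketch below computes both for an illustrative dose array; it is not the SABER implementation, and the parameter values (a, alpha/beta, fractionation) are examples only.

```python
import numpy as np

def gEUD(dose_gy, a):
    """Generalized equivalent uniform dose of a voxel dose array (standard power-law form)."""
    d = np.asarray(dose_gy, dtype=float)
    return (np.mean(d ** a)) ** (1.0 / a)

def eqd2(dose_gy, n_fractions, alpha_beta):
    """Per-voxel equivalent dose in 2 Gy fractions (linear-quadratic model)."""
    d = np.asarray(dose_gy, dtype=float)
    dose_per_fraction = d / n_fractions
    return d * (dose_per_fraction + alpha_beta) / (2.0 + alpha_beta)

# Illustrative voxel doses (Gy) for a target volume; all parameters are examples only.
target_dose = np.random.default_rng(0).normal(70.0, 2.0, size=10_000)
print("gEUD  :", round(gEUD(target_dose, a=-10), 2))                      # physical-dose gEUD
print("gEUD2 :", round(gEUD(eqd2(target_dose, 35, 10.0), a=-10), 2))      # gEUD of the EQD2 distribution
```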

  3. NEAMS Software Licensing, Release, and Distribution: Implications for FY2013 Work Package Planning

    International Nuclear Information System (INIS)

    Bernholdt, David E.

    2012-01-01

    The vision of the NEAMS program is to bring truly predictive modeling and simulation (M and S) capabilities to the nuclear engineering community in order to enable a new approach to the analysis of nuclear systems. NEAMS anticipates issuing in FY 2018 a full release of its computational 'Fermi Toolkit' aimed at advanced reactors and fuel cycles. The NEAMS toolkit involves extensive software development activities, some of which have already been underway for several years; however, the Advanced Modeling and Simulation Office (AMSO), which sponsors the NEAMS program, has not yet issued any official guidance regarding software licensing, release, and distribution policies. This motivated an FY12 task in the Capability Transfer work package to develop and recommend an appropriate set of policies. The current preliminary report is intended to provide awareness of issues with implications for work package planning for FY13. We anticipate a small amount of effort associated with putting into place formal licenses and contributor agreements for NEAMS software that does not already have them. We do not anticipate any additional effort or costs associated with software release procedures or schedules beyond those dictated by the quality expectations for the software. The largest potential costs we anticipate would be associated with the setup and maintenance of shared code repositories for development and early access to NEAMS software products. We also anticipate an opportunity, with modest associated costs, to work with the Radiation Safety Information Computational Center (RSICC) to clarify export control assessment policies for software under development.

  4. Teaching land-use planning in a flood prone area with an educational software

    Science.gov (United States)

    Metzger, R.; Jaboyedoff, M.

    2009-04-01

    Teaching flood risk mapping and mitigation is a necessary task in geoscience studies. However, there is often a gap between the theoretical hydraulic notions covered in lectures and the students' ability to apply them in practice during supervised computer lab exercises. This is mainly because professional models and software have a steep learning curve, so the lecturer spends most of the time explaining how to perform particular operations with the software. To overcome this shortcoming, an educational software package was developed, made of three main modules: 1) a user-friendly graphical user interface (GUI) for handling geographical data and creating thematic maps (Geographical Information System (GIS) module); 2) a flood model (hydrological and inundation models) that frees students as much as possible from the repetitive and tedious tasks related to modeling, while keeping computational time reasonable; 3) a land use planning module, which allows mitigation measures (dike and levee building, flood retention, renaturation, ...) to be specified and their effects to be evaluated by re-running the flood model. The main goal of this educational software is to provide a smooth approach to the modeling issue, without losing the focus on the main task, which is flood risk reduction.

  5. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software... revised regulatory guide (RG), revision 1 of RG 1.171, ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This RG endorses American National Standards...

  6. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Unit Testing for Digital Computer Software...) is issuing for public comment draft regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1208 is proposed...

  7. Software for CATV Design and Frequency Plan Optimization

    Directory of Open Access Journals (Sweden)

    O. Hala

    2007-09-01

    The paper deals with the structure of a software tool used for the design and sub-optimization of frequency plans in CATV networks, describing the tool and its design method. The software's performance is described, and a simple design example of the energy balance of a simplified CATV network is given. The software was created in the Delphi programming environment, and local optimization was performed in Matlab.

  8. Comparison of computed tomography dose reporting software

    International Nuclear Information System (INIS)

    Abdullah, A.; Sun, Z.; Pongnapang, N.; Ng, K. H.

    2008-01-01

    Computed tomography (CT) dose reporting software facilitates the estimation of doses to patients undergoing CT examinations. In this study, a comparison of three software packages, i.e. CT-Expo (version 1.5, Medizinische Hochschule, Hannover (Germany)), ImPACT CT Patient Dosimetry Calculator (version 0.99x, Imaging Performance Assessment on Computed Tomography, www.impactscan.org) and WinDose (version 2.1a, Wellhofer Dosimetry, Schwarzenbruck (Germany)), has been made in terms of their calculation algorithms and the resulting calculated doses. Estimations were performed for head, chest, abdominal and pelvic examinations based on the protocols recommended by European guidelines using single-slice CT (SSCT) (Siemens Somatom Plus 4, Erlangen (Germany)) and multi-slice CT (MSCT) (Siemens Sensation 16, Erlangen (Germany)) for software-based female and male phantoms. The results showed that there are some differences in the final dose reports provided by these software packages, with deviations in the effective doses they produce. Coefficients of variation range from 3.3 to 23.4% for SSCT and from 10.6 to 43.8% for MSCT. It is important that researchers state the name of the software used to estimate the various CT dose quantities. Users must also understand the equivalent terminologies between the information obtained from the CT console and the software packages in order to use the software correctly. (authors)
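    The spread statistic quoted above, the coefficient of variation of the effective-dose estimates across the three packages, is straightforward to reproduce for any protocol; the dose values below are invented placeholders, not results from the study.

```python
import statistics

def coefficient_of_variation_percent(values):
    """Coefficient of variation: standard deviation divided by the mean, as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Illustrative effective-dose estimates (mSv) for one protocol from three packages.
effective_dose_msv = {"CT-Expo": 1.9, "ImPACT": 2.1, "WinDose": 2.4}
print(f"CoV: {coefficient_of_variation_percent(list(effective_dose_msv.values())):.1f} %")
```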

  9. INSPECT: a package of computer programs for planning safeguards inspections

    International Nuclear Information System (INIS)

    Wincek, M.A.; Mullen, M.F.

    1979-04-01

    As part of the U.S. program to provide technical assistance to the International Atomic Energy Agency, a package of computer programs was developed for use in planning safeguards inspections of various types of nuclear facilities. The INSPECT software package is a set of five interactive FORTRAN programs which can be used to calculate the variance components of the MUF (Material Unaccounted For) statistic, the variance components of the D (difference) statistic, attribute and variables sampling plans, a measure of the effectiveness of the inspection, and a measurement of the cost of implementing the inspection plan. This report describes the programs and explains how to use them
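    For orientation, the MUF statistic whose variance components the package computes follows the usual material-balance form: MUF is beginning inventory plus receipts minus shipments minus ending inventory, and if the measurement errors of those terms are independent, the variance of MUF is the sum of the component variances. The following is a minimal sketch with invented figures, not output from the INSPECT programs.

```python
import math

def muf(beginning_inventory, receipts, shipments, ending_inventory):
    """Material Unaccounted For (MUF) for one material-balance period."""
    return beginning_inventory + receipts - shipments - ending_inventory

def sigma_muf(component_std_devs):
    """Standard deviation of MUF assuming independent measurement errors:
    the variances of the inventory and flow terms simply add."""
    return math.sqrt(sum(s ** 2 for s in component_std_devs))

# Illustrative values in kg of nuclear material (not from the INSPECT package).
print("MUF       :", muf(1200.0, 300.0, 310.0, 1188.5), "kg")
print("sigma(MUF):", round(sigma_muf([2.0, 0.8, 0.8, 2.1]), 2), "kg")
```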

  10. Seeding the cloud: Financial bootstrapping in the computer software sector

    OpenAIRE

    Mac An Bhaird, Ciarán; Lynn, Theo

    2015-01-01

    This study investigates resourcing of computer software companies that have adopted cloud computing for the development and delivery of application software. Use of this innovative technology potentially impacts firm financing because the initial infrastructure investment requirement is much lower than for packaged software, lead time to market is shorter, and cloud computing supports instant scalability. We test these predictions by conducting in-depth interviews with founders of 18 independ...

  11. Automated transportation management system (ATMS) software project management plan (SPMP)

    Energy Technology Data Exchange (ETDEWEB)

    Weidert, R.S., Westinghouse Hanford

    1996-05-20

    The Automated Transportation Management System (ATMS) Software Project Management Plan (SPMP) is the lead planning document governing the life cycle of the ATMS and its integration into the Transportation Information Network (TIN). This SPMP defines the project tasks, deliverables, and high-level schedules involved in developing the client/server ATMS software.

  12. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  13. TMS communications software. Volume 1: Computer interfaces

    Science.gov (United States)

    Brown, J. S.; Lenker, M. D.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) as well as for evaluation of the bus concept is considered. Hardware and software interfaces to the MODCOMP and NOVA minicomputers are included. The system software required to drive the interfaces in each TMS computer is described. Documentation of other software for bus statistics monitoring and for transferring files across the bus is also included.

  14. 14 CFR 417.123 - Computing systems and software.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space Computing systems and software. 417.123... systems and software. (a) A launch operator must document a system safety process that identifies the... systems and software. (b) A launch operator must identify all safety-critical functions associated with...

  15. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice
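    The geometric core of ICRU-50 PTV generation is a margin expansion of the clinical target volume. The sketch below shows one generic way to perform such an expansion with SciPy's morphology tools; it is not the evaluated tool, and the mask, margin, and voxel sizes are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def expand_ctv_to_ptv(ctv_mask, margin_mm, voxel_size_mm):
    """Grow a binary CTV mask by an approximately isotropic margin to obtain a PTV
    (simplified geometric sketch of the ICRU-50 expansion idea)."""
    # Build an ellipsoidal structuring element whose radii match the margin in voxels.
    radii = [max(1, int(round(margin_mm / s))) for s in voxel_size_mm]
    zz, yy, xx = np.ogrid[tuple(slice(-r, r + 1) for r in radii)]
    struct = (zz / radii[0]) ** 2 + (yy / radii[1]) ** 2 + (xx / radii[2]) ** 2 <= 1.0
    return ndimage.binary_dilation(ctv_mask, structure=struct)

# Tiny illustrative volume: a cubic CTV expanded by a 10 mm margin on 2.5 mm voxels.
ctv = np.zeros((40, 40, 40), dtype=bool)
ctv[18:23, 18:23, 18:23] = True
ptv = expand_ctv_to_ptv(ctv, margin_mm=10.0, voxel_size_mm=(2.5, 2.5, 2.5))
print("CTV voxels:", int(ctv.sum()), "PTV voxels:", int(ptv.sum()))
```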

  16. Computer-aided position planning of miniplates to treat facial bone defects.

    Directory of Open Access Journals (Sweden)

    Jan Egger

    In this contribution, a software system for computer-aided position planning of miniplates to treat facial bone defects is proposed. The intra-operatively used bone plates have to be passively adapted to the underlying bone contours for adequate bone fragment stabilization. However, this procedure can lead to frequent intra-operative material readjustments, especially in complex surgical cases. Our approach is able to fit a selection of common implant models to the surgeon's desired position in a 3D computer model. This happens with respect to the surrounding anatomical structures, always including the possibility of adjusting both the direction and the position of the osteosynthesis material used. By using the proposed software, surgeons are able to pre-plan the form and morphology of the resulting implant with the aid of a computer-visualized model within a few minutes. Further, the resulting model can be stored in STL file format, the commonly used format for 3D printing. Using this technology, surgeons are able to print the virtually generated implant, or create an individually designed bending tool. This method leads to osteosynthesis materials adapted to the surrounding anatomy and further requires only a minimal amount of money and time.
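    The STL export step mentioned above is simple to illustrate: an ASCII STL file is just a list of triangular facets. The following sketch (not the planning system's exporter) writes such a file for a single invented facet; the file name and vertex values are placeholders.

```python
def write_ascii_stl(path, triangles, name="implant"):
    """Write a list of triangles (each: three (x, y, z) vertices) as an ASCII STL file.
    Normals are left at (0, 0, 0); many tools recompute them from the vertex order."""
    with open(path, "w") as f:
        f.write(f"solid {name}\n")
        for v1, v2, v3 in triangles:
            f.write("  facet normal 0 0 0\n    outer loop\n")
            for x, y, z in (v1, v2, v3):
                f.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
            f.write("    endloop\n  endfacet\n")
        f.write(f"endsolid {name}\n")

# One illustrative facet only; a real miniplate mesh would contain thousands.
write_ascii_stl("miniplate_sketch.stl", [((0, 0, 0), (10, 0, 0), (0, 5, 1))])
```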

  17. Computer-aided position planning of miniplates to treat facial bone defects

    Science.gov (United States)

    Wallner, Jürgen; Gall, Markus; Chen, Xiaojun; Schwenzer-Zimmerer, Katja; Reinbacher, Knut; Schmalstieg, Dieter

    2017-01-01

    In this contribution, a software system for computer-aided position planning of miniplates to treat facial bone defects is proposed. The intra-operatively used bone plates have to be passively adapted to the underlying bone contours for adequate bone fragment stabilization. However, this procedure can lead to frequent intra-operative material readjustments, especially in complex surgical cases. Our approach is able to fit a selection of common implant models to the surgeon's desired position in a 3D computer model. This happens with respect to the surrounding anatomical structures, always including the possibility of adjusting both the direction and the position of the osteosynthesis material used. By using the proposed software, surgeons are able to pre-plan the form and morphology of the resulting implant with the aid of a computer-visualized model within a few minutes. Further, the resulting model can be stored in STL file format, the commonly used format for 3D printing. Using this technology, surgeons are able to print the virtually generated implant, or create an individually designed bending tool. This method leads to osteosynthesis materials adapted to the surrounding anatomy and further requires only a minimal amount of money and time. PMID:28817607

  18. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative deconvolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  19. Application of PIMS Software in Monthly Planning of Refinery Production

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This article describes the application of the PIMS software in formulating the monthly refining production plan. Application of the PIMS software can help to solve a series of problems related to the monthly planning of refining production, such as optimized selection of crudes and feedstocks, optimized selection of production scale and processing scheme, identification of bottlenecks and their mitigation, optimized selection of turnaround time, and optimized selection of operating regime, which have increased the economic benefits of refining enterprises. With the further development and improvement of models, the PIMS software will play an increasingly important role in formulating monthly plans of refining operations and in production management at refineries. This article also explores the problems existing in refinery monthly planning and makes recommendations on developing and improving models and the reporting system, enhancing basic data acquisition, securing model maintenance personnel, and staff training.
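    PIMS itself is a commercial LP-based planning system, but the kind of optimization a monthly plan involves (choosing a crude slate subject to unit capacities and product commitments) can be illustrated with a toy linear program; all coefficients and limits below are invented for illustration only.

```python
from scipy.optimize import linprog

# Toy blending/planning LP in the spirit of a monthly refinery plan:
# choose how much of two crudes to run to maximize margin, subject to
# CDU capacity and a minimum gasoline commitment. All numbers are invented.
margin_per_tonne = [38.0, 30.0]                    # crude A, crude B ($/t)
c = [-m for m in margin_per_tonne]                 # linprog minimizes, so negate

A_ub = [
    [1.0, 1.0],        # CDU capacity: total crude run <= 500 kt
    [-0.45, -0.35],    # gasoline yield: 0.45*A + 0.35*B >= 180 kt
]
b_ub = [500.0, -180.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 400), (0, 400)], method="highs")
print("crude slate (kt):", res.x, " monthly margin:", -res.fun)
```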

  20. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    Science.gov (United States)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenters' terminal used to communicate with the ACTS for various experiments by government, university, and industry agencies. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0 outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0 is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included is an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.

  1. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Software Requirement Specifications for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission... issuing a revised regulatory guide (RG), revision 1 of RG 1.172, ``Software Requirement Specifications for...

  2. Assessment of Computer Software Usage for Estimating and Tender ...

    African Journals Online (AJOL)

    It has been discovered that there are limitations to the use of computer software packages in construction operations, especially estimating and tender analysis. The objective of this research is to evaluate the level of computer software usage for estimating and tender analysis while also assessing the challenges faced by ...

  3. 48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...

  4. Waste receiving and processing facility module 1 data management system software project management plan

    International Nuclear Information System (INIS)

    Clark, R.E.

    1994-01-01

    This document provides the software development plan for the Waste Receiving and Processing (WRAP) Module 1 Data Management System (DMS). The DMS is one of the plant computer systems for the new WRAP 1 facility (Project W-026). The DMS will collect, store, and report data required to certify the low level waste (LLW) and transuranic (TRU) waste items processed at WRAP 1 as acceptable for shipment, storage, or disposal

  5. Software Configuration Management Plan for the Sodium Removal System

    International Nuclear Information System (INIS)

    HILL, L.F.

    2000-01-01

    This document establishes the Software Configuration Management Plan (SCMP) for the software associated with the control system of the Sodium Removal System (SRS) located in the Interim Examination and Maintenance (IEM Cell) Facility of the FFTF Flux Test

  6. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work has aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and KW/FT, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trend of ''software factories''. (author)

  7. Software for Probabilistic Risk Reduction

    Science.gov (United States)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system whose development one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process, in the sense that one can select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
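
    The DDP algorithms themselves are not given in this record, but the optimization it describes (choosing mitigation actions to maximize the expectation of success while remaining within budget) can be illustrated with a small, self-contained sketch. The risks, mitigations, costs, and effectiveness values below are hypothetical.

      # Illustrative budget-constrained selection of risk-mitigation actions, in
      # the spirit of risk-based planning tools such as DDP. Numbers are invented.
      from itertools import combinations

      # Each mitigation: (name, cost, {risk: fraction of that risk it removes}).
      mitigations = [
          ("unit tests",      3.0, {"logic defects": 0.6}),
          ("code review",     2.0, {"logic defects": 0.4, "requirements gaps": 0.2}),
          ("early prototype", 4.0, {"requirements gaps": 0.7}),
      ]
      risks = {"logic defects": 0.30, "requirements gaps": 0.25}   # baseline P(failure)
      budget = 6.0

      def success_probability(chosen):
          residual = dict(risks)
          for _, _, effects in chosen:
              for risk, reduction in effects.items():
                  residual[risk] *= (1.0 - reduction)
          p = 1.0
          for r in residual.values():
              p *= (1.0 - r)               # assume independent failure modes
          return p

      # Exhaustively score every affordable subset and keep the best one.
      best = max(
          (subset for k in range(len(mitigations) + 1)
                  for subset in combinations(mitigations, k)
                  if sum(m[1] for m in subset) <= budget),
          key=success_probability,
      )
      print([m[0] for m in best], round(success_probability(best), 3))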

  8. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  9. Development of a software concept for computer-aided technical detail planning for machines in German hard coal mining; Entwicklung eines Softwarekonzeptes fuer rechnergestuetzte maschinentechnische Datailplanung im deutschen Steinkohlenbergbau

    Energy Technology Data Exchange (ETDEWEB)

    Borstell, D

    1994-12-31

    CAD systems have long been an aid in German hard coal mining for reducing costs in all technical planning tasks. In the future, the use of computers will offer as yet unused possibilities for further cost savings. For this purpose, this book introduces a new software concept for the technical planning of machines in the mines of the Ruhr. Through the continued and thorough use of the potential of modern computer techniques, the application of practice-oriented planning knowledge, and the transfer of computer-aided planning applications from other branches of industry, a further contribution is to be made to reducing costs in technical planning. The heart of the future technical planning workplace will be a graphically oriented interface with a three-dimensional representation of the pit structure on the screen. From this interface, after choosing the work area in the pit structure, the planner has access to the available software tools. These support the planning engineer in information and design work (database connection, 2D/3D CAD, libraries of operating equipment and standard parts) and in the method of procedure (through expert systems, sample specifications, checklists). They will also offer help in inspection and decision-making (via simulation and calculation routines, expert systems) and in supporting publicity activities (text processing, desktop publishing). The computer-aided planning system of the future will develop from the two-dimensional design environment usual today into a comprehensive integrated 3D engineering system. (orig.)

  10. 48 CFR 27.404-2 - Limited rights data and restricted computer software.

    Science.gov (United States)

    2010-10-01

    ... restricted computer software. 27.404-2 Section 27.404-2 Federal Acquisition Regulations System FEDERAL... Copyrights 27.404-2 Limited rights data and restricted computer software. (a) General. The basic clause at 52... restricted computer software by withholding the data from the Government and instead delivering form, fit...

  11. 48 CFR 1852.227-86 - Commercial computer software-Licensing.

    Science.gov (United States)

    2010-10-01

    .../contractor proposes its standard commercial software license, those applicable portions thereof consistent... its standard commercial software license until after this purchase order/contract has been issued, or at or after the time the computer software is delivered, such license shall nevertheless be deemed...

  12. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering. The Fundamental Practice of Software Engineering: Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  13. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  14. 3D Computer aided treatment planning in endodontics.

    Science.gov (United States)

    van der Meer, Wicher J; Vissink, Arjan; Ng, Yuan Ling; Gulabivala, Kishor

    2016-02-01

    Obliteration of the root canal system due to accelerated dentinogenesis and dystrophic calcification can challenge the achievement of root canal treatment goals. This paper describes the application of 3D digital mapping technology for predictable navigation of obliterated canal systems during root canal treatment to avoid iatrogenic damage of the root. Digital endodontic treatment planning for anterior teeth with severely obliterated root canal systems was accomplished with the aid of computer software, based on cone beam computer tomography (CBCT) scans and intra-oral scans of the dentition. On the basis of these scans, endodontic guides were created for the planned treatment through digital designing and rapid prototyping fabrication. The custom-made guides allowed for an uncomplicated and predictable canal location and management. The method of digital designing and rapid prototyping of endodontic guides allows for reliable and predictable location of root canals of teeth with calcifically metamorphosed root canal systems. The endodontic directional guide facilitates difficult endodontic treatments at little additional cost. Copyright © 2016. Published by Elsevier Ltd.

  15. A briefing to verification and validation of computer software

    International Nuclear Information System (INIS)

    Zhang Aisen; Xie Yalian

    2012-01-01

    Nowadays, computer equipment and information processing technology are entering the engineering of instrumentation and process control. Owing to their convenience and other advantages, more and more utilities are happy to use them. After initial use in basic functional control, computer equipment and information processing technology are now widely used in safety-critical control. Consequently, people pay more attention to the quality of computer software; how to assess and ensure that quality is the main concern. Verification and validation of computer software are important steps in quality assurance. (authors)

  16. Computers and Young Children. Storyboard Software: Flannel Boards in the Computer Age.

    Science.gov (United States)

    Shade, Daniel D.

    1995-01-01

    Describes storyboard software as computer programs with which children can build a story using visuals. Notes the importance of such programs for preliterate or nonreading children. Describes a new storyboard program, "Wiggins in Storyland," and its features. Lists recommended storyboard software programs, with publishers and compatible…

  17. Guidelines for evaluating software configuration management plans for digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Cheon, Se Woo; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Kim, Jang Yeon

    2001-08-01

    Software configuration management (SCM) is the process for identifying software configuration items (CIs), controlling the implementation of and changes to software, recording and reporting the status of changes, and verifying the completeness and correctness of the released software. SCM consists of two major aspects: planning and implementation. Effective SCM involves planning how activities are to be performed and performing these activities in accordance with the plan. This report first reviews the background of SCM, which includes key standards, SCM disciplines, SCM basic functions, baselines, software entities, the SCM process, the implementation of SCM, and SCM tools. In turn, the report provides guidelines for evaluating the SCM Plan for digital I and C systems of nuclear power plants. Most of the guidelines in the report are based on IEEE Std 828 and ANSI/IEEE Std 1042. According to BTP-14, NUREG-0800, the evaluation topics for the SCM Plan are classified into three categories: management, implementation, and resource characteristics

  18. Software development to support decommissioning and waste management strategic planning

    International Nuclear Information System (INIS)

    Williams, John; Warneford, Ian; Harrison, J.

    1997-01-01

    One of the components of the UKAEA's mission is to care for and, at the appropriate time, safely dismantle its radioactive facilities which are no longer in use. To assist in the development of an optimised strategy, AEA Technology was commissioned to produce decision support software. This paper describes the background to the development of the software, its key features and current status, and the lessons learnt during the development. The software, known as UKAEA SPS (Strategic Planning System), is a unique support software package that has been developed to assist in the planning of decommissioning and radioactive waste management. SPS models linked decommissioning and waste management strategies covering all of UKAEA's nuclear liabilities. It has been developed around the database package ACCESS, and runs on Pentium PCs; however, it has many of the features of project planning systems. Its principal outputs are costs, timings and utilisation data for the waste stores, processing facilities, transport and disposal operations displayed at any level of aggregation. This allows programme managers to see easily the effects of changing key parameters in a strategy under development. (author)

  19. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software.

    Directory of Open Access Journals (Sweden)

    Jan Egger

    In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow of generating an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic. Finally, this frees us from conforming directly to available commercial software and allows us to look for other options that might improve the workflow.

  20. Interactive reconstructions of cranial 3D implants under MeVisLab as an alternative to commercial planning software

    Science.gov (United States)

    Egger, Jan; Gall, Markus; Tax, Alois; Ücal, Muammer; Zefferer, Ulrike; Li, Xing; von Campe, Gord; Schäfer, Ute; Schmalstieg, Dieter; Chen, Xiaojun

    2017-01-01

    In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow of generating an implant. The workflow begins with loading and mirroring the patient's head for an initial curvature of the implant. Then, the user can perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic. Finally, this frees us from conforming directly to available commercial software and allows us to look for other options that might improve the workflow. PMID:28264062
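
    The MeVisLab data-flow network and C++ module are not part of this record. As a hedged sketch of the workflow described above (mirror the intact side, smooth, export STL), the Python fragment below uses the open-source trimesh library with placeholder file names and parameters; it is not the authors' implementation, and the interactive cropping of the mirrored surface to the defect region is only indicated by a comment.

      # Generic sketch of the described cranial-implant workflow (not the authors'
      # MeVisLab prototype). File names and parameters are placeholders.
      import numpy as np
      import trimesh

      skull = trimesh.load("skull_with_defect.stl")     # segmented skull surface

      # Mirror across the mid-sagittal plane (assumed here to be x = 0) so the
      # intact side provides an initial curvature for the implant.
      mirrored = skull.copy()
      mirrored.apply_transform(np.diag([-1.0, 1.0, 1.0, 1.0]))
      mirrored.fix_normals()                            # reflection flips face winding

      # In the real workflow the mirrored surface is cropped interactively to the
      # defect region; here the whole mirrored mesh stands in for that patch.
      patch = mirrored
      trimesh.smoothing.filter_laplacian(patch, iterations=10)

      # Export the smoothed patch for 3D printing and pre-surgical evaluation.
      patch.export("implant_candidate.stl")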

  1. A plan for administrative computing at ANL FY1991 through FY1993

    Energy Technology Data Exchange (ETDEWEB)

    Caruthers, L.E. (ed.); O' Brien, D.E.; Bretscher, M.E.; Hischier, R.C.; Moore, N.J.; Slade, R.G.

    1990-10-01

    In July of 1988, Argonne National Laboratory management approved the restructuring of Computing Services into the Computing and Telecommunications Division, part of the Physical Research area of the Laboratory. One major area of the Computing and Telecommunications Division is Management Information Systems (MIS). A significant aspect of Management Information Systems' work is the development of proposals for new and enhanced administrative computing systems based on an analysis of informational needs. This document represents the outcome of the planning process for FY1991 through FY1993. The introduction of the FY1991 through FY1993 Long-Range Plan assesses the state of administrative computing at ANL and the implications of FY1991 funding recommendations. It includes a history of MIS planning for administrative data processing. This document discusses the strategy and goals which are an important part of administrative data processing plans for the Laboratory. It also describes the management guidelines established by the Administrative Data Processing Oversight Committee for the proposal and implementation of administrative computing systems. Summaries are given of the proposals for new or enhanced administrative computing systems presented to the Administrative Data Processing Oversight Committee by individual divisions or departments with the assistance of Management Information Systems. The detailed tables in this paper give information on what developing and implementing a given system will cost its users. The tables include development costs, computing/operations costs, software and hardware costs, and effort costs. They include both systems funded by Laboratory General Expense and systems funded by the users themselves.

  2. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  3. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. The new software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil
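
    The estimation methodology itself is not spelled out in the record; one plausible reading (current users scaled by projected growth and rounded up) is sketched below, with all figures invented for illustration.

      # Minimal sketch of estimating software copies needed from current use and
      # projected growth. Package names, usage figures and growth rate are invented.
      import math

      current_users = {"word processor": 38, "spreadsheet": 22, "retirement planner": 15}
      annual_growth = 0.10              # assumed growth in users per year
      planning_horizon_years = 2

      for package, users in current_users.items():
          projected = users * (1 + annual_growth) ** planning_horizon_years
          print(package, "->", math.ceil(projected), "copies")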

  4. Computer software to assess weld thickness loss in offshore pipelines: PEDS

    Energy Technology Data Exchange (ETDEWEB)

    Germano, Andre Luiz Silva; Correa, Samanda Cristine Arruda [Centro Universitario Estadual da Zona Oeste (CCMAT/UEZO), Rio de Janeiro, RJ (Brazil)], e-mail: scorrea@nuclear.ufrj.br; Souza, Edmilson Monteiro de; Silva, Ademir Xavier da; Lopes, Ricardo Tadeu [Programa de Engenharia Nuclear, COPPE, Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil)], e-mails: emonteiro@nuclear.ufrj.br, ademir@nuclear.ufrj.br, ricardo@lin.ufrj.br

    2010-07-01

    The purpose of this work is to present an initial overview of a computer software package named PEDS, developed to assess weld thickness loss in offshore pipelines through digital radiography. This software calculates the thickness loss using a data bank obtained from computational modelling based on the Monte Carlo MCNPX code. In order to give users more flexibility, the computer software was written in Java, which allows it to run on Linux, Mac OS X and Windows. Furthermore, tools are provided for image display, selection and analysis of specific areas of the image (average measurement, area of selection), and generation of profile plots. Applications of this software in the offshore area are presented. (author)
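
    The record does not describe how PEDS maps radiographic grey values to wall thickness, but the general idea of looking a measured grey value up against a simulated calibration curve can be sketched as follows; the calibration points are invented and are not MCNPX results.

      # Illustrative lookup of wall thickness from a radiographic grey value using
      # a simulated calibration curve. Calibration points below are invented.
      import numpy as np

      # Calibration: mean grey value recorded for known wall thicknesses (mm).
      thickness_mm = np.array([6.0, 8.0, 10.0, 12.0, 14.0])
      grey_value   = np.array([180.0, 150.0, 125.0, 105.0, 90.0])  # falls with thickness

      def thickness_from_grey(measured_grey):
          # np.interp needs increasing x values, so interpolate on reversed arrays.
          return np.interp(measured_grey, grey_value[::-1], thickness_mm[::-1])

      nominal = 12.0
      measured = thickness_from_grey(118.0)
      print("estimated thickness: %.2f mm, loss: %.2f mm" % (measured, nominal - measured))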

  5. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  6. Component-based software for high-performance scientific computing

    International Nuclear Information System (INIS)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly

  7. Quality assurance of nuclear medicine computer software

    International Nuclear Information System (INIS)

    Cradduck, T.D.

    1986-01-01

    Although quality assurance activities have become well established for the hardware found in nuclear medicine, little attention has been paid to computer software. This paper outlines some of the problems that exist and indicates some of the solutions presently under development. The major thrust has been towards the establishment of programming standards and comprehensive documentation. Some manufacturers have developed installation verification procedures which programmers are urged to use as models for their own programs. Items that tend to cause erroneous results are discussed, with the emphasis for error detection and correction being placed on proper education and training of the computer operator. The concept of interchangeable data files or 'software phantoms' for purposes of quality assurance is discussed. (Author)

  8. Beyond Reactive Planning: Self Adaptive Software and Self Modeling Software in Predictive Deliberation Management

    National Research Council Canada - National Science Library

    Lenahan, Jack; Nash, Michael P; Charles, Phil

    2008-01-01

    .... We present the following hypothesis: predictive deliberation management using self-adapting and self-modeling software will be required to provide mission planning adjustments after the start of a mission...

  9. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  10. Building fast, reliable, and adaptive software for computational science

    International Nuclear Information System (INIS)

    Rendell, A P; Antony, J; Armstrong, W; Janes, P; Yang, R

    2008-01-01

    Building fast, reliable, and adaptive software is a constant challenge for computational science, especially given recent developments in computer architecture. This paper outlines some of our efforts to address these three issues in the context of computational chemistry. First, a simple linear performance model that can be used to model and predict the performance of Hartree-Fock calculations is discussed. Second, the use of interval arithmetic to assess the numerical reliability of the sort of integrals used in electronic structure methods is presented. Third, the use of dynamic code modification as part of a framework to support adaptive software is outlined
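
    The interval-arithmetic idea mentioned above can be conveyed in a few lines: each quantity is carried as a lower/upper bound pair, so uncertainty propagates into a guaranteed enclosure, and a wide final interval flags an unreliable result. The sketch below is a generic illustration (ignoring outward rounding), not the implementation described in the paper.

      # Tiny interval-arithmetic sketch: track each value as [lo, hi] bounds so the
      # true result is guaranteed to lie inside the computed interval. Generic
      # illustration only; a production version would also round bounds outward.
      class Interval:
          def __init__(self, lo, hi):
              self.lo, self.hi = lo, hi

          def __add__(self, other):
              return Interval(self.lo + other.lo, self.hi + other.hi)

          def __mul__(self, other):
              p = (self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi)
              return Interval(min(p), max(p))

          def __repr__(self):
              return "[%.6f, %.6f]" % (self.lo, self.hi)

      # Inputs carrying small uncertainties; if the resulting interval is wide,
      # the computed number cannot be trusted.
      a = Interval(0.499999, 0.500001)
      b = Interval(1.999998, 2.000002)
      print(a * b + a)                      # enclosure of a*b + a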

  11. 48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...

  12. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discusses this process

  13. Computer software summaries. Numbers 1 through 423

    International Nuclear Information System (INIS)

    1979-09-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the US Department of Energy and the Nuclear Regulatory Commission. A major activity of the Center is the preparation and publication of two reports issued periodically - the Center's compilation of program abstracts, ANL-7411, and this software summaries report, ANL-8040. The abstracts describe the software packages available in the software exchange library maintained and distributed by the Center. The summaries describe agency-sponsored software that is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. Summaries describe software that is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. The purpose of the summaries report is to keep agency and contractor personnel informed as to the existence, status, and availability of computer programs within the agency, and thereby minimize duplication costs and maximize the value of agency software development efforts

  14. Computer software summaries. Numbers 325 through 423

    Energy Technology Data Exchange (ETDEWEB)

    1978-08-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the U.S. Department of Energy and the Nuclear Regulatory Commission. These summaries describe agency-sponsored software which is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. They describe software which is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. Codes dealing with the following subjects are included: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics; and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis, and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; biology and medicine; and data. (RWR)

  15. Computer software summaries. Numbers 325 through 423

    International Nuclear Information System (INIS)

    1978-08-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the U.S. Department of Energy and the Nuclear Regulatory Commission. These summaries describe agency-sponsored software which is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. They describe software which is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. Codes dealing with the following subjects are included: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and power plant economics; space-independent kinetics; space--time kinetics, coupled neutronics--hydrodynamics--thermodynamics; and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis, and engineering design studies; gamma heating and shield design; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; magnetic fusion research; biology and medicine; and data

  16. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  17. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2012-08-22

    ... review of applications for permits and licenses. The DG entitled ``Developing Software Life Cycle... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission...

  18. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    Technological obsolescence is an on-going challenge for all computer use. By design, and to some extent good fortune, AECL has had a good track record with respect to the march of obsolescence in CANDU digital control computer technology. Recognizing obsolescence as a fact of life, AECL has undertaken a program of supporting the digital control technology of existing CANDU plants. Other AECL groups are developing complete replacement systems for the digital control computers, and more advanced systems for the digital control computers of future CANDU reactors. This paper presents the results of the efforts of AECL's DCC service support group to replace obsolete digital control computer and related components and to provide friendlier software technology related to the maintenance and use of digital control computers in CANDU. These efforts are expected to extend the current lifespan of existing digital control computers through their mandated life. This group applied two simple rules: the product, whether new or a replacement, should have a generic basis, and the products should be applicable both to existing CANDU plants and to 'repeat' plant designs built using current design guidelines. While some exceptions do apply, the rules have been met. The generic requirement dictates that the product should not be dependent on any brand technology, and should back-fit to and interface with any such technology which remains in the control design. The application requirement dictates that the product should have universal use and be user friendly to the greatest extent possible. Furthermore, both requirements were designed to anticipate user involvement, modifications and alternate user-defined applications. The replacements for hardware components such as the paper tape reader/punch, moving arm disk, contact scanner and Ramtek are discussed. The development of these hardware replacements coincides with the development of a gateway system for selected CANDU digital control

  19. Delivering LHC software to HPC compute elements

    CERN Document Server

    Blomer, Jakob; Hardi, Nikola; Popescu, Radu

    2017-01-01

    In recent years, there was a growing interest in improving the utilization of supercomputers by running applications of experiments at the Large Hadron Collider (LHC) at CERN when idle cores cannot be assigned to traditional HPC jobs. At the same time, the upcoming LHC machine and detector upgrades will produce some 60 times higher data rates and challenge LHC experiments to use so far untapped compute resources. LHC experiment applications are tailored to run on high-throughput computing resources and they have a different anatomy than HPC applications. LHC applications comprise a core framework that allows hundreds of researchers to plug in their specific algorithms. The software stacks easily accumulate to many gigabytes for a single release. New releases are often produced on a daily basis. To facilitate the distribution of these software stacks to world-wide distributed computing resources, LHC experiments use a purpose-built, global, POSIX file system, the CernVM File System. CernVM-FS pre-processes dat...

  20. Designing Scientific Software for Heterogeneous Computing

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig

    Algorithms and data structures must be designed to utilize the underlying parallel architecture. The architectural changes in hardware design within the last decade, from single-core to multi- and many-core architectures, require software developers to identify and properly implement methods that exploit this parallelism. This makes parallel software design applicable, but also a challenge, for scientific software developers at all levels. We have developed a generic C++ library for fast prototyping of large-scale PDE solvers based on flexible-order finite difference approximations on structured regular grids. The library is designed with a high-abstraction interface to improve developer productivity. The library is based on modern template-based design concepts as described in Glimberg, Engsig-Karup, Nielsen & Dammann (2013). The library utilizes heterogeneous CPU/GPU environments in order to maximize computational throughput...

  1. A directory of computer software applications: energy. Report for 1974--1976

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1977-04-01

    The computer programs or the computer program documentation cited in this directory have been developed for a variety of applications in the field of energy. The cited computer software includes applications in solar energy, petroleum resources, batteries, electrohydrodynamic generators, magnetohydrodynamic generators, natural gas, nuclear fission, nuclear fusion, hydroelectric power production, and geothermal energy. The computer software cited has been used for simulation and modeling, calculations of future energy requirements, calculations of energy conservation measures, and computations of economic considerations of energy systems

  2. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  3. SEI Software Engineering Education Directory.

    Science.gov (United States)

    1987-02-01


  4. The family of standard hydrogen monitoring system computer software design description: Revision 2

    International Nuclear Information System (INIS)

    Bender, R.M.

    1994-01-01

    In March 1990, 23 waste tanks at the Hanford Nuclear Reservation were identified as having the potential for the buildup of gas to a flammable or explosive level. As a result of the potential for hydrogen gas buildup, a project was initiated to design a standard hydrogen monitoring system (SHMS) for use at any waste tank to analyze gas samples for hydrogen content. Since it was originally deployed three years ago, two variations of the original system have been developed: the SHMS-B and SHMS-C. All three are currently in operation at the tank farms and will be discussed in this document. To avoid confusion in this document, when a feature is common to all three of the SHMS variants, it will be referred to as ''The family of SHMS.'' When it is specific to only one or two, they will be identified. The purpose of this computer software design document is to provide the following: the computer software requirements specification that documents the essential requirements of the computer software and its external interfaces; the computer software design description; the computer software user documentation for using and maintaining the computer software and any dedicated hardware; and the requirements for computer software design verification and validation

  5. A Novel Coupling Pattern in Computational Science and Engineering Software

    Science.gov (United States)

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges f...

  6. A Novel Coupling Pattern in Computational Science and Engineering Software

    Science.gov (United States)

    Computational science and engineering (CSE) software is written by experts of certain area(s). Due to the specialization, existing CSE software may need to integrate other CSE software systems developed by different groups of experts. The coupling problem is one of the challenges...

  7. The December 2006 ATLAS Computing & Software Workshop

    CERN Multimedia

    Fred Luehring

    The 29th ATLAS Computing & Software Workshop was held on December 11-15 at CERN. With the rapidly approaching onset of data taking, the workshop participants had an air of urgency about them. There was considerable discussion on hot topics such as physics validation of the software, data analysis, actual software production on the GRID, and the schedule of work for 2007 including the Final Dress Rehearsal (FDR). However don't be fooled, the workshop was not all work - there were also two social events which were greatly enjoyed by the attendees. The workshop welcomed Wouter Verkerke as the new Physics Validation Coordinator (replacing Davide Costanzo). Most recent validation work has centered on the 12.0.X release series that will be used for the Computing System Commissioning (CSC) exercise. The validation is now a big job because it needs to be done over a variety of conditions (magnetic field on/off, aligned/misaligned geometry) for every candidate release. Luckily there have been a large number of pe...

  8. Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.

    Science.gov (United States)

    Reed, Mary Hutchings

    This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and ensuring the…

  9. AWARE-P: a system-based software for urban water IAM planning

    OpenAIRE

    Coelho, S.T.; Vitorino, D.; Alegre, H.

    2013-01-01

    The AWARE-P IAM planning software offers a non-intrusive, web-based, collaborative integration environment for a wide variety of data and processes that may be relevant to the IAM decision-making process, including maps, GIS shapefiles and geodatabases; inventory records; work orders, maintenance, inspections/CCTV records; network models, performance indicators, asset valuation records, among others. The software provides an organized framework for evaluating and comparing planning alternativ...

  10. A Study on the Software Quality Assurance Plan

    International Nuclear Information System (INIS)

    Kim, Hyun Tae

    2006-01-01

    On 25 August 2006, the CMMI V1.2 (Capability Maturity Model Integration Version 1.2) was released with the new title CMMI-DEV (CMMI for Development) which supersedes the CMMI-SE/SW (CMMI for systems engineering and software engineering) V1.1. This study discusses the application of IEEE Std 730-2002, IEEE Standard for Software Quality Assurance Plans, for the implementation of the Process and Product Quality Assurance (PPQA) process area (PA) of the CMMI-DEV

  11. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand is presented, along with a description of work in progress and areas of future work.

  12. Software Engineering for Scientific Computer Simulations

    Science.gov (United States)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  13. Functional requirements for gas characterization system computer software

    International Nuclear Information System (INIS)

    Tate, D.D.

    1996-01-01

    This document provides the Functional Requirements for the Computer Software operating the Gas Characterization System (GCS), which monitors the combustible gases in the vapor space of selected tanks. Necessary computer functions are defined to support design, testing, operation, and change control. The GCS requires several individual computers to address the control and data acquisition functions of instruments and sensors. These computers are networked for communication, and must multi-task to accommodate operation in parallel

  14. The first clinical application of planning software for laparoscopic microwave thermosphere ablation of malignant liver tumours.

    Science.gov (United States)

    Berber, Eren

    2015-07-01

    Liver tumour ablation is an operator-dependent procedure. The determination of the optimum needle trajectory and correct ablation parameters could be challenging. The aim of this study was to report the utility of a new, procedure planning software for microwave ablation (MWA) of liver tumours. This was a feasibility study in a pilot group of five patients with nine metastatic liver tumours who underwent laparoscopic MWA. Pre-operatively, parameters predicting the desired ablation zones were calculated for each tumour. Intra-operatively, this planning strategy was followed for both antenna placement and energy application. Post-operative 2-week computed tomography (CT) scans were performed to evaluate complete tumour destruction. The patients had an average of two tumours (range 1-4), measuring 1.9 ± 0.4 cm (range 0.9-4.4 cm). The ablation time was 7.1 ± 1.3 min (range 2.5-10 min) at 100W. There were no complications or mortality. The patients were discharged home on post-operative day (POD) 1. At 2-week CT scans, there were no residual tumours, with a complete ablation demonstrated in all lesions. This study describes and validates pre-treatment planning software for MWA of liver tumours. This software was found useful to determine precisely the ablation parameters and needle placement to create a predicted zone of ablation. © 2015 International Hepato-Pancreato-Biliary Association.

  15. Computer-guided implant placement: 3D planning software, fixed intraoral reference points, and CAD/CAM technology. A case report

    NARCIS (Netherlands)

    Tahmaseb, A.; de Clerck, R.; Wismeijer, D.

    2009-01-01

    The aim of this article is to explain the use of a computer-aided three-dimensional planning protocol in combination with previously placed mini-implants and computer-aided design/computer-assisted manufacture (CAD/CAM) technology to restore a completely edentulous patient. Mini-implants were used

  16. Copyright Protection for Computer Software: Is There a Need for More Protection?

    Science.gov (United States)

    Ku, Linlin

    Because the computer industry's expansion has been much faster than has the development of laws protecting computer software and since the practice of software piracy seems to be alive and well, the issue of whether existing laws can provide effective protection for software needs further discussion. Three bodies of law have been used to protect…

  17. Potential marketing plan for Sony Computer Entertainment, Inc. to China

    OpenAIRE

    Li, Weishen

    2013-01-01

    The purpose of this thesis was to create a marketing plan for Sony Computer Entertainment, Inc. (SCE) for its market entry in mainland China. SCE is a major Japanese video game company which develops and manufactures video game consoles and game software on a global scale. SCE belongs to Sony Corporation. Sony operates almost every one of its businesses in China except the video game business, due to internal factors in China. Along with the great increase of Chinese people's purchasing po...

  18. Software for project-based learning of robot motion planning

    Science.gov (United States)

    Moll, Mark; Bordeaux, Janice; Kavraki, Lydia E.

    2013-12-01

    Motion planning is a core problem in robotics concerned with finding feasible paths for a given robot. Motion planning algorithms perform a search in the high-dimensional continuous space of robot configurations and exemplify many of the core algorithmic concepts of search algorithms and associated data structures. Motion planning algorithms can be explained in a simplified two-dimensional setting, but this masks many of the subtleties and complexities of the underlying problem. We have developed software for project-based learning of motion planning that enables deep learning. The projects that we have developed allow advanced undergraduate students and graduate students to reflect on the performance of existing textbook algorithms and their own variations on such algorithms. Formative assessment has been conducted at three institutions. The core of the software used for this teaching module is also used within the Robot Operating System, a platform widely adopted by the robotics research community. This allows for transfer of knowledge and skills to robotics research projects involving a large variety of robot hardware platforms.
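
    The flavour of sampling-based motion planning described in this record can be illustrated with a short sketch. The following Python code grows a rapidly-exploring random tree (RRT) for a point robot in a 2-D unit square with one invented circular obstacle; it is only a toy stand-in under those assumptions, not the course software itself.

      import math, random

      # Minimal RRT sketch: point robot in the unit square, one hypothetical
      # circular obstacle; illustrative only.
      OBSTACLE = ((0.5, 0.5), 0.2)   # (center, radius), invented scene
      STEP = 0.05

      def collision_free(p):
          (cx, cy), r = OBSTACLE
          return math.hypot(p[0] - cx, p[1] - cy) > r

      def steer(a, b, step=STEP):
          d = math.hypot(b[0] - a[0], b[1] - a[1])
          if d <= step:
              return b
          return (a[0] + step * (b[0] - a[0]) / d, a[1] + step * (b[1] - a[1]) / d)

      def rrt(start, goal, iters=5000):
          parents = {start: None}
          for _ in range(iters):
              # goal-biased random sampling of the configuration space
              sample = goal if random.random() < 0.05 else (random.random(), random.random())
              nearest = min(parents, key=lambda n: math.hypot(n[0] - sample[0], n[1] - sample[1]))
              new = steer(nearest, sample)
              if collision_free(new):
                  parents[new] = nearest
                  if math.hypot(new[0] - goal[0], new[1] - goal[1]) < STEP:
                      path = [new]
                      while parents[path[-1]] is not None:
                          path.append(parents[path[-1]])
                      return list(reversed(path))
          return None

      print(rrt((0.05, 0.05), (0.95, 0.95)))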

  19. Software for simulation of a computed tomography imaging spectrometer using optical design software

    Science.gov (United States)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

    Our Imaging Spectrometer Simulation Software, known under the name Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.

  20. What makes computational open source software libraries successful?

    International Nuclear Information System (INIS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects. (paper)

  1. What makes computational open source software libraries successful?

    Science.gov (United States)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  2. From On-Premise Software to Cloud Services: The Impact of Cloud Computing on Enterprise Software Vendors' Business Models

    OpenAIRE

    Boillat, Thomas; Legner, Christine

    2013-01-01

    Cloud computing is an emerging paradigm that allows users to conveniently access computing resources as pay-per-use services. Whereas cloud offerings such as Amazon's Elastic Compute Cloud and Google Apps are rapidly gaining a large user base, enterprise software's migration towards the cloud is still in its infancy. For software vendors the move towards cloud solutions implies profound changes in their value-creation logic. Not only are they forced to deliver fully web-enabled solutions and t...

  3. Honeywell modular automation system computer software documentation

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-21I

  4. Software and Computing News

    CERN Multimedia

    Barberis, D

    The last several months have been very busy ones for the ATLAS software developers. They've been trying to cope with the competing demands of multiple software stress tests and testbeds. These include Data Challenge Two (DC2), the Combined Testbeam (CTB), preparations for the Physics Workshop to be held in Rome in June 2005, and other testbeds, primarily one for the High-Level Trigger. Data Challenge 2 (DC2) The primary goal of this was to validate the computing model and to provide a test of simulating a day's worth of ATLAS data (10 million events) and of fully processing it and making it available to the physicists within 10 days (i.e. a 10% scale test). DC2 consists of three parts - the generation, simulation, and mixing of a representative sample of physics events with background events; the reconstruction of the mixed samples with initial classification into the different physics signatures; and the distribution of the data to multiple remote sites (Tier-1 centers) for analysis by physicists. Figu...

  5. Computer Software for Life Cycle Cost.

    Science.gov (United States)

    1987-04-01

    ...obsolete), physical life (utility before physically wearing out), or application life (utility in a given function)." (7:5) The costs are usually

  6. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology; it is, rather, a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  7. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    Science.gov (United States)

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503

  8. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... operations as well as reported hybrid/intensified unit operations is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  9. Fun and software exploring pleasure, paradox and pain in computing

    CERN Document Server

    Goriunova, Olga

    2014-01-01

    Fun and Software offers the untold story of fun as constitutive of the culture and aesthetics of computing. Fun in computing is a mode of thinking, making and experiencing. It invokes and convolutes the question of rationalism and logical reason, addresses the sensibilities and experience of computation and attests to its creative drives. By exploring topics as diverse as the pleasure and pain of the programmer, geek wit, affects of play and coding as a bodily pursuit of the unique in recursive structures, Fun and Software helps construct a different point of entry to the understanding of soft

  10. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture to interface to other packages, including to GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as meta and event- data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon-propagation; particle identification; hit finding, track finding and fitting; electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  11. SOFTWARE FOR COMPUTER-AIDED DESIGN OF CROSS-WEDGE ROLLING

    OpenAIRE

    A. A. Abramov; S. V. Medvedev

    2013-01-01

    The creation of computer technology for 3D design and engineering analysis of metal forming processes based on cross-wedge rolling (CWR) methods is considered. The developed software for computer-aided design and simulation of cross-wedge rolling is described.

  12. Tier2 Submit Software

    Science.gov (United States)

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  13. 75 FR 27341 - Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical...

    Science.gov (United States)

    2010-05-14

    ..., ramp rates, and network topology), flexible dispatch, settlement calculations, transmission switching... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference To Discuss Increasing Market and Planning Efficiency Through Improved Software May 7, 2010. Take notice that Commission...

  14. 34 CFR 464.42 - What limit applies to purchasing computer hardware and software?

    Science.gov (United States)

    2010-07-01

    ... software? 464.42 Section 464.42 Education Regulations of the Offices of the Department of Education... computer hardware and software? Not more than ten percent of funds received under any grant under this part may be used to purchase computer hardware or software. (Authority: 20 U.S.C. 1208aa(f)) ...

  15. Clinical trials radiotherapy treatment plan review software: is this the first quantified assessment?

    International Nuclear Information System (INIS)

    Hatton, J.A.; Cornes, D.A.

    2011-01-01

    Full text: Clinical trials require robust quality assurance (QA) procedures to ensure commonality of all treatments, with independent reviews to assess compliance with trial protocols. All clinical trials tools, including QA software, require testing for validity and reliability, enabling inter- and intra-trial comparison. Unlike clinical radiotherapy treatment planning (RTP) systems, review software has no published guidelines. This study describes the design and development of a test suite to quantify the performance of review software in TROG clinical trials. Test areas are image handling and reconstruction; geometric accuracy; dosimetric accuracy; dose-volume histogram (DVH) calculation; and display of plan parameters. TROG has developed tests for commissioning plan review software, assessed with SWAN 2.3 and CMS Elekta FocalPro. While the image handling tests were based on published guidelines for RTP systems, the dosimetric tests used the TROG QA case review requirements. Treatment plans represented systems of all manufacturers (Pinnacle, Eclipse, XiO and Oncentra) used in Australasian centres. The test suite identified areas for SWAN software development, including the DVH algorithm, which was changed to reduce calculation time. Results for known volumes of varying shapes and sizes (Fig. 1) demonstrate differences between SWAN 2.1 and 2.3 when compared with Eclipse. Liaison with the SWAN programmers enabled re-instatement of the 2.1 algorithm. The test suite has quantified the RTP review software, prioritised areas for development with the programmers, and improved the user experience.
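
    For readers unfamiliar with the quantity being compared, a cumulative dose-volume histogram can be computed from a dose grid and a structure mask in a few lines. The sketch below uses synthetic data; it is not the SWAN, Eclipse or FocalPro implementation, and the grid size, dose statistics and structure are invented for illustration.

      import numpy as np

      # Cumulative DVH sketch on synthetic data: for each dose level, report
      # the fraction of the structure receiving at least that dose.
      rng = np.random.default_rng(0)
      dose = rng.normal(60.0, 5.0, size=(40, 40, 40))   # hypothetical dose grid (Gy)
      mask = np.zeros_like(dose, dtype=bool)
      mask[10:30, 10:30, 10:30] = True                  # hypothetical structure

      def cumulative_dvh(dose, mask, bin_width=0.5):
          d = dose[mask]
          bins = np.arange(0.0, d.max() + bin_width, bin_width)
          volume_fraction = np.array([(d >= b).mean() for b in bins])
          return bins, volume_fraction

      bins, vf = cumulative_dvh(dose, mask)
      # D95: highest dose level still covering at least 95% of the structure volume
      print("D95 ~", bins[vf >= 0.95][-1], "Gy")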

  16. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and has had a profound impact on the software engineering process itse...

  17. Verification of Software Components: Addressing Unbounded Parallelism

    Czech Academy of Sciences Publication Activity Database

    Adámek, Jiří

    2007-01-01

    Roč. 8, č. 2 (2007), s. 300-309 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * formal verification * unbounded parallelism Subject RIV: JC - Computer Hardware ; Software

  18. Software to support planning for future waste treatment, storage, transport, and disposal requirements

    International Nuclear Information System (INIS)

    Holter, G.M.; Shay, M.R.; Stiles, D.L.

    1990-04-01

    Planning for adequate and appropriate treatment, storage, transport and disposal of wastes to be generated or received in the future is a complex but critical task that can be significantly enhanced by the development and use of appropriate software. This paper describes a software system that has been developed at Pacific Northwest Laboratory to aid in such planning. The basic needs for such a system are outlined, and the approach adopted in developing the software is described. The individual components of the system, and their integration into a unified system, are discussed. Typical analytical applications of this type of software are summarized. Conclusions concerning the development of such software systems and the necessary supporting data are then presented. 2 figs

  19. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  20. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Soojin Park

    2015-04-01

    Full Text Available Cloud computing has emerged as more than just a piece of technology; it is, rather, a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo significant remodeling. This study analyzes actual cases of SaaS cloud computing environment adoption as a way to derive four new best practices for software development and incorporates the identified best practices into currently-in-use processes. Furthermore, this study presents a design for generic software development processes that implement the proposed best practices. The design for the generic process has been applied to reinforce the weak points found in SaaS cloud service development practices used by eight enterprises currently developing or operating actual SaaS cloud computing services. Lastly, this study evaluates the applicability of the proposed SaaS cloud oriented development process through analyzing the feedback data collected from actual application to the development of a SaaS cloud service, Astation.

  1. Software Engineering Principles for Courseware Development.

    Science.gov (United States)

    Magel, Kenneth

    1980-01-01

    Courseware (computer based curriculum materials) development should follow the lessons learned by software engineers. The most important of 28 principles of software development presented here include a stress on human readability, the importance of early planning and analysis, the need for independent evaluation, and the need to be flexible.…

  2. A Methodological Framework for Software Safety in Safety Critical Computer Systems

    OpenAIRE

    P. V. Srinivas Acharyulu; P. Seetharamaiah

    2012-01-01

    Software safety must deal with the principles of safety management, safety engineering and software engineering for developing safety-critical computer systems, with the target of making the system safe, risk-free and fail-safe, in addition to providing a clear differentiation for assessing and evaluating risk in line with the principles of software risk management. Problem statement: Prevailing software quality models and standards did not adequately address the software safety ...

  3. BISON Software V&V Plan

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Williamson; J. D. Hales; D. M. Perez; S. R. Novascone; G. Pastore

    2014-07-01

    The primary vision for the BISON development team is to deliver a nuclear fuel performance simulation tool that is used to provide a researcher or fuel designer with best estimate calculations of the highly coupled and nonlinear phenomena that govern nuclear fuel behavior. Accurately simulating nuclear fuel behavior is a challenging computational undertaking and verification and validation (V&V) play an important role in realizing this vision. The purpose of this V&V plan is to express the BISON team’s definition of the terms verification and validation, document what we have done regarding V&V, and outline what we plan to do.

  4. Software For Computer-Aided Design Of Control Systems

    Science.gov (United States)

    Wette, Matthew

    1994-01-01

    Computer Aided Engineering System (CAESY) software developed to provide means to evaluate methods for dealing with users' needs in computer-aided design of control systems. Interpreter program for performing engineering calculations. Incorporates features of both Ada and MATLAB. Designed to be flexible and powerful. Includes internally defined functions, procedures and provides for definition of functions and procedures by user. Written in C language.

  5. Overview of the ANS [American Nuclear Society] mathematics and computation software standards

    International Nuclear Information System (INIS)

    Smetana, A.O.

    1991-01-01

    The Mathematics and Computations Division of the American Nuclear Society sponsors the ANS-10 Standards Subcommittee. This subcommittee, which is part of the ANS Standards Committee, currently maintains four ANSI/ANS software standards. These standards are: Recommended Programming Practices to Facilitate the Portability of Scientific Computer Programs, ANS-10.2; Guidelines for the Documentation of Computer Software, ANS-10.3; Guidelines for the Verification and Validation of Scientific and Engineering Computer Programs for the Nuclear Industry, ANS-10.4; and Guidelines for Accommodating User Needs in Computer Program Development, ANS-10.5. 5 refs

  6. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    CUNNINGHAM, L.T.

    1999-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2

  7. The ''NAIRI-2'' computer plotter software

    International Nuclear Information System (INIS)

    Aksenova, E.K.; Kol'ga, V.V.; Trejbal, Z.

    1977-01-01

    The software for the plotter of the ''Nairi-2'' computer is described. The ''Plot'' system of subprograms, written in the machine language of the ''Nairi-2'', makes it possible to present graphically the information obtained with the ''Nairi-2'' and with the base computers (BESM-6, CDC-6500) through the information processing system. A graphic dependence can be represented on a pre-selected scale either as a continuous line, with programmed linear interpolation between the points and plotting of the x and y coordinate axes, or as separate points with the x and y coordinate axes constructed in any prescribed direction. The system of subprograms is operated in an autoprogramming language through a number of new operators introduced into the translator of the ''Nairi-2''.

  8. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been developed for predicting software reliability. These models are restricted to particular types of methodologies and to a limited number of parameters, although a variety of techniques and methodologies may be used for reliability prediction. More attention therefore needs to be paid to the choice of parameters when estimating reliability: the reliability of a system may increase or decrease depending on the parameters selected, so the factors that most heavily affect system reliability need to be identified. At present, reusability is widely applied across many areas of research. Reusability is the basis of Component-Based Systems (CBS), and cost, time and human effort can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness, and there are several possibilities for applying soft computing techniques to problems in medicine: clinical medicine uses fuzzy logic and neural-network methodology most significantly, while basic medical science most frequently and preferably uses neural-network and genetic-algorithm approaches, and medical scientists show strong interest in applying soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software when making new products, providing quality with a saving of time, memory space and money. This paper focuses on the assessment of commonly used soft computing techniques such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC). This paper presents the working of soft computing
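
    As a hedged illustration of how one bio-inspired technique could be applied to reliability data (not the method assessed in this record), the sketch below uses a small genetic algorithm to fit the two parameters of a Goel-Okumoto reliability growth model, m(t) = a(1 - e^(-b t)), to synthetic cumulative-failure counts.

      import math, random

      # Toy GA fitting the Goel-Okumoto model m(t) = a*(1 - exp(-b*t)) to
      # synthetic cumulative-failure data; illustrative only.
      random.seed(1)
      TIMES = list(range(1, 21))
      TRUE_A, TRUE_B = 120.0, 0.15
      DATA = [TRUE_A * (1 - math.exp(-TRUE_B * t)) + random.gauss(0, 2) for t in TIMES]

      def model(a, b, t):
          return a * (1 - math.exp(-b * t))

      def fitness(ind):
          a, b = ind
          return -sum((model(a, b, t) - y) ** 2 for t, y in zip(TIMES, DATA))

      def crossover(p1, p2):
          return (random.choice([p1[0], p2[0]]), random.choice([p1[1], p2[1]]))

      def mutate(ind):
          a, b = ind
          return (a + random.gauss(0, 5), max(1e-3, b + random.gauss(0, 0.02)))

      pop = [(random.uniform(50, 200), random.uniform(0.01, 0.5)) for _ in range(40)]
      for _ in range(200):
          pop.sort(key=fitness, reverse=True)
          parents = pop[:10]                                  # elitist selection
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(30)]
          pop = parents + children

      best = max(pop, key=fitness)
      print("estimated a, b:", round(best[0], 1), round(best[1], 3))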

  9. The Implementation of Computer Data Processing Software for EAST NBI

    International Nuclear Information System (INIS)

    Zhang Xiaodan; Hu Chundong; Sheng Peng; Zhao Yuanzhe; Wu Deyun; Cui Qinglong

    2014-01-01

    One of the most important project missions of the neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) at high power into the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for processing experimental data, covering data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing application software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is programmed in the C language and runs on the Linux operating system, based on the TCP network protocol and multi-threading technology. The hardware mainly includes an industrial control computer (IPC), a data server, PXI DAQ cards and so on. This software has now been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well. (fusion engineering)
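
    A minimal sketch of the kind of multi-threaded, TCP-based acquisition-and-compression service described above might look as follows. It is written in Python rather than C, and the port number and on-disk record format are invented for illustration; it is not the CDPS code.

      import socket, threading, zlib

      # Threaded TCP acquisition sketch: each client connection is read to
      # completion, compressed, and appended as a length-prefixed record.
      HOST, PORT = "127.0.0.1", 9500        # invented address
      LOCK = threading.Lock()

      def handle_client(conn, addr):
          with conn:
              raw = b""
              while True:
                  chunk = conn.recv(4096)
                  if not chunk:
                      break
                  raw += chunk
          compressed = zlib.compress(raw)               # data compression step
          with LOCK:                                    # serialize writes to storage
              with open("shot_data.zlib", "ab") as f:
                  f.write(len(compressed).to_bytes(4, "big") + compressed)

      def serve():
          with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
              srv.bind((HOST, PORT))
              srv.listen()
              while True:                               # runs until interrupted
                  conn, addr = srv.accept()
                  threading.Thread(target=handle_client, args=(conn, addr),
                                   daemon=True).start()

      if __name__ == "__main__":
          serve()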

  10. Software reliability and safety in nuclear reactor protection systems

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence, J.D. [Lawrence Livermore National Lab., CA (United States)

    1993-11-01

    Planning the development, use and regulation of computer systems in nuclear reactor protection systems in such a way as to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Computer Safety and Reliability Group, Lawrence Livermore National Laboratory, that investigates different aspects of computer software in reactor protection systems. There are two central themes in the report. First, software considerations cannot be fully understood in isolation from computer hardware and application considerations. Second, the process of engineering reliability and safety into a computer system requires activities to be carried out throughout the software life cycle. The report discusses the many activities that can be carried out during the software life cycle to improve the safety and reliability of the resulting product. The viewpoint is primarily that of the assessor, or auditor.

  11. Software reliability and safety in nuclear reactor protection systems

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1993-11-01

    Planning the development, use and regulation of computer systems in nuclear reactor protection systems in such a way as to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Computer Safety and Reliability Group, Lawrence Livermore National Laboratory, that investigates different aspects of computer software in reactor protection systems. There are two central themes in the report. First, software considerations cannot be fully understood in isolation from computer hardware and application considerations. Second, the process of engineering reliability and safety into a computer system requires activities to be carried out throughout the software life cycle. The report discusses the many activities that can be carried out during the software life cycle to improve the safety and reliability of the resulting product. The viewpoint is primarily that of the assessor, or auditor.

  12. Behavior Protocols for Software Components

    Czech Academy of Sciences Publication Activity Database

    Plášil, František; Višňovský, Stanislav

    2002-01-01

    Roč. 28, č. 11 (2002), s. 1056-1076 ISSN 0098-5589 R&D Projects: GA AV ČR IAA2030902; GA ČR GA201/99/0244 Grant - others:Eureka(XE) Pepita project no.2033 Institutional research plan: AV0Z1030915 Keywords : behavior protocols * component-based programming * software architecture Subject RIV: JC - Computer Hardware ; Software Impact factor: 1.170, year: 2002

  13. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
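
    To make the Bayesian step concrete, the following sketch computes a maximum a posteriori (MAP) estimate of clearance for a one-compartment intravenous bolus model from a single measured concentration, then derives a dose for a target level. The model, priors and numbers are invented for illustration and do not correspond to any of the programs reviewed.

      import math

      # MAP dose individualisation sketch for C(t) = (D/V) * exp(-(CL/V) * t),
      # with log-normal prior on clearance and log-normal residual error.
      DOSE, V = 500.0, 50.0                 # mg, L (V fixed at a population value)
      CL_POP, OMEGA = 5.0, 0.3              # population clearance (L/h), prior SD (log scale)
      SIGMA = 0.2                           # residual error SD (log scale)
      T_OBS, C_OBS = 12.0, 3.2              # sampling time (h) and measured conc (mg/L)

      def neg_log_posterior(cl):
          pred = (DOSE / V) * math.exp(-(cl / V) * T_OBS)
          loglik = -0.5 * ((math.log(C_OBS) - math.log(pred)) / SIGMA) ** 2
          logprior = -0.5 * ((math.log(cl) - math.log(CL_POP)) / OMEGA) ** 2
          return -(loglik + logprior)

      # Simple grid search for the MAP clearance, then a dose suggestion that
      # would reach a target concentration at the same sampling time.
      grid = [0.5 + 0.01 * i for i in range(2000)]
      cl_map = min(grid, key=neg_log_posterior)
      target_c = 5.0                        # hypothetical target level (mg/L)
      suggested_dose = target_c * V / math.exp(-(cl_map / V) * T_OBS)
      print("MAP clearance:", round(cl_map, 2), "L/h;",
            "suggested dose:", round(suggested_dose, 1), "mg")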

  14. Development of a computational program to planning and control of the IEA-R1 reactor maintenance

    International Nuclear Information System (INIS)

    Martins, Mauro Onofre; Madi Filho, Tufic

    2013-01-01

    Maintenance is an essential activity in nuclear reactors. The components of the safety systems of an industrial plant should have a low probability of failure, especially if there is a high risk of accidents that may cause environmental damage. In nuclear facilities, the presence of safety systems is a technical specification and a requirement for their licensing and operation. In order to manage the entire information flow from the maintenance of the IEA-R1, a computational program (software) was developed, which not only plans and controls all the maintenance, but also updates the documents and records to safeguard quality, ensuring the safe operation of the reactor. The software has access levels and provides detailed reports of all maintenance planned and implemented, together with an individual history of each piece of equipment during its lifetime in the facility. This work presents all the stages of the software development, its description, compatibility and application, its advantages, and the results obtained experimentally. (author)

  15. Application of software technology to a future spacecraft computer design

    Science.gov (United States)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  16. Three-Dimensional Path Planning Software-Assisted Transjugular Intrahepatic Portosystemic Shunt: A Technical Modification

    Energy Technology Data Exchange (ETDEWEB)

    Tsauo, Jiaywei, E-mail: 80732059@qq.com; Luo, Xuefeng, E-mail: luobo-913@126.com [West China Hospital of Sichuan University, Institute of Interventional Radiology (China); Ye, Linchao, E-mail: linchao.ye@siemens.com [Siemens Ltd, Healthcare Sector (China); Li, Xiao, E-mail: simonlixiao@gmail.com [West China Hospital of Sichuan University, Institute of Interventional Radiology (China)

    2015-06-15

    Purpose: This study was designed to report our results with a modified technique of three-dimensional (3D) path planning software assisted transjugular intrahepatic portosystemic shunt (TIPS). Methods: 3D path planning software was recently developed to facilitate TIPS creation by using two carbon dioxide portograms acquired at least 20° apart to generate a 3D path for overlay needle guidance. However, one shortcoming is that puncturing along the overlay would be technically impossible if the angle of the liver access set and the angle of the 3D path are not the same. To solve this problem, a prototype 3D path planning software was fitted with a utility to calculate the angle of the 3D path. Using this, we modified the angle of the liver access set accordingly during the procedure in ten patients. Results: Failure for technical reasons occurred in three patients (unsuccessful wedged hepatic venography in two cases, software technical failure in one case). The procedure was successful in the remaining seven patients, and only one needle pass was required to obtain portal vein access in each case. The course of puncture was comparable to the 3D path in all patients. No procedure-related complication occurred following the procedures. Conclusions: Adjusting the angle of the liver access set to match the angle of the 3D path determined by the software appears to be a favorable modification to the technique of 3D path planning software assisted TIPS.
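
    The geometric core of this modification, computing the angle of the planned 3-D path so that the liver access set can be matched to it, reduces to elementary vector arithmetic. The sketch below uses two hypothetical points and a reference axis; it is not the prototype software's code.

      import math

      # Angle of a planned 3-D puncture path relative to a reference axis,
      # from two hypothetical points on the path (coordinates in mm).
      entry  = (10.0, 25.0, 0.0)     # hepatic-vein entry point (invented)
      target = (40.0, 60.0, 45.0)    # portal-vein target point (invented)
      axis   = (0.0, 0.0, 1.0)       # unit-length reference axis

      path = tuple(t - e for t, e in zip(target, entry))
      norm = math.sqrt(sum(c * c for c in path))
      cosang = sum(p * a for p, a in zip(path, axis)) / norm
      print("path angle to axis: %.1f degrees" % math.degrees(math.acos(cosang)))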

  17. Software For Computer-Security Audits

    Science.gov (United States)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.

  18. Imprinting Community College Computer Science Education with Software Engineering Principles

    Science.gov (United States)

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  19. SU-E-J-80: A Comparative Analysis of MIM and Pinnacle Software for Adaptive Planning

    Energy Technology Data Exchange (ETDEWEB)

    Stanford, J; Duggar, W; Morris, B; Yang, C [University of Mississippi Med. Center, Jackson, MS (United States)

    2015-06-15

    Purpose: IMRT treatment is often administered with image guidance and small PTV margins. Changes in body habitus such as weight loss, and tumor response during the course of a treatment, could be significant, thus warranting re-simulation and re-planning. Adaptive planning is challenging and places a significant burden on the staff; as such, some commercial vendors are now offering adaptive planning software to streamline the process of re-planning and dose accumulation between different CT data sets. The purpose of this abstract is to compare the adaptive planning tools of Pinnacle version 9.8 and MIM 6.4. Methods: Head and neck cases of previously treated patients who experienced anatomical changes during the course of their treatment were chosen for evaluation. The new CT data set from the re-simulation was imported into the Pinnacle and MIM software. The dynamic planning tool in Pinnacle was used to recalculate the old plan, with fixed MU settings, on the new CT data. In MIM, the old CT was registered to the new data set, followed by a dose transformation to the new CT. The dose distributions to the PTV and critical structures from each software package were analyzed and compared. Results: A 9% difference was observed between the global maximum doses reported by the two software packages. Mean doses to organs at risk and PTVs were within 6%; however, Pinnacle showed a greater change in PTV coverage. Conclusion: MIM adaptive planning corrects for geometric changes without considering the effect of radiological path length on the dose distribution, whereas Pinnacle corrects for both the geometric and the radiological effects. Pinnacle therefore gives a better estimate of the dosimetric impact of anatomical changes.

  20. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance to safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is at an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for the Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirements capture phase until the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.
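
    As a small illustration of one of the test-design techniques named above, boundary value analysis, the sketch below generates boundary and just-outside-boundary inputs for a hypothetical scalar range check; it is not the actual PFBR test tooling, and the range and expected outcomes are invented.

      # Boundary-value test-case generation sketch for a hypothetical scalar
      # input with a valid integer range [lo, hi].
      def boundary_values(lo, hi, step=1):
          # classic boundary set plus just-outside-range values
          return sorted({lo - step, lo, lo + step, (lo + hi) // 2, hi - step, hi, hi + step})

      # Example: a hypothetical sensor reading validated against the range 0..1000.
      for value in boundary_values(0, 1000):
          expected = "accept" if 0 <= value <= 1000 else "reject"
          print(f"input={value:5d} -> expected {expected}")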

  1. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance to safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium cooled reactor which is at an advanced stage of construction in Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for the Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirements capture phase until the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development using the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis and cause and effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.

  2. Software designs of image processing tasks with incremental refinement of computation.

    Science.gov (United States)

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycle budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities, since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
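
    The bitplane idea at the heart of such designs can be shown in a few lines: process the most significant bitplanes of the input first, and refine the result as lower planes arrive. The sketch below applies this to a simple block sum over a synthetic image; it is only a schematic stand-in under those assumptions, not the authors' released designs.

      import numpy as np

      # Bitplane-based incremental computation sketch: an 8-bit image is
      # processed most-significant plane first, and a block-sum "result" is
      # refined as each plane arrives.
      rng = np.random.default_rng(0)
      image = rng.integers(0, 256, size=(64, 64), dtype=np.uint16)   # synthetic 8-bit image

      exact = int(image.sum())
      partial = np.zeros_like(image)
      for bit in range(7, -1, -1):                  # MSB -> LSB
          plane = (image >> bit) & 1                # extract one bitplane
          partial += plane << bit                   # incremental refinement
          err = abs(int(partial.sum()) - exact) / exact
          print(f"after bitplane {bit}: relative error of block sum = {err:.4f}")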

  3. Strategic Planning for Computer-Based Educational Technology.

    Science.gov (United States)

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  4. SoftLab: A Soft-Computing Software for Experimental Research with Commercialization Aspects

    Science.gov (United States)

    Akbarzadeh-T, M.-R.; Shaikh, T. S.; Ren, J.; Hubbell, Rob; Kumbla, K. K.; Jamshidi, M

    1998-01-01

    SoftLab is a software environment for research and development in intelligent modeling/control using soft-computing paradigms such as fuzzy logic, neural networks, genetic algorithms, and genetic programs. SoftLab addresses the inadequacies of the existing soft-computing software by supporting comprehensive multidisciplinary functionalities from management tools to engineering systems. Furthermore, the built-in features help the user process/analyze information more efficiently by a friendly yet powerful interface, and will allow the user to specify user-specific processing modules, hence adding to the standard configuration of the software environment.

  5. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and the quantity of direct-access storage may play a major role in the effort required to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  6. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  7. HEP Community White Paper on Software Trigger and Event Reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Albrecht, Johannes; et al.

    2018-02-23

    Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.

  8. Analytical exploration of the thermodynamic potentials by using symbolic computation software

    International Nuclear Information System (INIS)

    Hantsaridou, Anastasia P; Polatoglou, Hariton M

    2005-01-01

    Thermodynamics is a very general theory, based on fundamental symmetries. It generalizes classical mechanics and incorporates theoretical concepts such as field and field equations. Although all these ingredients are of the highest importance for a scientist, they are not given the attention they perhaps deserve in most undergraduate courses. Nowadays, powerful computers in conjunction with equally powerful software can ease the exploration of the crucial ideas of thermodynamics. The purpose of the present work is to show how the utilization of symbolic computation software can lead to a complementary understanding of thermodynamics. The method was applied to first and second year physics students in the Aristotle University of Thessaloniki (Greece) during the 2002-2003 academic year. The results indicate that symbolic computation software is appropriate not only for enhancing the teaching of the fundamental principles in thermodynamics and their applications, but also for increasing students' motivation for learning
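
    A brief example of the kind of exploration described here, done with the open-source SymPy library as a stand-in for whatever symbolic package a course might adopt, derives pressure and entropy from an ideal-gas Helmholtz potential and checks a Maxwell relation.

      import sympy as sp

      # Symbolic exploration of a thermodynamic potential (generic illustration,
      # not the authors' worksheets).
      T, V, n, R = sp.symbols('T V n R', positive=True)
      f = sp.Function('f')                           # arbitrary T-dependent part

      # Helmholtz free energy of an ideal gas, up to a function of T only
      F = -n * R * T * sp.log(V) + f(T)

      S = -sp.diff(F, T)                             # entropy  S = -(dF/dT)_V
      P = -sp.diff(F, V)                             # pressure P = -(dF/dV)_T

      print("P =", sp.simplify(P))                   # recovers P = n*R*T/V
      # Maxwell relation (dS/dV)_T = (dP/dT)_V, a consequence of mixed partials of F
      print("Maxwell relation holds:", sp.simplify(sp.diff(S, V) - sp.diff(P, T)) == 0)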

  9. CAMEO (Computer-Aided Management of Emergency Operations) Software Suite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — CAMEO is the umbrella name for a system of software applications used widely to plan for and respond to chemical emergencies. All of the programs in the suite work...

  10. Transactions in Software Components: Container-Interposed Transactions

    Czech Academy of Sciences Publication Activity Database

    Procházka, M.; Plášil, František

    2002-01-01

    Roč. 3, č. 2 (2002), s. - ISSN 1525-9293 R&D Projects: GA ČR GA201/99/0244; GA AV ČR IAA2030902 Institutional research plan: AV0Z1030915 Keywords : transactions * component-based software architectures * transaction propagation policy * transaction attributes * container-interposed transactions Subject RIV: JC - Computer Hardware ; Software

  11. APPLICATION OF AHP METHOD FOR THE SELECTION OF BUSINESS PLAN SOFTWARE

    Directory of Open Access Journals (Sweden)

    Marko Hell

    2013-02-01

    Full Text Available This study seeks to determine the importance of the tools that are offered within the software (simplicity, help, etc.) and to sort them according to user preferences. The results are intended for novice users in business planning. The study was carried out among students who are familiar with the possibilities of five programs for business planning. The students conducted the evaluation for each of the presented programs, which were selected by experts from the problem area. The practical importance of the results of this research lies in a recommendation to faculties of economics on the procurement of software for business planning for educational purposes. It is possible to apply the already developed multi-criteria (AHP) model to a target group with extensive entrepreneurial experience, with a change to the set of criteria weights.
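
    The AHP calculation behind such a selection exercise can be sketched briefly: a pairwise comparison matrix is reduced to priority weights via its principal eigenvector, and a consistency ratio is checked. The criteria and judgements below are invented, not those of the study.

      import numpy as np

      # AHP priority weights from a (reciprocal) pairwise comparison matrix.
      criteria = ["simplicity", "help", "price", "reporting"]
      A = np.array([[1,   3,   5,   3],
                    [1/3, 1,   2,   1],
                    [1/5, 1/2, 1,   1/2],
                    [1/3, 1,   2,   1]], dtype=float)

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()                                   # normalized priority weights

      n = A.shape[0]
      ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
      cr = ci / 0.90                                 # Saaty random index RI = 0.90 for n = 4
      for name, weight in zip(criteria, w):
          print(f"{name:10s} {weight:.3f}")
      print(f"consistency ratio = {cr:.3f} (acceptable if < 0.10)")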

  12. Reviews of computing technology: Software overview

    Energy Technology Data Exchange (ETDEWEB)

    Hartshorn, W.R.; Johnson, A.L.

    1994-01-05

    The Savannah River Site Computing Architecture states that the site computing environment will be standards-based, data-driven, and workstation-oriented. Larger server systems deliver needed information to users in a client-server relationship. Goals of the Architecture include utilizing computing resources effectively, maintaining a high level of data integrity, developing a robust infrastructure, and storing data in such a way as to promote accessibility and usability. This document describes the current storage environment at Savannah River Site (SRS) and presents some of the problems that will be faced and strategies that are planned over the next few years.

  13. Design and evaluation of a software prototype for participatory planning of environmental adaptations.

    Science.gov (United States)

    Eriksson, J; Ek, A; Johansson, G

    2000-03-01

    A software prototype to support the planning process for adapting home and work environments for people with physical disabilities was designed and later evaluated. The prototype exploits low-cost three-dimensional (3-D) graphics products in the home computer market. The essential features of the prototype are: interactive rendering with optional hardware acceleration, interactive walk-throughs, direct manipulation tools for moving objects and measuring distances, and import of 3-D-objects from a library. A usability study was conducted, consisting of two test sessions (three weeks apart) and a final interview. The prototype was then tested and evaluated by representatives of future users: five occupational therapist students, and four persons with physical disability, with no previous experience of the prototype. Emphasis in the usability study was placed on the prototype's efficiency and learnability. We found that it is possible to realise a planning tool for environmental adaptations, both regarding usability and technical efficiency. The usability evaluation confirms our findings from previous case studies, regarding the relevance and positive attitude towards this kind of planning tool. Although the prototype was found to be satisfactorily efficient for the basic tasks, the paper presents several suggestions for improvement of future prototype versions.

  14. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning (ERP) software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view of what is happening…

  15. A NEW CONTROL CIRCUIT AND COMPUTER SOFTWARE FOR CONTROLING PHOTOVOLTAIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Mustafa Berkant SELEK

    2008-02-01

    Full Text Available In this study, a new microcontroller circuit was designed and new computer software was implemented to control the power flow currents of the renewable energy system established at the Solar Energy Institute, Ege University, Bornova, Izmir, Turkey. A PIC18F452 microcontroller-based electronic circuit was designed to control another electronic circuit that includes power electronic switching components. Readily available standard control circuits are designed for switching single-level inverters; in contrast, the implemented circuit allows multilevel inverters to be switched. In addition, because the efficiency of solar panels is considerably low, the panels should be operated at the maximum power point (MPP), so an MPP algorithm is included in the designed control circuit. The control circuit also includes a serial communication interface based on the RS232 standard, which enables the user to select all functions available in the control circuit and obtain status reports via the computer software. Finally, a general-purpose command set was designed to establish communication between the computer software and the microcontroller-based control circuit. The study is intended to provide a basis for researchers who want to develop their own control circuits or more visual software.
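    The abstract notes that an MPP algorithm is included in the control circuit but does not say which one. The sketch below assumes a simple hill-climbing (perturb-and-observe) scheme, with hypothetical read_voltage, read_current and set_voltage callbacks standing in for the hardware interface; it illustrates the idea only, not the authors' implementation.

      # Sketch of a hill-climbing ("perturb and observe") MPPT loop; the callbacks
      # are hypothetical hardware hooks, not part of the published design.
      def perturb_and_observe(read_voltage, read_current, set_voltage,
                              v_start=17.0, step=0.1, iterations=100):
          v = v_start
          set_voltage(v)
          prev_power = read_voltage() * read_current()
          direction = 1.0
          for _ in range(iterations):
              v += direction * step              # perturb the operating voltage
              set_voltage(v)
              power = read_voltage() * read_current()
              if power < prev_power:             # power dropped: reverse direction
                  direction = -direction
              prev_power = power
          return v

      # Toy simulated panel with a linear I-V curve: P(v) = 6v - 0.2v^2 peaks at 15 V
      class FakePanel:
          def __init__(self):
              self.v = 0.0
          def set_voltage(self, v):
              self.v = v
          def voltage(self):
              return self.v
          def current(self):
              return max(0.0, 6.0 - 0.2 * self.v)

      panel = FakePanel()
      print(perturb_and_observe(panel.voltage, panel.current, panel.set_voltage))
      # settles close to 15 V for this toy panel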

  16. Certainty in Stockpile Computing: Recommending a Verification and Validation Program for Scientific Software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, J.R.

    1998-11-01

    As computing assumes a more central role in managing the nuclear stockpile, the consequences of an erroneous computer simulation could be severe. Computational failures are common in other endeavors and have caused project failures, significant economic loss, and loss of life. This report examines the causes of software failure and proposes steps to mitigate them. A formal verification and validation program for scientific software is recommended and described.

  17. Software configuration management plan for the Hanford site technical database

    International Nuclear Information System (INIS)

    GRAVES, N.J.

    1999-01-01

    The Hanford Site Technical Database (HSTD) is used as the repository/source for the technical requirements baseline and programmatic data input via the Hanford Site and major Hanford Project Systems Engineering (SE) activities. The Hanford Site SE effort has created an integrated technical baseline for the Hanford Site that supports SE processes at the Site and project levels, which is captured in the HSTD. The HSTD has been implemented in the Ascent Logic Corporation (ALC) Commercial Off-The-Shelf (COTS) package referred to as the Requirements Driven Design (RDD) software. This Software Configuration Management Plan (SCMP) provides a process and means to control and manage software upgrades to the HSTD system.

  18. Windows Calorimeter Control (WinCal) program computer software test plan

    International Nuclear Information System (INIS)

    Pertzborn, N.F.

    1997-01-01

    This document provides the information and guidelines necessary to conduct all the required testing of the Windows Calorimeter Control (WinCal) system. The strategy and essential components for testing the WinCal System Project are described in this test plan. The purpose of this test plan is to provide the customer and performing organizations with specific procedures for testing the specified system's functions

  19. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key to coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision… These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development…

  20. The NUKDOS software for treatment planning in molecular radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Kletting, Peter; Schimmel, Sebastian [Univ. Ulm (Germany). Klinik fuer Nuklearmedizin; Haenscheid, Heribert; Fernandez, Maria; Lassmann, Michael [Univ. Wuerzburg (Germany). Klinik fuer Nuklearmedizin; Luster, Markus [Univ. Marburg (Germany). Klinik fuer Nuklearmedizin; Nosske, Dietmar [Bundesamt fuer Strahlenschutz, Fachbereich Strahlenschutz und Gesundheit, Oberschleissheim (Germany); Glatting, Gerhard [Heidelberg Univ., Medical Radiation Physics/Radiation Protection, Mannheim (Germany)

    2015-07-01

    The aim of this work was the development of a software tool for treatment planning prior to molecular radiotherapy, which comprises all functionality needed to objectively determine the activity to administer and the pertaining absorbed doses (including the corresponding error) based on a series of gamma camera images and one SPECT/CT or probe data. NUKDOS was developed in MATLAB. The workflow is based on the MIRD formalism. For determination of the tissue or organ pharmacokinetics, gamma camera images as well as probe, urine, serum and blood activity data can be processed. To estimate the time-integrated activity coefficients (TIAC), sums of exponentials are fitted to the time-activity data and integrated analytically. To obtain the TIAC on the voxel level, the voxel activity distribution from the quantitative 3D SPECT/CT (or PET/CT) is used for scaling and weighting the TIAC derived from the 2D organ data. The voxel S-values are automatically calculated based on the voxel size of the image and the therapeutic nuclide (90Y, 131I or 177Lu). The absorbed dose coefficients are computed by convolution of the voxel TIAC and the voxel S-values. The activity to administer and the pertaining absorbed doses are determined by entering the absorbed dose for the organ at risk. The overall error of the calculated absorbed doses is determined by Gaussian error propagation. NUKDOS was tested for the operating systems Windows® 7 (64 bit) and 8 (64 bit). The results of each working step were compared to commercially available (SAAMII, OLINDA/EXM) and in-house (UlmDOS) software. The application of the software is demonstrated using examples from peptide receptor radionuclide therapy (PRRT) and from radioiodine therapy of benign thyroid diseases. For the example from PRRT, the calculated activity to administer differed by 4% comparing NUKDOS and the final result using UlmDOS, SAAMII and OLINDA/EXM sequentially. The absorbed dose for the spleen and tumour
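    To make the fitting-and-integration step concrete (sums of exponentials fitted to time-activity data and integrated analytically to yield TIACs), here is a minimal mono-exponential sketch with invented data; it is a simplification of the NUKDOS workflow, not its actual code.

      # Minimal sketch (not NUKDOS code): fit a single exponential to organ
      # time-activity data and integrate it analytically to get the
      # time-integrated activity coefficient (TIAC). All numbers are made up.
      import numpy as np
      from scipy.optimize import curve_fit

      def mono_exp(t, a0, lam):
          """Fraction of the administered activity in the organ at time t (hours)."""
          return a0 * np.exp(-lam * t)

      t_h = np.array([1.0, 4.0, 24.0, 48.0, 72.0])      # imaging time points [h]
      frac = np.array([0.20, 0.18, 0.10, 0.055, 0.030]) # fraction of injected activity

      (a0, lam), _ = curve_fit(mono_exp, t_h, frac, p0=(0.2, 0.03))

      # Analytic integral of a0*exp(-lam*t) from 0 to infinity
      tiac = a0 / lam
      print(f"TIAC = {tiac:.1f} h")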

  1. Do You Need ERP? In the Business World, Enterprise Resource Planning Software Keeps Costs down and Productivity up. Should Districts Follow Suit?

    Science.gov (United States)

    Careless, James

    2007-01-01

    Enterprise resource planning software does what school leaders have always wanted their computer systems to do: It sees all. By integrating every IT application an organization has--from purchasing and inventory control to payroll--ERPs create a single unified system. Not only does this give IT managers a holistic view of what is happening in the…

  2. Service-oriented Software Defined Optical Networks for Cloud Computing

    Science.gov (United States)

    Liu, Yuze; Li, Hui; Ji, Yuefeng

    2017-10-01

    With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software-defined optical network architecture, including a resource layer, a service abstraction layer, a control layer and an application layer. We then describe the corresponding service provisioning method, in which a distinct service ID identifies the service a device can offer. Finally, we experimentally demonstrate that the proposed method can transmit different services based on the service ID in the service-oriented software-defined optical network.

  3. Software for Distributed Computation on Medical Databases: A Demonstration Project

    Directory of Open Access Journals (Sweden)

    Balasubramanian Narasimhan

    2017-05-01

    Full Text Available Bringing together the information latent in distributed medical databases promises to personalize medical care by enabling reliable, stable modeling of outcomes with rich feature sets (including patient characteristics and treatments received. However, there are barriers to aggregation of medical data, due to lack of standardization of ontologies, privacy concerns, proprietary attitudes toward data, and a reluctance to give up control over end use. Aggregation of data is not always necessary for model fitting. In models based on maximizing a likelihood, the computations can be distributed, with aggregation limited to the intermediate results of calculations on local data, rather than raw data. Distributed fitting is also possible for singular value decomposition. There has been work on the technical aspects of shared computation for particular applications, but little has been published on the software needed to support the "social networking" aspect of shared computing, to reduce the barriers to collaboration. We describe a set of software tools that allow the rapid assembly of a collaborative computational project, based on the flexible and extensible R statistical software and other open source packages, that can work across a heterogeneous collection of database environments, with full transparency to allow local officials concerned with privacy protections to validate the safety of the method. We describe the principles, architecture, and successful test results for the site-stratified Cox model and rank-k singular value decomposition.
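    The following sketch illustrates the distributed-likelihood idea described above: each site computes only a summary of its local data (here a log-likelihood gradient) and those summaries are aggregated. Logistic regression is used for brevity; the article's demonstrations are a site-stratified Cox model and a singular value decomposition, so this is an illustration of the principle rather than of the published tools.

      import numpy as np

      def site_gradient(X, y, beta):
          """Local log-likelihood gradient: the only quantity the site shares."""
          p = 1.0 / (1.0 + np.exp(-X @ beta))
          return X.T @ (y - p)

      def distributed_fit(sites, n_features, lr=1.0, steps=300):
          beta = np.zeros(n_features)
          n_total = sum(len(y) for _, y in sites)
          for _ in range(steps):
              # Aggregation is limited to summed gradients, never raw records
              grad = sum(site_gradient(X, y, beta) for X, y in sites)
              beta += lr * grad / n_total
          return beta

      # Two hypothetical sites holding their own simulated patient data
      rng = np.random.default_rng(0)
      true_beta = np.array([1.0, -0.5, 0.2])
      sites = []
      for _ in range(2):
          X = rng.normal(size=(200, 3))
          y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
          sites.append((X, y))

      print(distributed_fit(sites, n_features=3))  # should be close to true_beta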

  4. Advances in Multimedia, Software Engineering and Computing Vol.1 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference focused on Multimedia, Software Engineering, Computing and Education. In the proceedings, you can learn much more about the work of researchers all around the world in Multimedia, Software Engineering, Computing and Education. The main role of the proceedings is to serve as an exchange pillar for researchers working in the mentioned fields. In order to meet the high standard of the Springer AISC series, the organization committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  5. Advances in Multimedia, Software Engineering and Computing Vol.2 : Proceedings of the 2011 MSEC International Conference on Multimedia, Software Engineering and Computing

    CERN Document Server

    Lin, Sally

    2012-01-01

    MSEC2011 is an integrated conference focused on Multimedia, Software Engineering, Computing and Education. In the proceedings, you can learn much more about the work of researchers all around the world in Multimedia, Software Engineering, Computing and Education. The main role of the proceedings is to serve as an exchange pillar for researchers working in the mentioned fields. In order to meet the high standard of the Springer AISC series, the organization committee made the following efforts. Firstly, poor-quality papers were rejected after review by anonymous referee experts. Secondly, review meetings were held among the reviewers about five times to exchange reviewing suggestions. Finally, the conference organization held several preliminary sessions before the conference. Through the efforts of different people and departments, the conference will be successful and fruitful.

  6. Software Defined Radio Datalink Implementation Using PC-Type Computers

    National Research Council Canada - National Science Library

    Zafeiropoulos, Georgios

    2003-01-01

    The objective of this thesis was to examine the feasibility of implementation and the performance of a Software Defined Radio datalink, using a common PC type host computer and a high level programming language...

  7. Do's and Don'ts of Computer Models for Planning

    Science.gov (United States)

    Hammond, John S., III

    1974-01-01

    Concentrates on the managerial issues involved in computer planning models. Describes what computer planning models are and the process by which managers can increase the likelihood of computer planning models being successful in their organizations. (Author/DN)

  8. The benefit of introducing audit software into curricula for computer ...

    African Journals Online (AJOL)

    The benefit of introducing audit software into curricula for computer auditing students: a student perspective from the University of Pretoria. ... willing to sacrifice more of their time for practical computer classes because they are aware of the beneficial impact on their understanding of the subject as well as their future careers.

  9. The experimental modification of a computer software package for ...

    African Journals Online (AJOL)

    The experimental modification of a computer software package for graphing algebraic functions. ... No Abstract Available. South African Journal of Education Vol.25(2) 2005: 61-68.

  10. Automating software design system DESTA

    Science.gov (United States)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  11. V-1 nuclear power plant standby RPP-16S computer software

    International Nuclear Information System (INIS)

    Suchy, R.

    1988-01-01

    The software structure and the functions of the program modules of the RPP-16S standby computer, which is part of the information system of the V-1 Bohunice nuclear power plant, are described. The multitasking AMOS operating system is used to organize the programs in the computer. The program modules are classified into five groups by function, i.e., modules for the periodic collection of values and for the measurement of process quantities for both nuclear power plant units; for the primary processing of the values; for monitoring exceedance of preset limits; and for unit operators' communication with the computer. The fifth group consists of user program modules. The standby computer software was tested under the actual operating conditions of the V-1 plant. The results showed it operated correctly; minor shortcomings were removed. (Z.M.). 1 fig

  12. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2004-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series.The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  13. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2005-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series.The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  14. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2000-01-01

    Computer based systems are of increasing importance to safety in nuclear power plants as their use in both new and older plants is rapidly increasing. They are used both in safety related applications, such as some functions of the process control and monitoring systems, as well as in safety critical applications, such as reactor protection or actuation of safety features. The dependability of computer based systems important to safety is therefore of prime interest and should be ensured. With current technology, it is possible in principle to develop computer based instrumentation and control systems for systems important to safety that have the potential for improving the level of safety and reliability with sufficient dependability. However, their dependability can be predicted and demonstrated only if a systematic, fully documented and reviewable engineering process is followed. Although a number of national and international standards dealing with quality assurance for computer based systems important to safety have been or are being prepared, internationally agreed criteria for demonstrating the safety of such systems are not generally available. It is recognized that there may be other ways of providing the necessary safety demonstration than those recommended here. The basic requirements for the design of safety systems for nuclear power plants are provided in the Requirements for Design issued in the IAEA Safety Standards Series.The IAEA has issued a Technical Report to assist Member States in ensuring that computer based systems important to safety in nuclear power plants are safe and properly licensed. The report provides information on current software engineering practices and, together with relevant standards, forms a technical basis for this Safety Guide. The objective of this Safety Guide is to provide guidance on the collection of evidence and preparation of documentation to be used in the safety demonstration for the software for computer based

  15. Repository-Based Software Engineering Program: Working Program Management Plan

    Science.gov (United States)

    1993-01-01

    Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.

  16. Report of Investigation Committee on Programs for Research and Development of Strategic Software for Advanced Computing; Kodo computing yo senryakuteki software no kenkyu kaihatsu program kento iinkai hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-12-26

    The committee met on December 26, 2000, with 32 people in attendance. Discussion covered the results of surveys conducted for the development of strategic software for advanced computing and candidate projects for strategic software development. Eight subjects were taken up at the meeting: the interim report on the survey results, a semiconductor TCAD (technology computer-aided design) system, a nanodevice surface analysis system, a network distribution parallel processing platform (tentative name), a fatigue simulation system, a chemical reaction simulator, a protein structure analysis system, and a next-generation fluid analysis system. In this report, the author arranges the discussion results into four categories: (1) a strategic software development system, (2) popularization method and maintenance system, (3) handling of the results, and (4) evaluation of the program for research and development. In relation to category (1), it is stated that software matures over time, that the software is a commercial program, and that in the development of a commercial software program the process of basic study up to the preparation of a prototype should be completely separated from the process of its completion. (NEDO)

  17. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    Science.gov (United States)

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Introduction of computer-aided software and simulators is implemented during the sophomore year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  18. A software to report and file by personal computer

    International Nuclear Information System (INIS)

    Di Giandomenico, E.; Filippone, A.; Esposito, A.; Bonomo, L.

    1989-01-01

    During the past four years the authors have been gaining experience in reporting radiological examinations by personal computer. Here they describe the project of a new software package which allows the reporting and filing of roentgenograms. The program was realized by a radiologist using a well-known database management system, dBASE III. The program was shaped to fit the radiologist's needs: it helps to report, and allows the filing of, radiological data with the diagnostic codes used by the American College of Radiology. In this paper the authors describe the database structure and indicate the software functions which make its use possible. Thus, this paper is not aimed at advertising a new reporting program, but at demonstrating how the radiologist can himself manage some aspects of his work with the help of a personal computer.

  19. Poster — Thur Eve — 69: Computational Study of DVH-guided Cancer Treatment Planning Optimization Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ghomi, Pooyan Shirvani; Zinchenko, Yuriy [University of Calgary, Department of Mathematics and Statistics (Canada)

    2014-08-15

    Purpose: To compare methods to incorporate the Dose Volume Histogram (DVH) curves into the treatment planning optimization. Method: The performance of three methods, namely, the conventional Mixed Integer Programming (MIP) model, a convex moment-based constrained optimization approach, and an unconstrained convex moment-based penalty approach, is compared using anonymized data of a prostate cancer patient. Three plans were generated using the corresponding optimization models. Four Organs at Risk (OARs) and one Tumor were involved in the treatment planning. The OARs and Tumor were discretized into a total of 50,221 voxels. The number of beamlets was 943. We used the commercially available optimization software Gurobi and Matlab to solve the models. Plan comparison was done by recording the model runtime followed by visual inspection of the resulting dose volume histograms. Conclusion: We demonstrate the effectiveness of the moment-based approaches in replicating the set of prescribed DVH curves. The unconstrained convex moment-based penalty approach is concluded to have the greatest potential to reduce the computational effort and holds promise of substantial computational speed-up.
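    For readers unfamiliar with the moment-based idea, the sketch below shows one way a moment penalty can be written: raw moments of the achieved dose in a structure are pushed toward target moments derived from a prescribed DVH. The formulation and numbers are illustrative assumptions, not the authors' models.

      # Hedged sketch of a moment-based DVH penalty (an illustration of the idea,
      # not the published formulation).
      import numpy as np

      def moment_penalty(weights, dose_matrix, target_moments, orders=(1, 2, 3)):
          """weights: beamlet intensities; dose_matrix: voxels x beamlets."""
          dose = dose_matrix @ weights            # dose to every voxel of the structure
          penalty = 0.0
          for k, m_target in zip(orders, target_moments):
              m_achieved = np.mean(dose ** k)     # k-th raw moment of the dose distribution
              penalty += (m_achieved - m_target) ** 2
          return penalty

      # Tiny hypothetical example: 5 voxels, 3 beamlets
      D = np.array([[0.8, 0.1, 0.0],
                    [0.5, 0.4, 0.1],
                    [0.2, 0.6, 0.2],
                    [0.1, 0.5, 0.4],
                    [0.0, 0.2, 0.8]])
      w = np.array([1.0, 1.0, 1.0])
      targets = (0.9, 0.85, 0.82)                 # assumed target moments
      print(moment_penalty(w, D, targets))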

  20. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

  1. HEP Community White Paper on Software Trigger and Event Reconstruction: Executive Summary

    Energy Technology Data Exchange (ETDEWEB)

    Albrecht, Johannes; et al.

    2018-02-23

    Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.

  2. Computer-supported planning on graphic terminals in the staff divisions of hard coal mines. Rechnergestuetzte Planung an grafischen Arbeitsplaetzen in den Stabsstellen von Steinkohlenbergwerken

    Energy Technology Data Exchange (ETDEWEB)

    Seeliger, A [Technische Hochschule Aachen (Germany)

    1990-01-01

    Analyses of the planning activity in the planning departments of German hard coal mines have shown that, in some branches of the planning process, the productivity and creativity of the involved experts can be increased, potentials for rationalization opened up, and the cooperation between different engineering disciplines improved by using computer network systems in combination with graphic systems. This paper reports on the computer-supported planning system 'Grube', which has been developed at the RWTH (technical university) Aachen, and its applications in mine surveying, electro-technical and mechanical planning, as well as in the planning of ventilation systems and detailed mine planning. The software module GRUBE-W, which will in future be the centre of the workplace for mine ventilation planning at Ruhrkohle AG, is discussed in detail. (orig.).

  3. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Before- and after-treatment data were analyzed using the t-test. Reliability, tested using the interclass correlation coefficient, was stronger for InVivoDental5.0 (0.83-0.98) compared with 3DCeph™ (0.51-0.90). Paired t-test comparison of the two packages shows no statistically significant difference in the measurements made with them. InVivoDental5.0 measurements are more reproducible, and the software is more user friendly, compared with 3DCeph™. There is no statistical difference between the two packages in linear or angular measurements, but 3DCeph™ is more time-consuming in performing three-dimensional analysis than InVivoDental5.0.

  4. Preliminary experience with SpineEOS, a new software for 3D planning in AIS surgery.

    Science.gov (United States)

    Ferrero, Emmanuelle; Mazda, Keyvan; Simon, Anne-Laure; Ilharreborde, Brice

    2018-04-24

    Preoperative planning of scoliosis surgery is essential to the effective treatment of spine pathology, and precontoured rods have recently been developed to avoid iatrogenic sagittal misalignment and rod breakage. Some issues are specific to adolescent idiopathic scoliosis (AIS): a less distal lower instrumented level, great variability in the location of the inflection point (the transition from lumbar lordosis to thoracic kyphosis), and sagittal correction limited by the bone-implant interface. Since 2007, a stereoradiographic imaging system has been used that allows 3D reconstructions. A software tool was therefore developed to perform preoperative 3D surgical planning and to provide the rod's shape and length. The goal of this preliminary study was to assess the feasibility, reliability, and clinical relevance of this new software. This was a retrospective study of 47 AIS patients operated on with the same surgical technique: posteromedial translation through a posterior approach with lumbar screws and thoracic sublaminar bands. Pre- and postoperatively, 3D reconstructions were performed on stereoradiographic images (EOS system, Paris, France) and compared. The software was then used to plan the surgical correction and determine the rod's shape and length. The simulated spine and rods were compared to the real postoperative 3D reconstructions. The 3D reconstructions and planning were performed by an independent observer. 3D simulations were performed on the 47 patients. No difference was found between the simulated model and the postoperative 3D reconstructions in terms of sagittal parameters. Postoperatively, 21% of LL values were not within reference values. Postoperative SVA was 20 mm anterior in two thirds of the cases. Postoperative rods were significantly longer than the precontoured rods planned with the software (by a mean of 10 mm). Inflection points differed between the rods used and the planned rods (by 2.3 levels on average). In this preliminary study, the software based on 3D stereoradiography low

  5. Provider software buyer's guide.

    Science.gov (United States)

    1994-03-01

    To help long term care providers find new ways to improve quality of care and efficiency, Provider magazine presents the fourth annual listing of software firms marketing computer programs for all areas of nursing facility operations. On the following five pages, more than 80 software firms display their wares, with programs such as minimum data set and care planning, dietary, accounting and financials, case mix, and medication administration records. The guide also charts compatible hardware, integration ability, telephone numbers, company contacts, and easy-to-use reader service numbers.

  6. Development of preoperative planning software for transforaminal endoscopic surgery and the guidance for clinical applications.

    Science.gov (United States)

    Chen, Xiaojun; Cheng, Jun; Gu, Xin; Sun, Yi; Politis, Constantinus

    2016-04-01

    Preoperative planning is of great importance for transforaminal endoscopic techniques applied in percutaneous endoscopic lumbar discectomy. In this study, modular preoperative planning software for transforaminal endoscopic surgery was developed and demonstrated. The path-searching method is based on collision detection, with oriented bounding boxes constructed for the anatomical models. Image reformatting algorithms were then developed for multiplanar reconstruction, which provides detailed anatomical information surrounding the virtually planned path. Finally, multithreading was implemented to keep the software running stably. The resulting preoperative planning software for transforaminal endoscopic surgery (TE-Guider) was used to plan seven cases of patients with symptomatic lumbar disc herniations preoperatively. The distances to the midlines and the directions of the optimal paths were exported, and each result was in line with empirical values. TE-Guider provides an efficient and cost-effective way to search for the ideal path and entry point for the puncture. However, more clinical cases will be conducted to demonstrate its feasibility and reliability.
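    The abstract states that path searching is based on collision detection against bounding boxes of the anatomical models. The sketch below illustrates that idea with axis-aligned boxes and simple segment sampling on invented coordinates; the published tool uses oriented bounding boxes, so this is a deliberate simplification.

      # Simplified sketch of collision-based path search (axis-aligned boxes and
      # segment sampling only; coordinates are made up for illustration).
      import numpy as np

      def segment_hits_box(p0, p1, box_min, box_max, samples=200):
          """Return True if the straight segment p0->p1 passes through the box."""
          for t in np.linspace(0.0, 1.0, samples):
              pt = (1.0 - t) * p0 + t * p1
              if np.all(pt >= box_min) and np.all(pt <= box_max):
                  return True
          return False

      def first_collision_free_path(target, entry_candidates, obstacle_boxes):
          """Pick the first candidate entry point whose path to the target avoids all boxes."""
          for entry in entry_candidates:
              if not any(segment_hits_box(entry, target, bmin, bmax)
                         for bmin, bmax in obstacle_boxes):
                  return entry
          return None

      # Hypothetical data: a target point, two candidate skin entry points,
      # and one box standing in for a bony obstacle.
      target = np.array([0.0, 0.0, 0.0])
      candidates = [np.array([80.0, 0.0, 0.0]), np.array([60.0, 60.0, 0.0])]
      obstacles = [(np.array([20.0, -10.0, -10.0]), np.array([40.0, 10.0, 10.0]))]
      print(first_collision_free_path(target, candidates, obstacles))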

  7. Computer Games as Virtual Environments for Safety-Critical Software Validation

    Directory of Open Access Journals (Sweden)

    Štefan Korečko

    2017-01-01

    Full Text Available Computer games have become an inseparable part of everyday life in modern society, and the time people spend playing them every day is increasing. This trend has prompted noticeable research activity focused on utilizing the time spent playing in a meaningful way, for example to help solve scientific problems or tasks related to computer systems development. In this paper we present one contribution to this activity: a software system consisting of a modified version of the Open Rails train simulator and an application called TS2JavaConn, which allows separately developed software controllers to be used with the simulator. The system is intended for validation of controllers developed by formal methods. The paper describes the overall architecture of the system and the operation of its components. It also compares the system with other approaches to purposeful utilization of computer games, specifies suitable formal methods and illustrates its intended use with an example.

  8. The Virtual Cell: a software environment for computational cell biology.

    Science.gov (United States)

    Loew, L M; Schaff, J C

    2001-10-01

    The newly emerging field of computational cell biology requires software tools that address the needs of a broad community of scientists. Cell biological processes are controlled by an interacting set of biochemical and electrophysiological events that are distributed within complex cellular structures. Computational modeling is familiar to researchers in fields such as molecular structure, neurobiology and metabolic pathway engineering, and is rapidly emerging in the area of gene expression. Although some of these established modeling approaches can be adapted to address problems of interest to cell biologists, relatively few software development efforts have been directed at the field as a whole. The Virtual Cell is a computational environment designed for cell biologists as well as for mathematical biologists and bioengineers. It serves to aid the construction of cell biological models and the generation of simulations from them. The system enables the formulation of both compartmental and spatial models, the latter with either idealized or experimentally derived geometries of one, two or three dimensions.

  9. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  10. Modeling and performance analysis for composite network–compute service provisioning in software-defined cloud environments

    Directory of Open Access Journals (Sweden)

    Qiang Duan

    2015-08-01

    Full Text Available The crucial role of networking in Cloud computing calls for a holistic vision of both networking and computing systems that leads to composite network–compute service provisioning. Software-Defined Network (SDN is a fundamental advancement in networking that enables network programmability. SDN and software-defined compute/storage systems form a Software-Defined Cloud Environment (SDCE that may greatly facilitate composite network–compute service provisioning to Cloud users. Therefore, networking and computing systems need to be modeled and analyzed as composite service provisioning systems in order to obtain thorough understanding about service performance in SDCEs. In this paper, a novel approach for modeling composite network–compute service capabilities and a technique for evaluating composite network–compute service performance are developed. The analytic method proposed in this paper is general and agnostic to service implementation technologies; thus is applicable to a wide variety of network–compute services in SDCEs. The results obtained in this paper provide useful guidelines for federated control and management of networking and computing resources to achieve Cloud service performance guarantees.

  11. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  12. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: To propose an ontology for computer software; To provide a methodology for development of large software systems to cost and schedule that is based on the ontology; To offer an alternative vision regarding the development of truly autonomous systems.

  13. The GeoSteiner software package for computing Steiner trees in the plane

    DEFF Research Database (Denmark)

    Juhl, Daniel; Warme, David M.; Winter, Pawel

    The GeoSteiner software package has for more than 10 years been the fastest (publicly available) program for computing exact solutions to Steiner tree problems in the plane. The computational study by Warme, Winter and Zachariasen, published in 2000, documented the performance of the GeoSteiner approach, allowing the exact solution of Steiner tree problems with more than a thousand terminals. Since then, a number of algorithmic enhancements have improved the performance of the software package significantly. In this computational study we run the current code on the largest problem instances from the 2000 study, and on a number of larger problem instances. The computational study is performed using both the publicly available GeoSteiner 3.1 code base and the commercial GeoSteiner 4.0 code base.
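    GeoSteiner's exact algorithms are far beyond a short example, but the benefit of introducing Steiner points can be illustrated for three terminals, where the optimal Steiner point is the Fermat-Torricelli point (located here by Weiszfeld iteration on made-up coordinates); this is not GeoSteiner's algorithm.

      # Illustration only (not GeoSteiner's algorithm): for three terminals the
      # single Steiner point is the Fermat-Torricelli point.
      import itertools
      import numpy as np

      def fermat_point(terminals, iterations=500):
          pts = np.asarray(terminals, dtype=float)
          x = pts.mean(axis=0)                    # start at the centroid
          for _ in range(iterations):
              d = np.linalg.norm(pts - x, axis=1)
              w = 1.0 / np.maximum(d, 1e-12)      # Weiszfeld weights
              x = (pts * w[:, None]).sum(axis=0) / w.sum()
          return x

      terminals = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
      s = fermat_point(terminals)

      steiner_len = sum(np.linalg.norm(np.asarray(t) - s) for t in terminals)
      pair_d = sorted(np.linalg.norm(np.asarray(a) - np.asarray(b))
                      for a, b in itertools.combinations(terminals, 2))
      mst_len = sum(pair_d[:2])  # for three points the MST is the two shortest edges

      print(s, steiner_len, mst_len)  # the Steiner tree is shorter than the MST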

  14. Current practice in software development for computational neuroscience and how to improve it.

    Science.gov (United States)

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  15. Current practice in software development for computational neuroscience and how to improve it.

    Directory of Open Access Journals (Sweden)

    Marc-Oliver Gewaltig

    2014-01-01

    Full Text Available Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  16. A software for computer automated radioactive particle tracking

    International Nuclear Information System (INIS)

    Vieira, Wilson S.; Brandao, Luis E.; Braz, Delson

    2008-01-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of commercial software called MICROSHIELD, version 5.05, to obtain values of photon counting rates at four detector surfaces. These count rates were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are incipient, their analysis suggests that the tracking of a radioactive source using TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)
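    As a rough illustration of the tracking principle (relating detector count rates to the particle position), the sketch below recovers a source position from four detector readings under an idealized inverse-square model; real CARPT systems rely on measured calibration and attenuation corrections, so every value here is an assumption.

      # Hedged sketch of particle-position reconstruction (idealized inverse-square
      # counting model; detector positions and the source term are invented).
      import numpy as np
      from scipy.optimize import least_squares

      detectors = np.array([[0.0, 0.0, 0.0],
                            [1.0, 0.0, 0.0],
                            [0.0, 1.0, 0.0],
                            [0.0, 0.0, 1.0]])      # hypothetical detector positions [m]
      strength = 5.0e4                              # assumed source term (counts * m^2)

      def expected_counts(pos):
          r2 = np.sum((detectors - pos) ** 2, axis=1)
          return strength / r2                      # idealized inverse-square response

      true_pos = np.array([0.3, 0.4, 0.2])
      measured = expected_counts(true_pos)          # pretend these came from the detectors

      fit = least_squares(lambda p: expected_counts(p) - measured,
                          x0=np.array([0.5, 0.5, 0.5]))
      print(fit.x)                                  # recovers approximately [0.3, 0.4, 0.2]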

  17. A community Q&A for HEP Software and Computing ?

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    How often do you use StackOverflow or ServerFault to find information in your daily work? Would you be interested in a community Q&A site for HEP Software and Computing, for instance a dedicated StackExchange site? I looked into this question...

  18. Blind trials of computer-assisted structure elucidation software

    Directory of Open Access Journals (Sweden)

    Moser Arvin

    2012-02-01

    Full Text Available Abstract Background One of the largest challenges in chemistry today remains that of efficiently mining through vast amounts of data in order to elucidate the chemical structure for an unknown compound. The elucidated candidate compound must be fully consistent with the data and any other competing candidates efficiently eliminated without doubt by using additional data if necessary. It has become increasingly necessary to incorporate an in silico structure generation and verification tool to facilitate this elucidation process. An effective structure elucidation software technology aims to mimic the skills of a human in interpreting the complex nature of spectral data while producing a solution within a reasonable amount of time. This type of software is known as computer-assisted structure elucidation or CASE software. A systematic trial of the ACD/Structure Elucidator CASE software was conducted over an extended period of time by analysing a set of single and double-blind trials submitted by a global audience of scientists. The purpose of the blind trials was to reduce subjective bias. Double-blind trials comprised of data where the candidate compound was unknown to both the submitting scientist and the analyst. The level of expertise of the submitting scientist ranged from novice to expert structure elucidation specialists with experience in pharmaceutical, industrial, government and academic environments. Results Beginning in 2003, and for the following nine years, the algorithms and software technology contained within ACD/Structure Elucidator have been tested against 112 data sets; many of these were unique challenges. Of these challenges 9% were double-blind trials. The results of eighteen of the single-blind trials were investigated in detail and included problems of a diverse nature with many of the specific challenges associated with algorithmic structure elucidation such as deficiency in protons, structure symmetry, a large number of

  19. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers using 'non-overlapping discretizations' have produced the DVS-Software which overcomes this limitation [2]. The DVS-software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk, in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MOD-FLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). References: [1] Herrera, I. and Pinder, G.F., "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)

  20. SWEPP gamma-ray spectrometer system software test plan and report

    International Nuclear Information System (INIS)

    Femec, D.A.

    1994-09-01

    The SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory to assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP). In addition to determining the concentrations of gamma-ray-emitting radionuclides, the software also calculates attenuation-corrected isotopic mass ratios of specific interest, and provides controls for SGRS hardware as required. This document presents the test plan and report for the data acquisition and analysis software associated with the SGRS system

  1. Software and man-machine interface considerations for a nuclear plant computer replacement and upgrade project

    International Nuclear Information System (INIS)

    Diamond, G.; Robinson, E.

    1984-01-01

    Some of the key software functions and Man-Machine Interface considerations in a computer replacement and upgrade project for a nuclear power plant are described. The project involves the installation of two separate computer systems: an Emergency Response Facilities Computer System (ERFCS) and a Plant Process Computer System (PPCS). These systems employ state-of-the-art computer hardware and software. The ERFCS is a new system intended to provide enhanced functions to meet NRC post-TMI guidelines. The PPCS is intended to replace and upgrade an existing obsolete plant computer system. A general overview of the hardware and software aspects of the replacement and upgrade is presented. The work done to develop the upgraded Man-Machine Interface is described. For the ERFCS, a detailed discussion is presented of the work done to develop logic to evaluate the readiness and performance of safety systems and their supporting functions. The Man-Machine Interface considerations of reporting readiness and performance to the operator are discussed. Finally, the considerations involved in the implementation of this logic in real-time software are discussed. For the PPCS, a detailed discussion is presented of some new features

  2. Runtime Concepts of Hierarchical Software Components

    Czech Academy of Sciences Publication Activity Database

    Bureš, Tomáš; Hnětynka, P.; Plášil, František

    2007-01-01

    Roč. 8, special (2007), s. 454-463 ISSN 1525-9293 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : component-based development * hierarchical components * connectors * controllers * runtime environment Subject RIV: JC - Computer Hardware ; Software

  3. Software Safety Risk in Legacy Safety-Critical Computer Systems

    Science.gov (United States)

    Hill, Janice L.; Baggs, Rhoda

    2007-01-01

    Safety Standards contain technical and process-oriented safety requirements. Technical requirements are those such as "must work" and "must not work" functions in the system. Process-oriented requirements are software engineering and safety management process requirements. Some standards address the system perspective and some cover just the software in the system. NASA-STD-8719.13B, Software Safety Standard, is the current standard of interest. NASA programs/projects will have their own set of safety requirements derived from the standard. Safety Cases: a) Documented demonstration that a system complies with the specified safety requirements. b) Evidence is gathered on the integrity of the system and put forward as an argued case. [Gardener (ed.)] c) Problems occur when trying to meet safety standards, and thus make retrospective safety cases, in legacy safety-critical computer systems.

  4. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  5. [Computer-assisted operational planning for pediatric abdominal surgery. 3D-visualized MRI with volume rendering].

    Science.gov (United States)

    Günther, P; Tröger, J; Holland-Cunz, S; Waag, K L; Schenk, J P

    2006-08-01

    Exact surgical planning is necessary for complex operations of pathological changes in anatomical structures of the pediatric abdomen. 3D visualization and computer-assisted operational planning based on CT data are being increasingly used for difficult operations in adults. To minimize radiation exposure and for better soft tissue contrast, sonography and MRI are the preferred diagnostic methods in pediatric patients. Because of manifold difficulties, 3D visualization of these MRI data has not been realized so far, even though the field of embryonal malformations and tumors could benefit from this. A newly developed and modified raycasting-based powerful 3D volume rendering software (VG Studio Max 1.2) for the planning of pediatric abdominal surgery is presented. With the help of specifically developed algorithms, a useful surgical planning system is demonstrated. Thanks to the easy handling and high-quality visualization with enormous gain of information, the presented system is now an established part of routine surgical planning.

  6. Computer-assisted operational planning for pediatric abdominal surgery. 3D-visualized MRI with volume rendering

    International Nuclear Information System (INIS)

    Guenther, P.; Holland-Cunz, S.; Waag, K.L.

    2006-01-01

    Exact surgical planning is necessary for complex operations of pathological changes in anatomical structures of the pediatric abdomen. 3D visualization and computer-assisted operational planning based on CT data are being increasingly used for difficult operations in adults. To minimize radiation exposure and for better soft tissue contrast, sonography and MRI are the preferred diagnostic methods in pediatric patients. Because of manifold difficulties 3D visualization of these MRI data has not been realized so far, even though the field of embryonal malformations and tumors could benefit from this. A newly developed and modified raycasting-based powerful 3D volume rendering software (VG Studio Max 1.2) for the planning of pediatric abdominal surgery is presented. With the help of specifically developed algorithms, a useful surgical planning system is demonstrated. Thanks to the easy handling and high-quality visualization with enormous gain of information, the presented system is now an established part of routine surgical planning. (orig.) [de

  7. Learning Vocabulary in a Foreign Language: A Computer Software Based Model Attempt

    Science.gov (United States)

    Yelbay Yilmaz, Yasemin

    2015-01-01

    This study aimed at devising vocabulary learning software that would help learners learn and retain vocabulary items effectively. Foundation linguistics and learning theories have been adapted to the foreign language vocabulary learning context using computer software named Parole that was designed exclusively for this study. Experimental…

  8. Computer Game Theories for Designing Motivating Educational Software: A Survey Study

    Science.gov (United States)

    Ang, Chee Siang; Rao, G. S. V. Radha Krishna

    2008-01-01

    The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…

  9. Specification and Generation of Environment for Model Checking of Software Components

    Czech Academy of Sciences Publication Activity Database

    Pařízek, P.; Plášil, František

    2007-01-01

    Roč. 176, - (2007), s. 143-154 ISSN 1571-0661 R&D Projects: GA AV ČR 1ET400300504 Institutional research plan: CEZ:AV0Z10300504 Keywords : software components * behavior protocols * model checking * automated generation of environment Subject RIV: JC - Computer Hardware ; Software

  10. Software for the ACP [Advanced Computer Program] multiprocessor system

    International Nuclear Information System (INIS)

    Biel, J.; Areti, H.; Atac, R.

    1987-01-01

    Software has been developed for use with the Fermilab Advanced Computer Program (ACP) multiprocessor system. The software was designed to make a system of a hundred independent node processors as easy to use as a single, powerful CPU. Subroutines have been developed by which a user's host program can send data to and get results from the program running in each of his ACP node processors. Utility programs make it easy to compile and link host and node programs, to debug a node program on an ACP development system, and to submit a debugged program to an ACP production system
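    The actual ACP subroutine interface is not reproduced in this record, so the following is only a hedged Python analogy of the host/node pattern it describes: a host program farms event data out to many independent node processes and collects their results.

      from multiprocessing import Pool

      def node_program(event):
          """Stand-in for the user's node program: process one event record."""
          event_id, samples = event
          return event_id, sum(samples) / len(samples)   # toy reconstruction

      if __name__ == "__main__":
          # host program: prepare events, send them to the nodes, gather results
          events = [(i, list(range(i, i + 10))) for i in range(100)]
          with Pool(processes=8) as nodes:                # 8 stand-in node processors
              results = nodes.map(node_program, events)
          print(results[:3])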

  11. The Influence of Personal Characteristics, Interaction: (Computer/Individual), Computer Self-efficacy, Personal Innovativeness in Information Technology to Computer Anxiety in use of Mind your Own Business Accounting Software

    OpenAIRE

    Mayasari, Mega; ., Gudono

    2015-01-01

    The purpose of this study was to identify the factors that cause computer anxiety in the use of Mind Your Own Business (MYOB) accounting software, i.e., to assess whether age, gender, amount of training, ownership (usage of accounting software on a regular basis), computer self-efficacy, and personal innovativeness in Information Technology (IT) have any influence on computer anxiety. The study also examined whether there is a relationship between trait anxiety and negative affect and computer self-eff...

  12. Self-service for software development projects and HPC activities

    International Nuclear Information System (INIS)

    Husejko, M; Høimyr, N; Gonzalez, A; Koloventzos, G; Asbury, D; Trzcinska, A; Agtzidis, I; Botrel, G; Otto, J

    2014-01-01

    This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source developments such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.

  13. A computationally efficient software application for calculating vibration from underground railways

    International Nuclear Information System (INIS)

    Hussein, M F M; Hunt, H E M

    2009-01-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and plans for future developments. The software calculates the Power Spectral Density of vibration due to a moving train on floating-slab track with track irregularity described by typical values of spectra for tracks with good, average and bad conditions. The latest version accounts for a tunnel embedded in a half space by employing a toolbox developed at K.U. Leuven which calculates Green's functions for a multi-layered half-space.
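    The PiP model evaluates the vibration Power Spectral Density analytically from a coupled track-tunnel-soil model, which is not reproduced here; the sketch below only illustrates the quantity being reported, by estimating the PSD of a synthetic vibration record with Welch's method (numpy and scipy assumed available).

      import numpy as np
      from scipy.signal import welch

      fs = 1000.0                               # sampling frequency, Hz
      t = np.arange(0, 10, 1 / fs)
      # toy "vibration": a 40 Hz tone buried in broadband noise
      signal = np.sin(2 * np.pi * 40 * t) + 0.5 * np.random.randn(t.size)

      freqs, psd = welch(signal, fs=fs, nperseg=2048)
      print(freqs[np.argmax(psd)])              # dominant frequency, close to 40 Hz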

  14. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    Science.gov (United States)

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  15. A software tool for advanced MRgFUS prostate therapy planning and follow up

    Science.gov (United States)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Even though MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR guided FUS on prostate cancer was deeply analyzed. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success, by synchronization and display of pre-therapeutic, therapy and follow-up image data including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  16. Planning Systems for Distributed Operations

    Science.gov (United States)

    Maxwell, Theresa G.

    2002-01-01

    This viewgraph representation presents an overview of the mission planning process involving distributed operations (such as the International Space Station (ISS)) and the computer hardware and software systems needed to support such an effort. Topics considered include: evolution of distributed planning systems, ISS distributed planning, the Payload Planning System (PPS), future developments in distributed planning systems, Request Oriented Scheduling Engine (ROSE) and Next Generation distributed planning systems.

  17. Planning and management of cloud computing networks

    Science.gov (United States)

    Larumbe, Federico

    The evolution of the Internet has a great impact on a big part of the population. People use it to communicate, query information, receive news, work, and as entertainment. Its extraordinary usefulness as a communication medium made the number of applications and technological resources explode. However, that network expansion comes at the cost of significant power consumption. If the power consumption of telecommunication networks and data centers were considered as that of a country, it would rank 5th in the world. Furthermore, the number of servers in the world is expected to grow by a factor of 10 between 2013 and 2020. This context motivates us to study techniques and methods to allocate cloud computing resources in an optimal way with respect to cost, quality of service (QoS), power consumption, and environmental impact. The results we obtained from our test cases show that besides minimizing capital expenditures (CAPEX) and operational expenditures (OPEX), the response time can be reduced up to 6 times, power consumption by 30%, and CO2 emissions by a factor of 60. Cloud computing provides dynamic access to IT resources as a service. In this paradigm, programs are executed on servers connected to the Internet that users access from their computers and mobile devices. The first advantage of this architecture is to reduce the time of application deployment and improve interoperability, because a new user only needs a web browser and does not need to install software on local computers with specific operating systems. Second, applications and information are available from everywhere and with any device with an Internet access. Also, servers and IT resources can be dynamically allocated depending on the number of users and workload, a feature called elasticity. This thesis studies the resource management of cloud computing networks and is divided into three main stages. We start by analyzing the planning of cloud computing networks to get a
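    The thesis's optimization models are not given in the record above; the toy sketch below only illustrates the kind of trade-off it describes, greedily placing application load onto candidate data centres using a weighted score of monetary cost and power consumption, subject to capacity. All numbers and names are invented.

      centres = [
          {"name": "dc1", "capacity": 120, "cost": 1.0, "power": 0.9},
          {"name": "dc2", "capacity": 80,  "cost": 0.7, "power": 1.4},
          {"name": "dc3", "capacity": 200, "cost": 1.3, "power": 0.5},
      ]
      demands = [("app_a", 90), ("app_b", 60), ("app_c", 100)]
      alpha = 0.5   # weight between monetary cost and power per unit of load

      def score(dc):
          return alpha * dc["cost"] + (1 - alpha) * dc["power"]

      placement = []
      for app, load in demands:
          # pick the best-scoring centre that still has enough spare capacity
          feasible = [dc for dc in centres if dc["capacity"] >= load]
          best = min(feasible, key=score)
          best["capacity"] -= load
          placement.append((app, best["name"]))

      print(placement)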

  18. Computer software quality assurance

    International Nuclear Information System (INIS)

    Ives, K.A.

    1986-06-01

    The author defines some criteria for the evaluation of software quality assurance elements for applicability to the regulation of the nuclear industry. The author then analyses a number of software quality assurance (SQA) standards. The major extracted SQA elements are then discussed, and finally specific software quality assurance recommendations are made for the nuclear industry

  19. 77 FR 19280 - Increasing Market and Planning Efficiency Through Improved Software; Notice of Technical...

    Science.gov (United States)

    2012-03-30

    ... concerns that current system data quality might not allow for an AC optimal power flow model to be properly... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference: Increasing Real-Time and Day-Ahead Market Efficiency Through Improved Software. Take notice that Commission staff will...

  20. EQ3/6 software test and verification report 9/94

    International Nuclear Information System (INIS)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ''V and V'' report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three codes depended upon solving an n x n system of equations by the Newton-Raphson method. Thus, a great deal of emphasis was placed on the test and verification of this procedure, starting with the first code in the software package, EQPT.

  1. EQ3/6 software test and verification report 9/94

    Energy Technology Data Exchange (ETDEWEB)

    Kishi, T.

    1996-02-01

    This document is the Software Test and Verification Report (STVR) for the EQ3/6 suite of codes as stipulated in the Individual Software Plan for Initial Qualification of EQ3/6 (ISP-NF-07, Revision 1, 11/25/92). The software codes, EQPT, EQ3NR, EQ6, and the software library EQLIB constitute the EQ3/6 software package. This software test and verification project for EQ3/6 was started under the requirements of the LLNL Yucca Mountain Project Software Quality Assurance Plan (SQAP), Revision 0, December 14, 1989, but QP 3.2, Revision 2, June 21, 1994 is now the operative controlling procedure. This is a ''V and V'' report in the language of QP 3.2, Revision 2. Because the author of this report does not have a background in geochemistry, other technical sources were consulted in order to acquire some familiarity with geochemistry and the terminology involved, and to review comparable computational methods, especially geochemical aqueous speciation-solubility calculations. The software for the EQ3/6 package consists of approximately 47,000 lines of FORTRAN77 source code and runs on nine platforms ranging from workstations to supercomputers. The physical control of the EQ3/6 software package and documentation is on a SUN SPARC station. Walkthroughs of each principal software package, EQPT, EQ3NR, and EQ6, were conducted in order to understand the computational procedures involved, to determine any commonality in procedures, and then to establish a plan for the test and verification of EQ3/6. It became evident that all three codes depended upon solving an n x n system of equations by the Newton-Raphson method. Thus, a great deal of emphasis was placed on the test and verification of this procedure, starting with the first code in the software package, EQPT.
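    Both copies of the EQ3/6 record above note that the codes ultimately depend on solving an n x n system by the Newton-Raphson method. The sketch below shows that generic iteration on a toy two-equation system; the actual EQ3/6 residuals are geochemical speciation-solubility equations that are not shown here.

      import numpy as np

      def residual(x):
          # toy 2x2 system: x0^2 + x1^2 = 4 and x0 * x1 = 1
          return np.array([x[0]**2 + x[1]**2 - 4.0, x[0] * x[1] - 1.0])

      def jacobian(x):
          return np.array([[2 * x[0], 2 * x[1]],
                           [x[1],     x[0]]])

      x = np.array([2.0, 0.5])                  # initial guess
      for _ in range(20):
          dx = np.linalg.solve(jacobian(x), -residual(x))
          x += dx
          if np.linalg.norm(dx) < 1e-12:        # converged
              break

      print(x, residual(x))                     # root and its (near-zero) residual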

  2. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  3. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

    The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report on the results of the software qualification.

  4. Strategic Planning and Decision Analysis: Presentation of the COSIMA Software System

    DEFF Research Database (Denmark)

    This paper presents a composite decision support system, COSIMA, programmed in MS Excel. COSIMA provides assistance to the decision maker as concerns complex decisions and strategic planning. The COSIMA software is designed as interconnected modules which make it possible to conduct Cost-Benefit...

  5. Implementing Software Safety in the NASA Environment

    Science.gov (United States)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of

  6. A directory of computer software applications: astronomy and astrophysics, 1970-May, 1979

    International Nuclear Information System (INIS)

    1979-05-01

    Astronomy and astrophysics reports that list computer programs and/or their documentation are cited. These software applications pertain to topics such as solar activity, atmospheric radiative transfer, stellar and galactic structure, lunar and planetary studies, and astrophysical data reduction. The directory contains complete bibliographic data for each report as well as a subject and a corporate author index. The computer software offered by NTIS was created by a variety of Federal agencies to meet their diverse but quite specific objectives. It is provided without installation, support, or maintenance services and sometimes requires customer modifications to run effectively in customer environments

  7. Cluster computing software for GATE simulations

    International Nuclear Information System (INIS)

    Beenhouwer, Jan de; Staelens, Steven; Kruecker, Dirk; Ferrer, Ludovic; D'Asseler, Yves; Lemahieu, Ignace; Rannou, Fernando R.

    2007-01-01

    Geometry and tracking (GEANT4) is a Monte Carlo package designed for high energy physics experiments. It is used as the basis layer for Monte Carlo simulations of nuclear medicine acquisition systems in GEANT4 Application for Tomographic Emission (GATE). GATE allows the user to realistically model experiments using accurate physics models and time synchronization for detector movement through a script language contained in a macro file. The downside of this high accuracy is long computation time. This paper describes a platform independent computing approach for running GATE simulations on a cluster of computers in order to reduce the overall simulation time. Our software automatically creates fully resolved, nonparametrized macros accompanied with an on-the-fly generated cluster specific submit file used to launch the simulations. The scalability of GATE simulations on a cluster is investigated for two imaging modalities, positron emission tomography (PET) and single photon emission computed tomography (SPECT). Due to a higher sensitivity, PET simulations are characterized by relatively high data output rates that create rather large output files. SPECT simulations, on the other hand, have lower data output rates but require a long collimator setup time. Both of these characteristics hamper scalability as a function of the number of CPUs. The scalability of PET simulations is improved here by the development of a fast output merger. The scalability of SPECT simulations is improved by greatly reducing the collimator setup time. Accordingly, these two new developments result in higher scalability for both PET and SPECT simulations and reduce the computation time to more practical values
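    The cluster tool described above generates fully resolved per-job macros plus a cluster-specific submit file. The following is only a hedged sketch of that job-splitting idea, not the authors' code: the macro contents and the scheduler command are placeholders, not verified GATE or batch-system syntax.

      from pathlib import Path

      TEMPLATE = """# placeholder GATE-like macro
      # time window for this job: {t_start} .. {t_stop} s
      # random seed: {seed}
      """

      def split_simulation(n_jobs, total_time, out_dir="jobs"):
          """Expand one template into n_jobs fully resolved macros plus a submit script."""
          out = Path(out_dir)
          out.mkdir(exist_ok=True)
          slice_len = total_time / n_jobs
          submit_lines = []
          for i in range(n_jobs):
              macro = TEMPLATE.format(t_start=i * slice_len,
                                      t_stop=(i + 1) * slice_len,
                                      seed=1000 + i)
              macro_path = out / f"job_{i:03d}.mac"
              macro_path.write_text(macro)
              # hypothetical scheduler invocation, one line per job
              submit_lines.append(f"qsub run_gate.sh {macro_path}")
          (out / "submit_all.sh").write_text("\n".join(submit_lines) + "\n")

      split_simulation(n_jobs=10, total_time=600.0)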

  8. Recommendations for a Software Quality Assurance Plan for the CMR Facility at LANL

    International Nuclear Information System (INIS)

    Adams, K.; Matthews, S. D.; McQueen, M. A.

    1998-01-01

    The Nuclear Materials Technology (NMT) organizations 1 and 3 within the Chemical and Metallurgical Research (CMR) facility at the Los Alamos National Laboratory are working to achieve Waste Isolation Pilot Plant (WIPP) certification to enable them to transport their TRU waste to WIPP. This document is intended to provide not only recommendations to address the necessary software quality assurance activities to enable the NMT-1 and NMT-3 organizations to be WIPP compliant but is also meant to provide a template for the final Software Quality Assurance Plan (SQAP). This document specifically addresses software quality assurance for all software used in support of waste characterization and analysis. Since NMT-1 and NMT-3 currently have several operational software products that are used for waste characterization and analysis, these software quality assurance recommendations apply to the operations, maintenance and retirement of the software and the creation and development of any new software required for waste characterization and analyses

  9. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine in 1993 was performed using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adjustment to most diverse interests. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple and kappa statistics agreements. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
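    The agreement analysis mentioned above (simple and kappa statistics) can be illustrated with a small, self-contained example; the ratings below are invented, and Cohen's kappa is computed directly from the observed and chance agreement.

      import numpy as np

      obs_a = np.array(["none", "<50%", "50-70%", ">70%", "none", "<50%", "none", ">70%"])
      obs_b = np.array(["none", "<50%", "<50%",   ">70%", "none", "50-70%", "none", ">70%"])

      categories = np.unique(np.concatenate([obs_a, obs_b]))
      po = np.mean(obs_a == obs_b)                          # observed agreement
      pe = sum(np.mean(obs_a == c) * np.mean(obs_b == c)    # chance agreement
               for c in categories)
      kappa = (po - pe) / (1 - pe)
      print(round(po, 3), round(kappa, 3))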

  10. Surveillance Analysis Computer System (SACS): Software requirements specification (SRS). Revision 2

    International Nuclear Information System (INIS)

    Glasscock, J.A.

    1995-01-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) database, an Impact Level 3Q system. SACS stores information on tank temperatures, surface levels, and interstitial liquid levels. This information is retrieved by the customer through a PC-based interface and is then available to a number of other software tools. The software requirements specification (SRS) describes the system requirements for the SACS Project, and follows the Standard Engineering Practices (WHC-CM-6-1), Software Practices (WHC-CM-3-10) and Quality Assurance (WHC-CM-4-2, QR 19.0) policies

  11. A software for computer automated radioactive particle tracking

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Wilson S.; Brandao, Luis E. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: wilson@ien.gov.br; brandao@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)]. E-mail: delson@smb.lin.ufrj.br

    2008-07-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain photon counting rates at four detector surfaces. These counting rates, which correspond to the emission of gamma radiation from a radioactive source, are the main TRACO-1 input variables. Although the results found so far are preliminary, their analysis suggests that tracking a radioactive source with TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)

  12. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand its role in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages, and process issues, will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  13. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  14. Mahotas: Open source software for scriptable computer vision

    Directory of Open Access Journals (Sweden)

    Luis Pedro Coelho

    2013-07-01

    Full Text Available Mahotas is a computer vision library for Python. It contains traditional image processing functionality such as filtering and morphological operations as well as more modern computer vision functions for feature computation, including interest point detection and local descriptors. The interface is in Python, a dynamic programming language, which is appropriate for fast development, but the algorithms are implemented in C++ and are tuned for speed. The library is designed to fit in with the scientific software ecosystem in this language and can leverage the existing infrastructure developed in that language. Mahotas is released under a liberal open source license (MIT License) and is available from http://github.com/luispedro/mahotas and from the Python Package Index (http://pypi.python.org/pypi/mahotas). Tutorials and full API documentation are available online at http://mahotas.readthedocs.org/.
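    A short example of the scriptable style described above, assuming mahotas and numpy are installed: smooth a synthetic image, threshold it with Otsu's method, and label the connected components.

      import numpy as np
      import mahotas as mh

      # synthetic test image: two bright squares on a dark, noisy background
      img = np.zeros((128, 128))
      img[20:40, 20:40] = 200
      img[70:100, 60:90] = 180
      img += 10 * np.random.rand(128, 128)

      smoothed = mh.gaussian_filter(img, sigma=2)
      binary = smoothed > mh.thresholding.otsu(smoothed.astype(np.uint8))
      labeled, n_objects = mh.label(binary)
      print(n_objects)          # expect 2 labelled objects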

  15. Software Engineering Techniques for Computer-Aided Learning.

    Science.gov (United States)

    Ibrahim, Bertrand

    1989-01-01

    Describes the process for developing tutorials for computer-aided learning (CAL) using a programing language rather than an authoring system. The workstation used is described, the use of graphics is discussed, the role of a local area network (LAN) is explained, and future plans are discussed. (five references) (LRW)

  16. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  17. Software development on the DIII-D control and data acquisition computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B. Jr.; Piglowski, D.

    1997-11-01

    The various software systems developed for the DIII-D tokamak have played a highly visible and important role in tokamak operations and fusion research. Because of the heavy reliance on in-house developed software encompassing all aspects of operating the tokamak, much attention has been given to the careful design, development and maintenance of these software systems. Software systems responsible for tokamak control and monitoring, neutral beam injection, and data acquisition demand the highest level of reliability during plasma operations. These systems, made up of hundreds of programs totaling thousands of lines of code, have presented a wide variety of software design and development issues ranging from low level hardware communications, database management, and distributed process control, to man machine interfaces. The focus of this paper will be to describe how software is developed and managed for the DIII-D control and data acquisition computers. It will include an overview and status of software systems implemented for tokamak control, neutral beam control, and data acquisition. The issues and challenges faced developing and managing the large amounts of software in support of the dynamic and ever-changing needs of the DIII-D experimental program will be addressed.

  18. Optimal integration and test plans for software releases of lithographic systems

    NARCIS (Netherlands)

    Boumen, R.; Jong, de I.S.M.; Mortel - Fronczak, van de J.M.; Rooda, J.E.

    2007-01-01

    This paper describes a method to determine the optimal integration and test plan for embedded systems software releases. The method consists of four steps: 1) describe the integration and test problem in an integration and test model which is introduced in this paper, 2) determine possible test

  19. Computer-assisted planning and dosimetry for radiation treatment of head and neck cancer in Cameroon

    International Nuclear Information System (INIS)

    Yomi, J.; Ngniah, A.; Kingue, S.; Muna, W.F.T.; Durosinmi-Etti, F.A.

    1995-01-01

    This evaluation was part of a multicenter, multinational study sponsored by the International Atomic Energy Agency (Vienna) to investigate a simple, reliable computer-assisted planning and dosimetry system for radiation treatment of head and neck cancers in developing countries. Over a 13-month period (April 1992-April 1993), 120 patients with histologically-proven head or neck cancer were included in the evaluation. In each patient, planning and dosimetry were done both manually and using the computer-assisted system. The manual and computerized systems were compared on the basis of accuracy of determination of the outer contour, target volume, and critical organs; volume inequality resolution; structure heterogeneity correction; selection of the number, angle, and size of beams; treatment time calculation; availability of dosimetry predictions; and duration and cost of the procedure. Results demonstrated that the computer-assisted procedure was superior to the manual procedure, despite less than optimal software. The accuracy provided by the completely computerized procedure is indispensable for Level II radiation therapy, which is particularly useful in tumors of the sensitive, complex structures in the head and neck. (authors). 7 refs., 3 tabs

  20. Development of innovative computer software to facilitate the setup and computation of water quality index.

    Science.gov (United States)

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index, which is used to convert a water quality dataset into a single number, is the most important task of most water quality monitoring programmes. Because the setup of a water quality index depends on local conditions and constraints, it is not feasible to introduce a single definitive index that reveals the water quality level everywhere. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 samples of drinking water quality from different parts of the country was used to demonstrate the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
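    The IWQIS coefficients are not given in the record above, but the dynamic-weight idea can be sketched in a few lines: when a parameter is missing from a sample, its weight is dropped and the remaining weights are renormalised. The weights and sub-index values below are invented.

      WEIGHTS = {"pH": 0.20, "turbidity": 0.15, "nitrate": 0.25,
                 "fluoride": 0.15, "coliforms": 0.25}

      def wqi(sub_indices):
          """sub_indices: dict of parameter -> quality rating (0-100), may be partial."""
          available = {p: w for p, w in WEIGHTS.items() if p in sub_indices}
          total_w = sum(available.values())
          return sum(w / total_w * sub_indices[p] for p, w in available.items())

      full = {"pH": 85, "turbidity": 70, "nitrate": 60, "fluoride": 90, "coliforms": 95}
      partial = {"pH": 85, "nitrate": 60, "coliforms": 95}   # two parameters missing
      print(round(wqi(full), 1), round(wqi(partial), 1))

    Renormalising in this way keeps the index on the same 0-100 scale whether or not every parameter was measured, which is the practical point of the dynamic weights.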

  1. Computational Ecology and Software (http://www.iaees.org/publications/journals/ces/online-version.asp

    Directory of Open Access Journals (Sweden)

    ces@iaees.org

    Full Text Available Computational Ecology and Software ISSN 2220-721X URL: http://www.iaees.org/publications/journals/ces/online-version.asp RSS: http://www.iaees.org/publications/journals/ces/rss.xml E-mail: ces@iaees.org Editor-in-Chief: WenJun Zhang Aims and Scope COMPUTATIONAL ECOLOGY AND SOFTWARE (ISSN 2220-721X) is an open-access, peer-reviewed online journal that considers scientific articles in all different areas of computational ecology. It is the transactions of the International Society of Computational Ecology. The journal is concerned with the ecological researches, constructions and applications of theories and methods of computational sciences including computational mathematics, computational statistics and computer science. It features the simulation, approximation, prediction, recognition, and classification of ecological issues. Intensive computation is one of the major stresses of the journal. The journal welcomes research articles, short communications, review articles, perspectives, and book reviews. The journal also supports the activities of the International Society of Computational Ecology. The topics to be covered by CES include, but are not limited to: •Computation intensive methods, numerical and optimization methods, differential and difference equation modeling and simulation, prediction, recognition, classification, statistical computation (Bayesian computing, randomization, bootstrapping, Monte Carlo techniques, stochastic processes, etc.), agent-based modeling, individual-based modeling, artificial neural networks, knowledge based systems, machine learning, genetic algorithms, data exploration, network analysis and computation, databases, ecological modeling and computation using Geographical Information Systems, satellite imagery, and other computation intensive theories and methods. •Artificial ecosystems, artificial life, complexity of ecosystems and virtual reality. •The development, evaluation and validation of software and

  2. MEDBASE: Strategic Planning and Implementation of an Army Medical Department Software Application

    Science.gov (United States)

    2003-07-17

    [Only fragments of this record's abstract survive extraction: items from an implementation checklist (software/hardware performance testing, contingency plans, downtime procedures, celebration/people management, risk zones, planning for upgrades) and examples of role-model BHAGs such as Nike's 1960s goal to "Crush Adidas!" and Honda's 1970s goal "Yamaha so tsubusu! We will destroy Yamaha!"]

  3. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization
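    CERR itself is written in MATLAB; the following is only a language-independent illustration of one of the tools mentioned, a cumulative dose-volume histogram computed from a dose grid and a binary structure mask, using synthetic data.

      import numpy as np

      dose = np.random.gamma(shape=9.0, scale=5.0, size=(50, 50, 30))  # Gy, synthetic
      mask = np.zeros_like(dose, dtype=bool)
      mask[20:35, 20:35, 10:20] = True          # toy "structure" (e.g., a target volume)

      structure_doses = dose[mask]
      bins = np.linspace(0, structure_doses.max(), 100)
      # fraction of the structure volume receiving at least each dose level
      dvh = [(structure_doses >= d).mean() * 100 for d in bins]

      d50_index = next(i for i, v in enumerate(dvh) if v < 50)
      print(f"D50 ~ {bins[d50_index]:.1f} Gy, "
            f"V20Gy = {(structure_doses >= 20).mean() * 100:.1f}% of the volume")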

  4. Newly Developed Software Application for Multiple Access Process Planning

    Directory of Open Access Journals (Sweden)

    Katarina Monkova

    2014-11-01

    Full Text Available The purchase of a complex system for computer aided process planning (CAPP) can be expensive for small and some medium-sized plants, sometimes an inaccessible investment with a long recoupment period. Based on this fact and the authors' experience with Eastern European plants, they decided to design a new database application which is suitable for holding, processing, and exploiting production, stock, and economic data within the manufacturing process. The application can also be used to create a process plan according to selected criteria, and for technological documentation and NC program creation. It was based on the theory of a multivariant approach to computer aided plan generation. Its fundamental features, the internal mathematical structure and a new code system of processed objects, were prepared by the authors. The verification of the designed information system in real practice has shown that it enables about 30% cost and production time reduction and decreases input material assortment variability.

  5. Primary Health Care Software-A Computer Based Data Management System

    Directory of Open Access Journals (Sweden)

    Tuli K

    1990-01-01

    Full Text Available Realising the duplication and time consumption of the usual manual system of data collection necessitated experimentation with a computer-based management system for primary health care in the primary health centers. The details of the population as available in the existing manual system were used for computerizing the data. Software was designed for data entry and analysis. It was written in the dBase III Plus language. It was designed so that a person with no knowledge of computers could use it. A cost analysis was done and the computer system was found to be more cost effective than the usual manual system.

  6. Development of the JFT-2M data analysis software system on the mainframe computer

    International Nuclear Information System (INIS)

    Matsuda, Toshiaki; Amagai, Akira; Suda, Shuji; Maemura, Katsumi; Hata, Ken-ichiro.

    1990-11-01

    We developed a software system on the FACOM mainframe computer to analyze JFT-2M experimental data archived by the JFT-2M data acquisition system. This allows the CPU load of the data acquisition system to be reduced and distributed. JFT-2M experimental data can then be analyzed on the mainframe using complicated computational codes on the raw data, such as equilibrium calculations and transport analysis, together with useful software packages such as the SAS statistics package. (author)

  7. Assume-Guarantee Verification of Software Components in SOFA 2 Framework

    Czech Academy of Sciences Publication Activity Database

    Parízek, P.; Plášil, František

    2010-01-01

    Roč. 4, č. 3 (2010), s. 210-221 ISSN 1751-8806 R&D Projects: GA AV ČR 1ET400300504 Grant - others:GA MŠk(CZ) 7E08004 Institutional research plan: CEZ:AV0Z10300504 Keywords : components * software verification * model checking Subject RIV: JC - Computer Hardware ; Software Impact factor: 0.671, year: 2010

  8. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease in transferring and to facilitate the linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results.

  9. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    The application of NQA-1 Quality Assurance Standards to computer software programs has been recent at the Oak Ridge National Laboratory. One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Due to the fact that GRESS was to be used in the site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software.

  10. The CMS software performance at the start of data taking

    CERN Document Server

    Benelli, Gabriele

    2009-01-01

    The CMS software framework (CMSSW) is a complex project evolving very rapidly as the first LHC colliding beams approach. The computing requirements constrain performance in terms of CPU time, memory footprint and event size on disk to allow for planning and managing the computing infrastructure necessary to handle the needs of the experiment. A performance suite of tools has been developed to track all aspects of code performance, through the software release cycles, allowing for regression and guiding code development for optimization. In this talk, we describe the CMSSW performance suite tools used and present some sample performance results from the release integration process for the CMS software.

  11. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations for power system equipment design. Several design example calculations are carried out using engineering software tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems ...

  12. Estimating pressurized water reactor decommissioning costs: A user's manual for the PWR Cost Estimating Computer Program (CECP) software

    International Nuclear Information System (INIS)

    Bierschbach, M.C.; Mencinsky, G.J.

    1993-10-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit to the US Nuclear Regulatory Commission (NRC) for review decommissioning plans and cost estimates. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning PWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning.

  13. State-of-the-Art: Evolution of Software Life Cycle Process for NPPs

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Park, Heui Youn; Son, Ki Sung; Lee, Ki Hyun; Kim, Hyeon Soo

    2007-01-01

    This paper investigates the evolution of the software life cycle process (SLCP) for nuclear power plants (NPPs) based on IEEE Std 7-4.3.2, which has been updated twice (namely, in 1993 and 2003) since it was published in 1982, and on relevant software certifications. IEEE Std 7-4.3.2 specifies additional computer-specific requirements to supplement the criteria and requirements of IEEE Std 603. It also specifies the software quality requirements as follows: computer software shall be developed, modified, or accepted in accordance with an approved software quality assurance (QA) plan. IEEE Std 7-4.3.2-1982 specifies a minimum software development process as follows: plan, design and implementation. ANSI/ASME NQA-1-1979 is not directly related to the software development process but to overall quality assurance criteria. IEEE Std 7-4.3.2-1993 addresses ASME NQA-2a-1990 Part 2.7 for software development requirements. ASME NQA-2a-1990 Part 2.7, which was interpreted into KEPIC QAP-2 II.7, specifies the software development process in more detail as follows: requirements, design, implementation, test, installation and checkout, operation and maintenance, and retirement. Along with this, the software QA plan is emphasized in IEEE Std 730-1989. In IEEE Std 7-4.3.2-2003, IEEE/EIA Std 12207.0-1996 replaces the ASME NQA as a requirement for software development. The evolution of the SLCP from ASME NQA to IEEE/EIA Std 12207.0 is discussed in Section 2 of this paper. The publication of IEEE/EIA Std 12207.0 was motivated by industrial experiences and practices to promote the quality of software. In Section 3, three international software certifications relating to IEEE/EIA Std 12207.0 are introduced.

  14. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations for power system equipment design. Several design example calculations are carried out using engineering software tools such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  15. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educative purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations for power system equipment design. Several design example calculations are carried out using engineering software tools such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  16. ATLAS experience with HEP software at the Argonne leadership computing facility

    International Nuclear Information System (INIS)

    Uram, Thomas D; LeCompte, Thomas J; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  17. ATLAS Experience with HEP Software at the Argonne Leadership Computing Facility

    CERN Document Server

    LeCompte, T; The ATLAS collaboration; Benjamin, D

    2014-01-01

    A number of HEP software packages used by the ATLAS experiment, including GEANT4, ROOT and ALPGEN, have been adapted to run on the IBM Blue Gene supercomputers at the Argonne Leadership Computing Facility. These computers use a non-x86 architecture and have a considerably less rich operating environment than in common use in HEP, but also represent a computing capacity an order of magnitude beyond what ATLAS is presently using via the LCG. The status and potential for making use of leadership-class computing, including the status of integration with the ATLAS production system, is discussed.

  18. Ultrasonic and computed tomography in radiotherapy planning - a comparison

    International Nuclear Information System (INIS)

    Schertel, L.

    1980-01-01

    The precondition of any radiotherapy is radiation planning. This must be done individually for every patient and must be applicable to any region of the body. Modern irradiation planning requires pictures of the body parts concerned; these can be made by means of the ultrasonic method and computed tomography. This comparative investigation leads to the result (see fig. 4 and 5) that computed tomographic body part pictures should be preferred to those made sonographically. The opinion of Huenig and co-workers [8] that ultrasonic tomography would soon lose some of its importance within irradiation planning once computed tomography is introduced could be confirmed by the latest developments. The authors can confirm this also from their own experience and agree with Winkel and Hermann [23] that irradiation planning can no longer be done without computed tomography. (orig.) [de

  19. Designing of a Computer Software for Detection of Approximal Caries in Posterior Teeth

    International Nuclear Information System (INIS)

    Valizadeh, Solmaz; Goodini, Mostafa; Ehsani, Sara; Mohseni, Hadis; Azimi, Fateme; Bakhshandeh, Hooman

    2015-01-01

    Radiographs, adjunct to clinical examination, are always valuable complementary methods for dental caries detection. Recently, progress in digital imaging systems has made it possible to design software for automatic dental caries detection. The aim of this study was to develop and assess the function of diagnostic computer software designed for the evaluation of approximal caries in posterior teeth. This software should be able to indicate the depth and location of caries on digital radiographic images. Digital radiographs were obtained of 93 teeth comprising 183 proximal surfaces. These images were used as a database for designing the software and training the software designer. In the design phase, considering the summed density of pixels in rows and columns of the images, the teeth were separated from each other and the unnecessary regions, for example the root area in the alveolar bone, were eliminated. Therefore, based on summed intensities, each image was segmented such that each segment contained only one tooth. Subsequently, based on fuzzy logic, a well-known data-clustering algorithm named fuzzy c-means (FCM) was applied to the images to cluster or segment each tooth. This algorithm is referred to as a soft clustering method, which assigns data elements to one or more clusters with a specific membership function. Using the extracted clusters, the tooth border was determined and assessed for cavities. The results of histological analysis were used as the gold standard for comparison with the results obtained from the software. Depth of caries was measured, and finally the Intraclass Correlation Coefficient (ICC) and Bland-Altman plots were used to show the agreement between the methods. The software diagnosed 60% of enamel caries. The ICC (for detection of enamel caries) between the computer software and histological analysis results was determined as 0.609 (95% confidence interval [CI] = 0.159-0.849) (P = 0.006). Also, the computer program diagnosed 97% of
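
    The record above describes segmenting each tooth with fuzzy c-means (FCM) clustering on pixel intensities. As a rough illustration of how FCM works (not the authors' implementation; the image, cluster count and fuzzifier m below are assumed), a minimal NumPy sketch:

```python
# A minimal fuzzy c-means (FCM) sketch on pixel intensities; illustrative only,
# not the authors' implementation. The input image and parameters are assumed.
import numpy as np

def fuzzy_c_means(values, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Cluster 1-D intensity values; returns cluster centers and memberships."""
    rng = np.random.default_rng(seed)
    x = values.reshape(-1, 1).astype(float)          # (N, 1) intensities
    u = rng.random((len(x), n_clusters))             # random initial memberships
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]      # weighted cluster means
        dist = np.abs(x - centers.T) + 1e-12                # (N, C) distances
        inv = dist ** (-2.0 / (m - 1.0))
        u_new = inv / inv.sum(axis=1, keepdims=True)        # membership update
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return centers.ravel(), u

# Toy usage: segment a synthetic 8-bit "radiograph" into 3 intensity clusters.
image = np.clip(np.random.default_rng(1).normal(120, 40, (64, 64)), 0, 255)
centers, memberships = fuzzy_c_means(image.ravel(), n_clusters=3)
labels = memberships.argmax(axis=1).reshape(image.shape)    # hard segmentation
print("cluster centers (intensity):", np.sort(centers))
```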

  20. Visualization of biomedical image data and irradiation planning using a parallel computing system

    International Nuclear Information System (INIS)

    Lehrig, R.

    1991-01-01

    The contribution explains the development of a novel, low-cost workstation for the processing of biomedical tomographic data sequences. The workstation was to allow both graphical display of the data and implementation of modelling software for irradiation planning, especially for calculation of dose distributions on the basis of the measured tomogram data. The system developed according to these criteria is a parallel computing system which performs secondary, two-dimensional image reconstructions irrespective of the imaging direction of the original tomographic scans. Three-dimensional image reconstructions can be generated from any direction of view, with random selection of sections of the scanned object. (orig./MM) With 69 figs., 2 tabs [de

  1. Special software for computing the special functions of wave catastrophes

    Directory of Open Access Journals (Sweden)

    Andrey S. Kryukovsky

    2015-01-01

    The method of ordinary differential equations in the context of calculating the special functions of wave catastrophes is considered. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerate such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development process of special software for calculating special functions, and questions of portability, extensibility and interoperability.
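
    The simplest special function of wave catastrophes, the fold-caustic (Airy) function, satisfies the ODE y'' = x·y, so the ODE method mentioned above can be illustrated with a short, hedged sketch (unrelated to the authors' software; the integration interval and tolerances are assumptions):

```python
# Hedged sketch of the ODE method for a wave-catastrophe special function: the Airy
# function Ai(x) (fold catastrophe) satisfies y'' = x*y.  We integrate the ODE over an
# oscillatory interval and compare with scipy's reference values.  Illustrative only.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import airy

def airy_rhs(x, y):
    # y[0] = Ai(x), y[1] = Ai'(x); Airy equation: y'' = x * y
    return [y[1], x * y[0]]

x0, x1 = -10.0, 0.0
ai0, aip0, _, _ = airy(x0)                    # initial conditions from the reference
sol = solve_ivp(airy_rhs, (x0, x1), [ai0, aip0],
                t_eval=np.linspace(x0, x1, 201), rtol=1e-10, atol=1e-12)

ai_ref, _, _, _ = airy(sol.t)                 # reference values for comparison
print("max abs error vs scipy.special.airy:", np.max(np.abs(sol.y[0] - ai_ref)))
```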

  2. Three-dimensional computer reconstruction of large tissue volumes based on composing series of high-resolution confocal images by GlueMRC and LinkMRC software

    Czech Academy of Sciences Publication Activity Database

    Karen, Petr; Jirkovská, M.; Tomori, Z.; Demjénová, E.; Janáček, Jiří; Kubínová, Lucie

    2003-01-01

    Vol. 62, No. 5 (2003), pp. 415-422 ISSN 1059-910X R&D Projects: GA ČR GA304/01/0257 Grant - others: VEGA(SK) 2/1146/21; CZ-SK GA MŠk(CZ) KONTAKT 126/184 Institutional research plan: CEZ:AV0Z5011922 Keywords: 3D reconstruction * confocal microscopy * image processing Subject RIV: JC - Computer Hardware; Software Impact factor: 2.307, year: 2003

  3. Requirements on software lifecycle process (RSLP) for KALIMER digital computer-based MMIS design

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Kwon, Kee Choon; Kim, Jang Yeol [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-04-01

    The digital Man Machine Interface System (MMIS) of the Korea Advanced Liquid Metal Reactor (KALIMER) may share code, data transmission, data, and process equipment to a greater degree than analog systems. Although this sharing is the basis for many of the advantages of digital systems, it also raises a key concern: a design using shared data or code has the potential to propagate a common-cause or common-mode failure via software errors, thus defeating the redundancy achieved by the hardware architectural structure. Greater sharing of process equipment among functions within a channel increases the consequences of the failure of a single hardware module and reduces the amount of diversity available within a single safety channel. The software safety plan describes the safety analysis implementation tasks that are to be carried out during the software life cycle. Documentation should exist that shows that the safety analysis activities have been successfully accomplished for each life cycle activity group. In particular, the documentation should show that the system safety requirements have been adequately addressed for each life cycle activity group, that no new hazards have been introduced, and that the software requirements, design elements, and code elements that can affect safety have been identified. Because the safety of software can be assured both through the process Verification and Validation (V and V) itself and through the V and V of all the intermediate and final products during the software development lifecycle, a KALIMER Software Safety Framework (KSSF) must be established. As the first activity in establishing the KSSF, we have developed this report, Requirements on Software Life-cycle Process (RSLP) for designing the KALIMER digital MMIS. This report is organized as follows. Section I describes the background, definitions, and references of the RSLP. Section II describes KALIMER safety software categorization. In Section III, we define the

  4. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools have traditionally been developed in sequential mode, with codes optimized for single-core computing only. However, the increasing complexity of power grid models requires more intensive computation, and traditional simulation tools will soon be unable to meet grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large state estimation problems within one second and achieves a near-linear speedup of 9,800 with 10,000 cores for the contingency analysis application. A performance evaluation is presented to show its effectiveness.
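
    Contingency cases are independent of one another, which is what makes the analysis embarrassingly parallel. A minimal sketch of that idea (not the software described above; solve_case is a toy stand-in for a real power-flow solution):

```python
# Minimal sketch of task-parallel N-1 contingency analysis: each contingency is an
# independent case, so cases can be distributed across cores.  The toy solve_case()
# stands in for a real power-flow solve; this is not the actual software in the record.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def solve_case(outage_id, n_bus=500, seed=42):
    """Stand-in for solving one post-contingency case; returns a severity metric."""
    rng = np.random.default_rng(seed + outage_id)
    A = rng.normal(size=(n_bus, n_bus)) + n_bus * np.eye(n_bus)  # well-conditioned toy system
    b = rng.normal(size=n_bus)
    v = np.linalg.solve(A, b)                                    # toy "state" solution
    return outage_id, float(np.abs(v).max())

if __name__ == "__main__":
    contingencies = range(100)                                   # e.g. 100 branch outages
    with ProcessPoolExecutor() as pool:
        results = dict(pool.map(solve_case, contingencies))
    worst = max(results, key=results.get)
    print(f"worst contingency: #{worst}, metric = {results[worst]:.4f}")
```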

  5. Computer-aided software development

    International Nuclear Information System (INIS)

    Teichroew, D.; Hershey, E.A. III; Yamamoto, Y.

    1978-01-01

    In recent years, as the hardware cost/capability ratio has continued to decrease and as much of the routine data processing has been computerized, the emphasis in software development has shifted from just getting systems operational to the maintenance of existing systems, reduction of duplication by integration, selective addition of new applications, systems that are more usable, maintainable, portable and reliable and to improving the productivity of software developers. This paper examines a number of trends that are changing the methods by which software is being produced and used. (Auth.)

  6. Efficient multi-objective calibration of a computationally intensive hydrologic model with parallel computing software in Python

    Science.gov (United States)

    With enhanced data availability, distributed watershed models for large areas with high spatial and temporal resolution are increasingly used to understand water budgets and examine effects of human activities and climate change/variability on water resources. Developing parallel computing software...
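
    A hedged sketch of the general pattern such software follows: candidate parameter sets of a (toy) model are evaluated in parallel and the Pareto-optimal set over two objectives is retained. The model, objectives and parameter names below are placeholders, not the actual watershed model or calibration code:

```python
# Hedged sketch: evaluate candidate parameter sets of a toy model in parallel and keep
# the Pareto-optimal set over two objectives.  Model, objectives and parameter names
# are placeholders, unrelated to the software described in the record above.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def evaluate(params):
    """Toy 'model run': returns two error objectives to minimize."""
    k, s = params
    obj1 = (k - 0.3) ** 2 + 0.1 * abs(s - 50.0)           # stand-in error metric 1
    obj2 = (s - 80.0) ** 2 / 1000.0 + 0.5 * abs(k - 0.6)  # stand-in error metric 2
    return params, (obj1, obj2)

def pareto_front(results):
    """Keep every point not dominated by another point in both objectives."""
    front = []
    for p, f in results:
        dominated = any(all(g[i] <= f[i] for i in range(2)) and g != f
                        for _, g in results)
        if not dominated:
            front.append((p, f))
    return front

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    candidates = [(float(k), float(s)) for k, s in
                  zip(rng.uniform(0.0, 1.0, 200), rng.uniform(10.0, 150.0, 200))]
    with ProcessPoolExecutor() as pool:                   # model runs are independent
        results = list(pool.map(evaluate, candidates))
    front = pareto_front(results)
    print(f"{len(front)} non-dominated parameter sets out of {len(candidates)}")
```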

  7. SU-F-I-43: A Software-Based Statistical Method to Compute Low Contrast Detectability in Computed Tomography Images

    Energy Technology Data Exchange (ETDEWEB)

    Chacko, M; Aldoohan, S [University of Oklahoma Health Sciences Center, Oklahoma City, OK (United States)

    2016-06-15

    Purpose: The low contrast detectability (LCD) of a CT scanner is its ability to detect and display faint lesions. The current approach to quantifying LCD uses vendor-specific methods and phantoms, typically by subjectively observing the smallest object at a contrast level above the phantom background. However, this approach does not yield clinically applicable values for LCD. The current study proposes a statistical LCD metric using software tools not only to assess scanner performance, but also to quantify the key factors affecting LCD. This approach was developed using uniform QC phantoms, and its applicability was then extended under simulated clinical conditions. Methods: MATLAB software was developed to compute LCD using a uniform image of a QC phantom. For a given virtual object size, the software randomly samples the image within a selected area, and uses statistical analysis based on Student's t-distribution to compute the LCD as the minimal Hounsfield Units that can be distinguished from the background at the 95% confidence level. Its validity was assessed by comparison with the behavior of a known QC phantom under various scan protocols and a tissue-mimicking phantom. The contributions of beam quality and scattered radiation to the computed LCD were quantified by using various external beam-hardening filters and phantom lengths. Results: As expected, the LCD was inversely related to object size under all scan conditions. The type of image reconstruction kernel filter and tissue/organ type strongly influenced the background noise characteristics and therefore the computed LCD for the associated image. Conclusion: The proposed metric and its associated software tools are vendor-independent and can be used to analyze any scanner's LCD performance. Furthermore, the method employed can be used in conjunction with the relationships established in this study between LCD and tissue type to extend these concepts to patients' clinical CT
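
    A minimal sketch of the statistical idea described in the Methods section: sample many square ROIs of the virtual object size from a uniform phantom image and report the smallest mean-HU offset distinguishable from background at 95% confidence using the Student t-distribution. The exact formulation used by the authors is not given, so this version is an assumption:

```python
# Minimal sketch of a statistical low-contrast-detectability estimate: sample many
# square ROIs of the virtual object size from a uniform phantom image and report the
# smallest mean-HU difference distinguishable from background at 95% confidence.
# The exact statistic used by the authors is not given, so this formulation is assumed.
import numpy as np
from scipy import stats

def lcd_estimate(image, roi_size_px, n_samples=500, confidence=0.95, seed=0):
    rng = np.random.default_rng(seed)
    h, w = image.shape
    means = np.empty(n_samples)
    for i in range(n_samples):
        r = rng.integers(0, h - roi_size_px)
        c = rng.integers(0, w - roi_size_px)
        means[i] = image[r:r + roi_size_px, c:c + roi_size_px].mean()
    # spread of ROI means -> minimal HU offset detectable at the given confidence
    t_crit = stats.t.ppf(0.5 + confidence / 2.0, df=n_samples - 1)
    return t_crit * means.std(ddof=1)

# Toy uniform "water phantom" image with Gaussian noise (HU units assumed).
phantom = np.random.default_rng(1).normal(0.0, 10.0, (512, 512))
for size in (2, 5, 10):                      # LCD falls as the object gets larger
    print(f"object {size} px: LCD ~ {lcd_estimate(phantom, size):.2f} HU")
```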

  8. CT-Based Brachytherapy Treatment Planning using Monte Carlo Simulation Aided by an Interface Software

    Directory of Open Access Journals (Sweden)

    Vahid Moslemi

    2011-03-01

    Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo based dose calculation for brachytherapy treatment planning, using interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient's anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Material and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed with the conventional MCNP geometry entry was approximately 1 hour. The discrepancy between the simulated and measured doses to point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in the rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1
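
    One step any CT-to-Monte-Carlo interface must perform is mapping Hounsfield units to voxel density and material. A hedged sketch of that step only (the calibration points and material thresholds are illustrative assumptions; writing the actual MCNP lattice and material cards is omitted):

```python
# Hedged sketch of one step a CT-to-Monte-Carlo interface must perform: converting
# Hounsfield units to mass density and a material index per voxel.  The calibration
# points and material bins below are illustrative assumptions; generating the actual
# MCNP lattice/material cards is omitted.
import numpy as np

# Piecewise-linear HU -> density calibration (assumed, illustrative values, g/cm^3).
HU_POINTS      = np.array([-1000.0, 0.0, 1000.0, 3000.0])
DENSITY_POINTS = np.array([0.001,   1.0, 1.6,    2.8])

def hu_to_density(hu):
    return np.interp(hu, HU_POINTS, DENSITY_POINTS)

def hu_to_material(hu):
    """Bin voxels into material indices: 0=air, 1=soft tissue, 2=bone (assumed thresholds)."""
    mat = np.ones_like(hu, dtype=np.int8)
    mat[hu < -300] = 0
    mat[hu > 200] = 2
    return mat

# Toy 3-slice CT stack in HU.
ct = np.random.default_rng(0).integers(-1000, 1500, size=(3, 64, 64)).astype(float)
density = hu_to_density(ct)
material = hu_to_material(ct)
print("density range (g/cm^3):", density.min().round(3), "-", density.max().round(3))
print("voxels per material:", np.bincount(material.ravel().astype(np.int64)))
```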

  9. Computer-Assisted School Facility Planning with ONPASS.

    Science.gov (United States)

    Urban Decision Systems, Inc., Los Angeles, CA.

    The analytical capabilities of ONPASS, an on-line computer-aided school facility planning system, are described by its developers. This report describes how, using the Canoga Park-Winnetka-Woodland Hills Planning Area as a test case, the Department of City Planning of the city of Los Angeles employed ONPASS to demonstrate how an on-line system can…

  10. Analysis of chromium-51 release assay data using personal computer spreadsheet software

    International Nuclear Information System (INIS)

    Lefor, A.T.; Steinberg, S.M.; Wiebke, E.A.

    1988-01-01

    The Chromium-51 release assay is a widely used technique to assess the lysis of labeled target cells in vitro. We have developed a simple technique to analyze data from Chromium-51 release assays using the widely available LOTUS 1-2-3 spreadsheet software. This package calculates percentage specific cytotoxicity and lytic units by linear regression. It uses all data points to compute the linear regression and can determine if there is a statistically significant difference between two lysis curves. The system is simple to use and easily modified, since its implementation requires neither knowledge of computer programming nor custom designed software. This package can help save considerable time when analyzing data from Chromium-51 release assays
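
    A minimal sketch of the calculations such a spreadsheet automates: percent specific cytotoxicity from experimental, spontaneous and maximum release counts, plus a linear regression of lysis against log effector-to-target ratio. The example counts and the 20%-lysis lytic-unit convention are assumptions, not the original LOTUS 1-2-3 template:

```python
# Minimal sketch of the calculations such a spreadsheet automates: percent specific
# cytotoxicity from experimental (E), spontaneous (S) and maximum (M) 51Cr release,
# plus a linear regression of lysis vs log10(effector:target ratio).  The example
# counts and the 20%-lysis convention are assumptions, not the original template.
import numpy as np
from scipy import stats

def percent_specific_lysis(exp_cpm, spont_cpm, max_cpm):
    return 100.0 * (exp_cpm - spont_cpm) / (max_cpm - spont_cpm)

# Assumed example data: counts (cpm) at four effector:target ratios.
et_ratios = np.array([50.0, 25.0, 12.5, 6.25])
exp_cpm   = np.array([4200.0, 3300.0, 2500.0, 1900.0])
spont_cpm, max_cpm = 900.0, 6100.0

lysis = percent_specific_lysis(exp_cpm, spont_cpm, max_cpm)
fit = stats.linregress(np.log10(et_ratios), lysis)

# Effector:target ratio predicted to give 20% lysis (one common lytic-unit convention).
et_20 = 10 ** ((20.0 - fit.intercept) / fit.slope)
print("percent specific lysis:", np.round(lysis, 1))
print(f"fit: slope={fit.slope:.1f}, r={fit.rvalue:.3f}, E:T for 20% lysis ~ {et_20:.1f}")
```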

  11. Physics detector simulation facility system software description

    International Nuclear Information System (INIS)

    Allen, J.; Chang, C.; Estep, P.; Huang, J.; Liu, J.; Marquez, M.; Mestad, S.; Pan, J.; Traversat, B.

    1991-12-01

    Large and costly detectors will be constructed during the next few years to study the interactions produced by the SSC. Efficient, cost-effective designs for these detectors will require careful thought and planning. Because it is not possible to fully test a proposed design in a scaled-down version, the adequacy of a proposed design will be determined by a detailed computer model of the detectors. Physics and detector simulations will be performed on the computer model using the high-powered computing system at the Physics Detector Simulation Facility (PDSF). The SSCL has particular computing requirements for high-energy physics (HEP) Monte Carlo calculations for the simulation of SSCL physics and detectors. The numerical calculations to be performed in each simulation are lengthy and detailed; they could require many months per run on a VAX 11/780 computer and may produce several gigabytes of data per run. Consequently, a distributed computing environment of several networked high-speed computing engines is envisioned to meet these needs. These networked computers will form the basis of a centralized facility for SSCL physics and detector simulation work. Our computer planning groups have determined that the most efficient, cost-effective way to provide these high-performance computing resources at this time is with RISC-based UNIX workstations. The modeling and simulation application software that will run on the computing system is usually written by physicists in the FORTRAN language and may need thousands of hours of supercomputing time. The system software is the "glue" which integrates the distributed workstations and allows them to be managed as a single entity. This report addresses the computing strategy for the SSC.

  12. Cluster implementation for parallel computation within MATLAB software environment

    International Nuclear Information System (INIS)

    Santana, Antonio O. de; Dantas, Carlos C.; Charamba, Luiz G. da R.; Souza Neto, Wilson F. de; Melo, Silvio B. Melo; Lima, Emerson A. de O.

    2013-01-01

    A cluster for parallel computation with MATLAB software, the COCGT (Cluster for Optimizing Computing in Gamma ray Transmission methods), is implemented. The implementation corresponds to the creation of a local network of computers, facilities and software configurations, as well as cluster tests to determine and optimize performance in data processing. The COCGT implementation was required for the computation of data from gamma transmission measurements applied to fluid dynamics and tomography reconstruction in an FCC (Fluid Catalytic Cracking) cold pilot unit, as well as simulation data. As an initial test, the determination of the SVD (Singular Value Decomposition) of a random matrix with dimension (n, n), n=1000, using a modified Girko's law, revealed that the COCGT was faster than the cluster reported in the literature [1], which is similar and operates under the same conditions. Solution of a system of linear equations provided a further test of COCGT performance: processing a square matrix with n=10000 took 27 s, and a square matrix with n=12000 took 45 s. To determine the cluster behavior with respect to 'parfor' (parallel for-loop) and 'spmd' (single program multiple data), two codes containing those two commands were applied to the same problem: determination of the SVD of a square matrix with n=1000. Execution of the codes on the COCGT showed that: 1) for the code with 'parfor', performance improved with the number of labs from 1 to 8; 2) for the code with 'spmd', just 1 lab (core) was enough to process and give results in less than 1 s. A similar situation was then examined, with the difference that the SVD was determined from a square matrix with n=1500 for the code with 'parfor', and n=7000 for the code with 'spmd'. Those results lead to the conclusions: 1) for the code with 'parfor', the behavior was the same as described above; 2) for the code with 'spmd', besides producing larger performance, it supports a

  13. Architecture of the software for LAMOST fiber positioning subsystem

    Science.gov (United States)

    Peng, Xiaobo; Xing, Xiaozheng; Hu, Hongzhuan; Zhai, Chao; Li, Weimin

    2004-09-01

    The architecture of the software which controls the LAMOST fiber positioning sub-system is described. The software is composed of two parts: a main control program in a computer and a unit controller program in an MCS51 single-chip microcomputer ROM. The functions of the software include: Client/Server model establishment, observation planning, collision handling, data transmission, pulse generation, CCD control, image capture and processing, and data analysis. Particular attention is paid to the ways in which different parts of the software communicate. Software techniques for multithreading, SOCKET programming, Microsoft Windows message response, and serial communications are also discussed.
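
    A toy sketch of the Client/Server and SOCKET programming elements mentioned above: a TCP server accepts a single positioning command and acknowledges it. The command format, host and port are assumptions and are unrelated to the real LAMOST control software:

```python
# Minimal sketch of the Client/Server + socket idea mentioned above: a toy TCP server
# accepts one "move fiber" command and acknowledges it.  The command format, port and
# host are assumptions, unrelated to the real LAMOST software.
import socket
import threading

HOST, PORT = "127.0.0.1", 5050
ready = threading.Event()

def server():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                                   # signal that the server is listening
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(1024).decode()            # e.g. "MOVE unit=17 r=1.25 theta=0.40"
            print("server received:", cmd)
            conn.sendall(b"ACK " + cmd.split()[1].encode())

def client():
    ready.wait()                                      # do not connect before the server is up
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"MOVE unit=17 r=1.25 theta=0.40")
        print("client got reply:", cli.recv(1024).decode())

t = threading.Thread(target=server)
t.start()
client()
t.join()
```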

  14. CRYSNET manual. Informal report. [Hardware and software of crystallographic computing network

    Energy Technology Data Exchange (ETDEWEB)

    None,

    1976-07-01

    This manual describes the hardware and software which together make up the crystallographic computing network (CRYSNET). The manual is intended as a users' guide and also provides general information for persons without any experience with the system. CRYSNET is a network of intelligent remote graphics terminals that are used to communicate with the CDC Cyber 70/76 computing system at the Brookhaven National Laboratory (BNL) Central Scientific Computing Facility. Terminals are in active use by four research groups in the field of crystallography. A protein data bank has been established at BNL to store in machine-readable form atomic coordinates and other crystallographic data for macromolecules. The bank currently includes data for more than 20 proteins. This structural information can be accessed at BNL directly by the CRYSNET graphics terminals. More than two years of experience has been accumulated with CRYSNET. During this period, it has been demonstrated that the terminals, which provide access to a large, fast third-generation computer, plus stand-alone interactive graphics capability, are useful for computations in crystallography, and in a variety of other applications as well. The terminal hardware, the actual operations of the terminals, and the operations of the BNL Central Facility are described in some detail, and documentation of the terminal and central-site software is given. (RWR)

  15. Software functions for safe operation - learning from Sizewell-B

    International Nuclear Information System (INIS)

    Welbourne, D.

    1996-01-01

    Future nuclear plants will use computer-based systems extensively. Regulatory acceptance must be planned and not underestimated. Commercial software packages will simplify it, but costly analysis and demonstration may be needed. Multiplexed control needs preparation of extensive configuration data and careful checking. On-screen soft control will need consideration of the integrity of the control path. Display design should follow human factors analysis of the operators' needs, and display layout needs great care for clarity. Computer-based system with planned quality will then bring great benefits in safe operation. (author) 1 fig., 3 refs

  16. COMPUTATIONAL MODELLING OF BUFFETING EFFECTS USING OPENFOAM SOFTWARE PACKAGE

    Directory of Open Access Journals (Sweden)

    V. T. Kalugin

    2015-01-01

    In this paper, the preliminary results of computational modeling of an aircraft with the airbrake deployed are presented. The calculations were performed with the OpenFOAM software package. The results outlined are part of a research project to optimise aircraft performance using a perforated airbrake. Within this stage of the project, the OpenFOAM software package with a hybrid RANS-LES approach was tested with respect to a given configuration of the aircraft and airbrake, and then compared with the test data. For the worst case, the amplitude of the peak force acting on the tail fin can be up to 6 times higher than the average value without the airbrake deployed. To reduce unsteady loads acting on the tail fin, perforation of the airbrake was proposed.

  17. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory Commission. ACTION: Draft... Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1207 is proposed Revision 1 of... for Digital Computer Software Used in Safety Systems of Nuclear Power Plants'' is temporarily...

  18. An Educational Software for Simulating the Sample Size of Molecular Marker Experiments

    Science.gov (United States)

    Helms, T. C.; Doetkott, C.

    2007-01-01

    We developed educational software to show graduate students how to plan molecular marker experiments. These computer simulations give the students feedback on the precision of their experiments. The objective of the software was to show students using a hands-on approach how: (1) environmental variation influences the range of the estimates of the…

  19. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    Science.gov (United States)

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  20. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    Science.gov (United States)

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volumetric modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary, and delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  1. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  2. Consolidated Cab Display (CCD) System, Project Planning Document (PPD),

    Science.gov (United States)

    1981-02-01

    [Residue of a 1980-1981 monthly project schedule chart.] 12. Software Documentation: a. Overall Computer Program Description (OCPD); b. Approve OCPD; c. Computer Program Functional Specifications (CPFS); d. Data Base Table Design Specification (DBTDS); e. Software Interface Control Document; ... Parts List; Master Pattern and Plan View Reproducible Drawings; Instruction Book; Training Aids/Materials. b. Software: OCPD, CPFS, SICD, PDS, DBTDS, SDD

  3. Architecture-driven Migration of Legacy Systems to Cloud-enabled Software

    DEFF Research Database (Denmark)

    Ahmad, Aakash; Babar, Muhammad Ali

    2014-01-01

    of legacy systems to cloud computing. The framework leverages software reengineering concepts that aim to recover the architecture from legacy source code. The framework then exploits software evolution concepts to support architecture-driven migration of legacy systems to cloud-based architectures. The Legacy-to-Cloud Migration Horseshoe comprises four processes: (i) architecture migration planning, (ii) architecture recovery and consistency, (iii) architecture transformation and (iv) architecture-based development of cloud-enabled software. We aim to discover, document and apply the migration...

  4. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit's (GPU's) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix-transposing application in CUDA. One particular interest was to research how well the optimization techniques, applied to a software application written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature in this type of optimization analysis, but none of the works so far (to our best knowledge) has tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance improving techniques.

  5. A state-of-the-art report on software operation structure of the digital control computer system

    International Nuclear Information System (INIS)

    Kim, Bong Kee; Lee, Kyung Hoh; Joo, Jae Yoon; Jang, Yung Woo; Shin, Hyun Kook

    1994-06-01

    CANDU Nuclear Power Plants including Wolsong 1 and 2/3/4 are controlled by a real-time plant control computer system. This report was written to provide an overview on the station control computer software which belongs to one of the most advanced real-time computing application area, along with the Fuel Handling Machine design concepts. The combination of well designed control computer and Fuel Handling Machine allow changing fuel bundles while the plant is in operation. Design methodologies and software structure are discussed along with the interface between the two systems. 29 figs., 2 tabs., 20 refs. (Author)

  6. A state-of-the-art report on software operation structure of the digital control computer system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Bong Kee; Lee, Kyung Hoh; Joo, Jae Yoon; Jang, Yung Woo; Shin, Hyun Kook [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    CANDU Nuclear Power Plants including Wolsong 1 and 2/3/4 are controlled by a real-time plant control computer system. This report was written to provide an overview on the station control computer software which belongs to one of the most advanced real-time computing application area, along with the Fuel Handling Machine design concepts. The combination of well designed control computer and Fuel Handling Machine allow changing fuel bundles while the plant is in operation. Design methodologies and software structure are discussed along with the interface between the two systems. 29 figs., 2 tabs., 20 refs. (Author).

  7. Company's Unusual Plan to Package Commercial Software with Business Textbooks Produces a Measure of Success.

    Science.gov (United States)

    Watkins, Beverly T.

    1992-01-01

    Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)

  8. Sustainable embedded software lifecycle planning

    OpenAIRE

    Lee, Dong-Hyun; In, Hoh Peter; Lee, Keun; Park, Sooyong; Hinchey, Mike

    2012-01-01

    Time-to-market is a crucial factor in increasing market share in the consumer electronics (CE) market. Furthermore, fierce competition in the market tends to sharply lower the prices of brand-new CE products as soon as they are released. Software-intensive embedded system design methods such as hardware/software co-design have been studied with the goal of reducing development lead-time by designing hardware and software simultaneously. Many researchers, however, concentra...

  9. An integrated approach for requirement selection and scheduling in software release planning

    NARCIS (Netherlands)

    Li, C.; van den Akker, Marjan; Brinkkemper, Sjaak; Diepen, Guido

    2010-01-01

    It is essential for product software companies to decide which requirements should be included in the next release and to make an appropriate time plan of the development project. Compared to the extensive research done on requirement selection, very little research has been performed on time
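
    The selection half of the problem can be illustrated as a 0/1 knapsack: choose the subset of requirements that maximizes value within the available capacity. This is only an illustrative sketch; the paper's integrated model also covers scheduling and time planning, which is omitted here, and the requirement data are invented:

```python
# Illustrative sketch of the selection side of release planning as a 0/1 knapsack:
# pick the subset of requirements that maximizes value within the team capacity.
# The paper's integrated model also covers scheduling/time planning, omitted here.
def select_requirements(reqs, capacity):
    """reqs: list of (name, effort, value) with integer efforts; classic DP knapsack."""
    best = [[0] * (capacity + 1) for _ in range(len(reqs) + 1)]
    for i, (_, effort, value) in enumerate(reqs, start=1):
        for c in range(capacity + 1):
            best[i][c] = best[i - 1][c]
            if effort <= c:
                best[i][c] = max(best[i][c], best[i - 1][c - effort] + value)
    chosen, c = [], capacity
    for i in range(len(reqs), 0, -1):            # trace back the chosen subset
        if best[i][c] != best[i - 1][c]:
            name, effort, _ = reqs[i - 1]
            chosen.append(name)
            c -= effort
    return best[len(reqs)][capacity], list(reversed(chosen))

# Assumed example data: (requirement, effort in person-days, estimated value).
requirements = [("single sign-on", 30, 70), ("export to PDF", 10, 30),
                ("audit log", 20, 45), ("dark mode", 15, 20), ("REST API", 40, 90)]
value, plan = select_requirements(requirements, capacity=70)
print("selected:", plan, "| total value:", value)
```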

  10. FREE SOFTWARE IN ELECTRONIC LEARNING FUTURE TEACHERS OF MATHEMATICS, PHYSICS AND COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Vladyslav Ye. Velychko

    2016-05-01

    The use of free software is much more popular in the IT industry than in educational activities. The disadvantages of free software and the problems of its implementation in the educational process are limiting factors for its use in the education system; however, openness, accessibility and functionality are the main factors favoring the introduction of free software into the educational process. Nevertheless, because of the specific way it is created, free software is particularly well suited to future teachers of mathematics, physics and informatics, and therefore there is a need for a systematic analysis of the possibilities of using open source software in e-learning for future teachers of mathematics, physics and computer science.

  11. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  12. A Multi-Time Scale Morphable Software Milieu for Polymorphous Computing Architectures (PCA) - Composable, Scalable Systems

    National Research Council Canada - National Science Library

    Skjellum, Anthony

    2004-01-01

    Polymorphous Computing Architectures (PCA) rapidly "morph" (reorganize) software and hardware configurations in order to achieve high performance on computation styles ranging from specialized streaming to general threaded applications...

  13. Treatment planning in radiosurgery: parallel Monte Carlo simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Scielzo, G [Galliera Hospitals, Genova (Italy). Dept. of Hospital Physics; Grillo Ruggieri, F [Galliera Hospitals, Genova (Italy) Dept. for Radiation Therapy; Modesti, M; Felici, R [Electronic Data System, Rome (Italy); Surridge, M [University of South Hampton (United Kingdom). Parallel Apllication Centre

    1995-12-01

    The main objective of this research was to evaluate the possibility of direct Monte Carlo simulation for accurate dosimetry with short computation time. We made use of: a graphics workstation; a linear accelerator; water, PMMA and anthropomorphic phantoms, for validation purposes; ionometric, film and thermoluminescent techniques, for dosimetry; and a treatment planning system for comparison. Benchmarking results suggest that short computing times can be obtained with the parallel version of EGS4 that was developed. Parallelism was obtained by assigning simulation incident photons to separate processors, and the development of a parallel random number generator was necessary. Validation consisted of phantom irradiation; comparison of predicted and measured values showed good agreement in PDD and dose profiles. Experiments on anthropomorphic phantoms (with inhomogeneities) were carried out, and these values are being compared with results obtained with the conventional treatment planning system.
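
    A hedged sketch of the parallelization idea (not the EGS4 code itself): photon histories are split across worker processes, each drawing from an independent random-number stream, here spawned with NumPy's SeedSequence. The toy tally and mean free path are assumptions:

```python
# Hedged sketch of the parallelization idea (not the EGS4 code itself): photon
# histories are split across worker processes, each with its own independent
# random-number stream obtained from numpy's SeedSequence.spawn.
from concurrent.futures import ProcessPoolExecutor
import numpy as np

def run_histories(args):
    """Toy 'dose' tally: exponential attenuation depths for a batch of photons."""
    seed_seq, n_histories = args
    rng = np.random.default_rng(seed_seq)                    # independent stream per worker
    depths = rng.exponential(scale=3.0, size=n_histories)    # mean free path assumed 3 cm
    return np.histogram(depths, bins=50, range=(0.0, 30.0))[0]

if __name__ == "__main__":
    n_workers, n_total = 8, 4_000_000
    streams = np.random.SeedSequence(12345).spawn(n_workers)
    jobs = [(s, n_total // n_workers) for s in streams]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        tallies = sum(pool.map(run_histories, jobs))         # combine partial tallies
    print("total histories:", tallies.sum(), "| peak bin:", tallies.argmax())
```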

  14. Computer-Aided Experiment Planning toward Causal Discovery in Neuroscience.

    Science.gov (United States)

    Matiasz, Nicholas J; Wood, Justin; Wang, Wei; Silva, Alcino J; Hsu, William

    2017-01-01

    Computers help neuroscientists to analyze experimental results by automating the application of statistics; however, computer-aided experiment planning is far less common, due to a lack of similar quantitative formalisms for systematically assessing evidence and uncertainty. While ontologies and other Semantic Web resources help neuroscientists to assimilate required domain knowledge, experiment planning requires not only ontological but also epistemological (e.g., methodological) information regarding how knowledge was obtained. Here, we outline how epistemological principles and graphical representations of causality can be used to formalize experiment planning toward causal discovery. We outline two complementary approaches to experiment planning: one that quantifies evidence per the principles of convergence and consistency, and another that quantifies uncertainty using logical representations of constraints on causal structure. These approaches operationalize experiment planning as the search for an experiment that either maximizes evidence or minimizes uncertainty. Despite work in laboratory automation, humans must still plan experiments and will likely continue to do so for some time. There is thus a great need for experiment-planning frameworks that are not only amenable to machine computation but also useful as aids in human reasoning.

  15. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    Science.gov (United States)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers, and to aid in discoverability of software and scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications from the past 5 years that utilize scientific software available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software either through citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers to find the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results through CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that lack of knowledge, tools, and workflows to cite codes are barriers to effectively implementing the emerging citation norms. Generated on-demand attributions on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.

  16. Preliminary Validation and Verification Plan for CAREM Reactor Protection System

    International Nuclear Information System (INIS)

    Fittipaldi, Ana; Maciel Felix

    2000-01-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either custom-designed systems or systems based on commercial modules such as last-generation redundant programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan.

  17. Software Defects, Scientific Computation and the Scientific Method

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    Computation has rapidly grown in the last 50 years so that in many scientific areas it is the dominant partner in the practice of science. Unfortunately, unlike the experimental sciences, it does not adhere well to the principles of the scientific method as espoused by, for example, the philosopher Karl Popper. Such principles are built around the notions of deniability and reproducibility. Although much research effort has been spent on measuring the density of software defects, much less has been spent on the more difficult problem of measuring their effect on the output of a program. This talk explores these issues with numerous examples suggesting how this situation might be improved to match the demands of modern science. Finally it develops a theoretical model based on an amalgam of statistical mechanics and Hartley/Shannon information theory which suggests that software systems have strong implementation independent behaviour and supports the widely observed phenomenon that defects clust...

  18. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. Optimal reliability allocation for large software projects through soft computing techniques

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Popentiu-Vladicescu, Florin

    2012-01-01

    or maximizing the system reliability subject to budget constraints. These kinds of optimization problems were considered both in deterministic and stochastic frameworks in literature. Recently, the intuitionistic-fuzzy optimization approach was considered as a soft computing successful modelling approach....... Firstly, a review on existing soft computing approaches to optimization is given. The main section extends the results considering self-organizing migrating algorithms for solving intuitionistic-fuzzy optimization problems attached to complex fault-tolerant software architectures which proved...

  1. Future plans for HEP computing in the US

    International Nuclear Information System (INIS)

    Ballam, J.

    1985-06-01

    The computing requirements of the US HEP community are set forth. These will be dominated in the next five years by the p anti-p (TEV I) and e+e- (SLC and CESR) experiments. The ensuing period will be almost completely driven by the data generated by the Superconducting Super Collider (SSC). Plans for near-term computing are presented along with speculations for the SSC. Brief descriptions of accelerator and theoretical physics plans are also presented

  2. Planning is not sufficient - Reliable computers need good requirements specifications

    International Nuclear Information System (INIS)

    Matras, J.R.

    1992-01-01

    Computer system reliability is the assurance that a computer system will perform its functions when required to do so. To ensure such reliability, it is important to plan the activities needed for computer system development. These development activities, in turn, require a Computer Quality Assurance Plan (CQAP) that provides the following: a Configuration Management Plan, a Verification and Validation (V and V) Plan, documentation requirements, a defined life cycle, review requirements, and organizational responsibilities. These items are necessary for system reliability; ultimately, however, they are not enough. Development of a reliable system is dependent on the requirements specification. This paper discusses how to use existing industry standards to develop a CQAP. In particular, the paper emphasizes the importance of the requirements specification and of methods for establishing reliability goals. The paper also describes how the revision of ANSI/IEEE-ANS-7-4.3.2, Application Criteria for Digital Computer Systems of Nuclear Power Generating Stations, has addressed these issues

  3. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    Science.gov (United States)

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  4. High performance computing and communications: FY 1996 implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-05-16

    The High Performance Computing and Communications (HPCC) Program was formally authorized by passage of the High Performance Computing Act of 1991, signed on December 9, 1991. Twelve federal agencies, in collaboration with scientists and managers from US industry, universities, and research laboratories, have developed the Program to meet the challenges of advancing computing and associated communications technologies and practices. This plan provides a detailed description of the agencies' HPCC implementation plans for FY 1995 and FY 1996. This Implementation Plan contains three additional sections. Section 3 provides an overview of the HPCC Program definition and organization. Section 4 contains a breakdown of the five major components of the HPCC Program, with an emphasis on the overall directions and milestones planned for each one. Section 5 provides a detailed look at HPCC Program activities within each agency.

  5. Computational Science And Engineering Software Sustainability And Productivity (CSESSP) Challenges Workshop Report

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This report details the challenges and opportunities discussed at the NITRD sponsored multi-agency workshop on Computational Science and Engineering Software...

  6. Decision support software technology demonstration plan

    Energy Technology Data Exchange (ETDEWEB)

    SULLIVAN,T.; ARMSTRONG,A.

    1998-09-01

    The performance evaluation of innovative and alternative environmental technologies is an integral part of the US Environmental Protection Agency's (EPA) mission. Early efforts focused on evaluating technologies that supported the implementation of the Clean Air and Clean Water Acts. In 1986 the Agency began to demonstrate and evaluate the cost and performance of remediation and monitoring technologies under the Superfund Innovative Technology Evaluation (SITE) program (in response to the mandate in the Superfund Amendments and Reauthorization Act of 1986 (SARA)). In 1990, the US Technology Policy was announced. This policy placed a renewed emphasis on making the best use of technology in achieving the national goals of improved quality of life for all Americans, continued economic growth, and national security. In the spirit of the technology policy, the Agency began to direct a portion of its resources toward the promotion, recognition, acceptance, and use of US-developed innovative environmental technologies both domestically and abroad. Decision Support Software (DSS) packages integrate environmental data and simulation models into a framework for making site characterization, monitoring, and cleanup decisions. To limit the scope which will be addressed in this demonstration, three endpoints have been selected for evaluation: Visualization; Sample Optimization; and Cost/Benefit Analysis. Five topics are covered in this report: the objectives of the demonstration; the elements of the demonstration plan; an overview of the Site Characterization and Monitoring Technology Pilot; an overview of the technology verification process; and the purpose of this demonstration plan.

  7. A Roadmap for HEP Software and Computing R&D for the 2020s

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Antonio Augusto, Jr; et al.

    2017-12-18

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

  8. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study

    OpenAIRE

    Ellis, Mary Kay

    1993-01-01

    Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently in research is the Computer Aided Prototyping System (CAPS) managed under the Computer Science department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...

  9. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  10. USERDA computer software summaries: numbers 240 through 324

    International Nuclear Information System (INIS)

    1976-12-01

    Since 1960 the Argonne Code Center has served as a U.S. Atomic Energy Commission information center for computer programs developed and used primarily for the solution of problems in nuclear physics, reactor design, reactor engineering and operation. The Center, through a network of registered installations, collects, validates, maintains, and distributes a library of these computer programs and publishes a compilation of abstracts describing them. In 1972 the scope of the Center's activities was officially expanded to include computer programs developed in all of the U.S. Atomic Energy Commission program areas and the compilation and publication of this report. The Computer Software Summary report contains summaries of computer programs at the specification stage, under development, being checked out, in use, or available at ERDA offices, laboratories, and contractor installations. Programs are divided into the following categories: cross section and resonance integral calculations; spectrum calculations, generation of group constants, lattice and cell problems; static design studies; depletion, fuel management, cost analysis, and reactor economics; space-independent kinetics; space-time kinetics, coupled neutronics-hydrodynamics-thermodynamics and excursion simulations; radiological safety, hazard and accident analysis; heat transfer and fluid flow; deformation and stress distribution computations, structural analysis and engineering design studies; gamma heating and shield design programs; reactor systems analysis; data preparation; data management; subsidiary calculations; experimental data processing; general mathematical and computing system routines; materials; environmental and earth sciences; space sciences; electronics and engineering equipment; chemistry; particle accelerators and high-voltage machines; physics; controlled thermonuclear research; biology and medicine; and data

  11. Technical Note: SCUDA: A software platform for cumulative dose assessment

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seyoun; McNutt, Todd; Quon, Harry; Wong, John; Lee, Junghoon, E-mail: rshekhar@childrensnational.org, E-mail: junghoon@jhu.edu [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, Maryland 21231 (United States); Plishker, William [IGI Technologies, Inc., College Park, Maryland 20742 (United States); Shekhar, Raj, E-mail: rshekhar@childrensnational.org, E-mail: junghoon@jhu.edu [IGI Technologies, Inc., College Park, Maryland 20742 and Sheikh Zayed Institute for Pediatric Surgical Innovation, Children’s National Health System, Washington, DC 20010 (United States)

    2016-10-15

    Purpose: Accurate tracking of anatomical changes and computation of actually delivered dose to the patient are critical for successful adaptive radiation therapy (ART). Additionally, efficient data management and fast processing are practically important for the adoption in clinic as ART involves a large amount of image and treatment data. The purpose of this study was to develop an accurate and efficient Software platform for CUmulative Dose Assessment (SCUDA) that can be seamlessly integrated into the clinical workflow. Methods: SCUDA consists of deformable image registration (DIR), segmentation, dose computation modules, and a graphical user interface. It is connected to our image PACS and radiotherapy informatics databases from which it automatically queries/retrieves patient images, radiotherapy plan, beam data, and daily treatment information, thus providing an efficient and unified workflow. For accurate registration of the planning CT and daily CBCTs, the authors iteratively correct CBCT intensities by matching local intensity histograms during the DIR process. Contours of the target tumor and critical structures are then propagated from the planning CT to daily CBCTs using the computed deformations. The actual delivered daily dose is computed using the registered CT and patient setup information by a superposition/convolution algorithm, and accumulated using the computed deformation fields. Both DIR and dose computation modules are accelerated by a graphics processing unit. Results: The cumulative dose computation process has been validated on 30 head and neck (HN) cancer cases, showing 3.5 ± 5.0 Gy (mean±STD) absolute mean dose differences between the planned and the actually delivered doses in the parotid glands. On average, DIR, dose computation, and segmentation take 20 s/fraction and 17 min for a 35-fraction treatment including additional computation for dose accumulation. Conclusions: The authors developed a unified software platform that provides
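
    The dose-accumulation step described above (warping each fraction's dose back to the planning CT with the DIR deformation field and summing) can be sketched in a few lines. The following minimal Python sketch assumes precomputed per-fraction dose grids and voxel-displacement fields; the names, array shapes, and interpolation choice are illustrative assumptions, not SCUDA's actual implementation.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def accumulate_dose(fraction_doses, deformation_fields):
            """Warp each daily dose onto the planning-CT grid and sum.

            fraction_doses     : list of 3D arrays, dose computed on each daily CBCT grid
            deformation_fields : list of arrays with shape (3, nz, ny, nx), giving the
                                 displacement (in voxels) from planning-CT to daily-CBCT space
            """
            total = np.zeros_like(fraction_doses[0], dtype=float)
            grid = np.indices(total.shape).astype(float)   # planning-CT voxel coordinates
            for dose, dvf in zip(fraction_doses, deformation_fields):
                coords = grid + dvf                        # where each planning voxel maps to
                # Pull the daily dose back onto the planning grid (trilinear interpolation)
                total += map_coordinates(dose, coords, order=1, mode="nearest")
            return total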

  12. 5th Annual Provider Software Buyer's Guide.

    Science.gov (United States)

    1995-03-01

    To help long term care providers find new ways to improve quality of care and efficiency, PROVIDER presents the fifth annual listing of software firms marketing computer programs for all areas of long term care operations. On the following five pages, more than 70 software firms display their wares, with programs such as minimum data set and care planning, dietary, accounting and financials, case mix, and medication administration records. The guide also charts compatible hardware, integration ability, telephone numbers, company contacts, and easy-to-use reader service numbers.

  13. Using networking and communications software in business

    CERN Document Server

    McBride, PK

    2014-01-01

    Using Networking and Communications Software in Business covers the importance of networks in a business firm, the benefits of computer communications within a firm, and the cost-benefit in putting up networks in businesses. The book is divided into six parts. Part I looks into the nature and varieties of networks, networking standards, and network software. Part II discusses the planning of a networked system, which includes analyzing the requirements for the network system, the hardware for the network, and network management. The installation of the network system and the network managemen

  14. Overview of Hazard Assessment and Emergency Planning Software of Use to RN First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Waller, E; Millage, K; Blakely, W F; Ross, J A; Mercier, J R; Sandgren, D J; Levine, I H; Dickerson, W E; Nemhauser, J B; Nasstrom, J S; Sugiyama, G; Homann, S; Buddemeier, B R; Curling, C A; Disraelly, D S

    2008-08-26

    There are numerous software tools available for field deployment, reach-back, training and planning use in the event of a radiological or nuclear (RN) terrorist event. Specialized software tools used by CBRNe responders can increase information available and the speed and accuracy of the response, thereby ensuring that radiation doses to responders, receivers, and the general public are kept as low as reasonably achievable. Software designed to provide health care providers with assistance in selecting appropriate countermeasures or therapeutic interventions in a timely fashion can improve the potential for positive patient outcome. This paper reviews various software applications of relevance to radiological and nuclear (RN) events that are currently in use by first responders, emergency planners, medical receivers, and criminal investigators.

  15. Preliminary Validation and Verification Plan for CAREM Reactor Protection System; Modelo de Plan Preliminar de Validacion y Verificacion para el Sistema de Proteccion del Reactor CAREM

    Energy Technology Data Exchange (ETDEWEB)

    Fittipaldi, Ana; Felix, Maciel [Comision Nacional de Energia Atomica, Centro Atomico Bariloche (Argentina)

    2000-07-01

    The purpose of this paper is to present a preliminary validation and verification plan for a particular architecture proposed for the CAREM reactor protection system with software modules (a computer-based system). These software modules can be either systems of in-house design or systems based on commercial modules such as redundant, latest-generation programmable logic controllers (PLCs). During this study, it was seen that this plan can also be used as a validation and verification plan for commercial products (COTS, commercial off the shelf) and/or smart transmitters. The proposed software life cycle and its features are presented, as well as the advantages of the preliminary validation and verification plan.

  16. Comparison of two three-dimensional cephalometric analysis computer software

    OpenAIRE

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-01-01

    Background: Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Materials and Methods: Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (Unive...

  17. Significance of preoperative planning software for puncture and channel establishment in percutaneous endoscopic lumbar DISCECTOMY: A study of 40 cases.

    Science.gov (United States)

    Hu, Zhouyang; Li, Xinhua; Cui, Jian; He, Xiaobo; Li, Cong; Han, Yingchao; Pan, Jie; Yang, Mingjie; Tan, Jun; Li, Lijun

    2017-05-01

    Preoperative planning software has been widely used in many other minimally invasive surgeries, but there is a lack of information describing the clinical benefits of existing software applied in percutaneous endoscopic lumbar discectomy (PELD). This study aimed to compare the clinical efficacy of preoperative planning software for puncture and channel establishment in PELD with routine methods in treating lumbar disc herniation (LDH). From June 2016 to October 2016, 40 patients who had single L4/5 or L5/S1 disc herniation were divided into two groups. Group A used planning software for preoperative puncture simulation, while Group B relied on routine case discussion to make puncture plans. The channel establishment time, operative time, fluoroscopic times and complications were compared between the two groups. The surgical efficacy was evaluated according to the Visual Analogue Scale (VAS), Oswestry Disability Index (ODI) and modified Macnab's criteria. The mean channel establishment time was 25.1 ± 4.2 min and 34.6 ± 5.4 min in Groups A and B, respectively (P < 0.05). The findings of modified Macnab's criteria at each follow-up also showed no significant differences (P > 0.05). The application of preoperative planning software to puncture and cannula insertion planning in PELD was easy and reliable, and significantly reduced the channel establishment time, operative time and fluoroscopic times. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.

  18. Z-Plant material information tracking system (ZMITS) software development and integration project management plan

    International Nuclear Information System (INIS)

    IBSEN, T.G.

    1999-01-01

    This document plans for software and interface development governing the implementation of ZMITS and other supporting systems necessary to manage information for material stabilization needs of the Project Hanford Management Contract (PHMC)

  19. A Study of the Use of Ontologies for Building Computer-Aided Control Engineering Self-Learning Educational Software

    Science.gov (United States)

    García, Isaías; Benavides, Carmen; Alaiz, Héctor; Alonso, Angel

    2013-01-01

    This paper describes research on the use of knowledge models (ontologies) for building computer-aided educational software in the field of control engineering. Ontologies are able to represent in the computer a very rich conceptual model of a given domain. This model can be used later for a number of purposes in different software applications. In…

  20. Madagascar: open-source software project for multidimensional data analysis and reproducible computational experiments

    Directory of Open Access Journals (Sweden)

    Sergey Fomel

    2013-11-01

    Full Text Available The Madagascar software package is designed for analysis of large-scale multidimensional data, such as those occurring in exploration geophysics. Madagascar provides a framework for reproducible research. By “reproducible research” we refer to the discipline of attaching software codes and data to computational results reported in publications. The package contains a collection of (a) computational modules, (b) data-processing scripts, and (c) research papers. Madagascar is distributed on SourceForge under a GPL v2 license https://sourceforge.net/projects/rsf/. By October 2013, more than 70 people from different organizations around the world have contributed to the project, with increasing year-to-year activity. The Madagascar website is http://www.ahay.org/.
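
    For readers unfamiliar with the package, a reproducible Madagascar workflow is normally expressed as an SConstruct script built on the rsf.proj rules, so that the figures in a paper can be rebuilt from the raw data with a single command. The short sketch below follows the standard tutorial pattern; it assumes a working Madagascar installation, and the data and parameter choices are purely illustrative.

        # SConstruct -- minimal reproducible Madagascar workflow (illustrative)
        from rsf.proj import *

        # Generate a synthetic trace with a single spike, then band-pass filter it
        Flow('spike', None, 'spike n1=1000 k1=300')
        Flow('filtered', 'spike', 'bandpass fhi=2 phase=y')

        # Register the figure so it can be rebuilt and viewed with "scons view"
        Result('filtered', 'graph title="Filtered spike"')

        End()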

  1. Upgrade plan for HANARO control computer system

    International Nuclear Information System (INIS)

    Kim, Min Jin; Kim, Young Ki; Jung, Hwan Sung; Choi, Young San; Woo, Jong Sub; Jun, Byung Jin

    2001-01-01

    A microprocessor-based digital control system, the Multi-Loop Controller (MLC), which was chosen to control HANARO, was introduced to the market in the early '80s and had been used to control petrochemical plants, paper mills and the Slowpoke reactor in Canada. Due to developments in computer technology, it has become an outdated model, and its production was discontinued a few years ago. Hence, difficulty in acquiring spare parts is expected. To achieve stable reactor control during its lifetime and to avoid possible technical dependency on the manufacturer, a long-term replacement plan for the HANARO control computer system is under way. The plan will proceed in several steps. This paper briefly introduces the methods of implementing the process and discusses the engineering activities of the plan

  2. High performance computing and communications: FY 1997 implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The High Performance Computing and Communications (HPCC) Program was formally authorized by passage, with bipartisan support, of the High-Performance Computing Act of 1991, signed on December 9, 1991. The original Program, in which eight Federal agencies participated, has now grown to twelve agencies. This Plan provides a detailed description of the agencies' FY 1996 HPCC accomplishments and FY 1997 HPCC plans. Section 3 of this Plan provides an overview of the HPCC Program. Section 4 contains more detailed definitions of the Program Component Areas, with an emphasis on the overall directions and milestones planned for each PCA. Appendix A provides a detailed look at HPCC Program activities within each agency.

  3. Path Planning Software and Graphics Interface for an Autonomous Vehicle, Accounting for Terrain Features

    National Research Council Canada - National Science Library

    Hurezeanu, Vlad

    2000-01-01

    .... This vehicle performs tasks to include surveying fields, laying mines, and teleoperation. The capability of the vehicle will be increased if its supporting software plans paths that take into account the terrain features...

  4. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at the Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were made to improve models and to enhance algorithm efficiency and accuracy. With growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of software quality assurance methods for software testing, which fall mainly into the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as especially suitable functional methods for testing reactor analysis codes. A separate study of software quality assurance methods and techniques has been performed in this work with the objective of establishing appropriate pre-test software specification methods. Two methods of software analysis and specification have been selected as the most suitable for this purpose: data flow diagrams have been shown to be particularly valuable for functional/procedural software specification, while entity-relationship diagrams have proved efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
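
    To make the preferred specification-based techniques concrete, the sketch below shows how equivalence class partitioning and boundary value testing might look for a hypothetical input-validation routine of a reactor analysis code; the function, its limits and the test values are invented for illustration and are not taken from the codes discussed in the record.

        import unittest

        def clamp_enrichment(value, low=0.0, high=5.0):
            """Hypothetical helper: restrict a fuel-enrichment input (in %) to its valid range."""
            if value < low:
                return low
            if value > high:
                return high
            return value

        class BoundaryValueTests(unittest.TestCase):
            """One representative per equivalence class plus the values at each boundary."""

            def test_below_valid_range(self):
                self.assertEqual(clamp_enrichment(-0.1), 0.0)   # invalid class, below range

            def test_lower_boundary(self):
                self.assertEqual(clamp_enrichment(0.0), 0.0)    # boundary value

            def test_nominal_value(self):
                self.assertEqual(clamp_enrichment(3.2), 3.2)    # valid class representative

            def test_upper_boundary(self):
                self.assertEqual(clamp_enrichment(5.0), 5.0)    # boundary value

            def test_above_valid_range(self):
                self.assertEqual(clamp_enrichment(5.1), 5.0)    # invalid class, above range

        if __name__ == "__main__":
            unittest.main()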

  5. High performance computing and communications: FY 1995 implementation plan

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-04-01

    The High Performance Computing and Communications (HPCC) Program was formally established following passage of the High Performance Computing Act of 1991 signed on December 9, 1991. Ten federal agencies in collaboration with scientists and managers from US industry, universities, and laboratories have developed the HPCC Program to meet the challenges of advancing computing and associated communications technologies and practices. This plan provides a detailed description of the agencies' HPCC implementation plans for FY 1994 and FY 1995. This Implementation Plan contains three additional sections. Section 3 provides an overview of the HPCC Program definition and organization. Section 4 contains a breakdown of the five major components of the HPCC Program, with an emphasis on the overall directions and milestones planned for each one. Section 5 provides a detailed look at HPCC Program activities within each agency. Although the Department of Education is an official HPCC agency, its current funding and reporting of crosscut activities goes through the Committee on Education and Health Resources, not the HPCC Program. For this reason the Implementation Plan covers nine HPCC agencies.

  6. Process for planning and control of software projects using XedroGESPRO

    OpenAIRE

    Jacqueline Marín-Sánchez; José Alejandro Lugo-García; Pedro Yobanis Piñero-Pérez; Alena María Santiesteban-García; Félix Noel Abelardo-Santana; Javier Menéndez-Rizo

    2014-01-01

    Software project management in Cuba has become a key area for improving production processes and decision-making in organizations. Several models and standards for process improvement related to project management propose best practices for the planning and control of projects. However, these are generic guidelines that describe only the activities to execute, leaving the responsibility for implementation to the organizations, which sometimes rely on expensive proprietary infor...

  7. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    Science.gov (United States)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning, diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and the 3D models of the mandible were reconstructed using both commercial Materialise Mimics and open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were found between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that produced using the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise operational cost.
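
    The surface comparison reported above can be reproduced in outline with a symmetric Hausdorff distance between point sets sampled from the two exported STL models. The sketch below uses SciPy's directed_hausdorff on vertex arrays; the file names are placeholders and the export format is an assumption, not part of the study's actual pipeline.

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def symmetric_hausdorff(points_a, points_b):
            """Symmetric Hausdorff distance (in the units of the input, e.g. mm)
            between two (N, 3) point clouds sampled from the surface meshes."""
            d_ab = directed_hausdorff(points_a, points_b)[0]
            d_ba = directed_hausdorff(points_b, points_a)[0]
            return max(d_ab, d_ba)

        # Placeholder vertex exports standing in for the Mimics and MITK models
        mimics_pts = np.loadtxt("mandible_mimics_vertices.txt")
        mitk_pts = np.loadtxt("mandible_mitk_vertices.txt")
        print(f"Hausdorff distance: {symmetric_hausdorff(mimics_pts, mitk_pts):.2f} mm")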

  8. Application of a B&W-developed computer aided pictorial process planning system to CQMS for manufacturing process control

    International Nuclear Information System (INIS)

    Johanson, D.C.; VandeBogart, J.E.

    1992-01-01

    Babcock & Wilcox (B&W) will utilize its internally developed Computer Aided Pictorial Process Planning or CAPPP (pronounced "cap cubed") system to create a paperless manufacturing environment for the Collider Quadrupole Magnets (CQM). The CAPPP system consists of networked personal computer hardware and software used to: (1) generate and maintain the documents necessary for product fabrication, (2) communicate the information contained in these documents to the production floor, and (3) obtain quality assurance and manufacturing feedback information from the production floor. The purpose of this paper is to describe the various components of the CAPPP system and explain their applicability to product fabrication, specifically quality assurance functions

  9. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Science.gov (United States)

    2012-08-22

    ... regulations with respect to software verification and auditing of digital computer software used in the safety... Standards and Records," which requires, in part, that a quality assurance program be established and implemented to provide adequate assurance that systems and components important to safety will satisfactorily...

  10. UFMulti: A new parallel processing software system for HEP

    Science.gov (United States)

    Avery, Paul; White, Andrew

    1989-12-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.

  11. UFMULTI: A new parallel processing software system for HEP

    International Nuclear Information System (INIS)

    Avery, P.; White, A.

    1989-01-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future. (orig.)

  12. Technology survey of computer software as applicable to the MIUS project

    Science.gov (United States)

    Fulbright, B. E.

    1975-01-01

    Existing computer software, available from either governmental or private sources, applicable to modular integrated utility system program simulation is surveyed. Several programs and subprograms are described to provide a consolidated reference, and a bibliography is included. The report covers the two broad areas of design simulation and system simulation.

  13. Software of image processing system on the JINR basic computers and problems of its further development

    International Nuclear Information System (INIS)

    Ivanov, V.G.

    1978-01-01

    To process picture information on the BESM-6 and CDC-6500 computers, the Joint Institute for Nuclear Research has developed a set of programs which enables the user to restore a spatial picture of measured events and calculate track parameters, as well as kinematically identify the events and select the most probable hypotheses for each event. Wide-scale use of programs which process picture data obtained from various track chambers requires quite a number of different options of each program. For this purpose, a special program, the PATCHY editor, has been developed to update, edit and assemble large programs. Therefore, a partitioned structure of the programs has been chosen which considerably reduces programming time. Basic problems of picture processing software are discussed, and it is pointed out that the availability of terminal equipment for the BESM-6 and CDC-6500 computers will help to increase the processing speed and to implement an interactive mode. It is also planned to develop a training system to help the user learn how to use the programs of the system

  14. What's New in Software? Computers and the Writing Process: Strategies That Work.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)

  15. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications, such as m-learning systems. This study presents an innovative method of using web technology and software engineering best practices to provide m-learning functionalities hosted in an MCC-based learning system as a service. Components hosted by MCC are used to empower developers to create…

  16. Software Quality Assurance activities of ITER CODAC

    Energy Technology Data Exchange (ETDEWEB)

    Pande, Sopan, E-mail: sopan.pande@iter.org [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul Lez Durance (France); DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders [ITER Organization, Route de Vinon sur Verdon, 13115 St Paul Lez Durance (France)

    2013-10-15

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software as an integral part of the plant system I and C is crucial in the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high quality I and C software throughout the lifetime of ITER. CODAC decided to follow IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of above plans, processes and resources. With the help of Verification and Validation Teams they gather evidence of process conformance and product conformance, and record process data for quality audits and perform process improvements.

  17. Software Quality Assurance activities of ITER CODAC

    International Nuclear Information System (INIS)

    Pande, Sopan; DiMaio, Franck; Kim, Changseung; Kim, Joohan; Klotz, Wolf-Dieter; Makijarvi, Petri; Stepanov, Denis; Wallander, Anders

    2013-01-01

    Highlights: ► Comprehensive and consistent software engineering and quality assurance of CODAC. ► Applicable to all CODAC software projects executed by ITER DAs and contractors. ► Configurable plans for cost effective application of SQA processes. ► CODAC software plans SQAP, SVVP, SDP, and SCMP. ► CODAC software processes based on IEEE 12207-2008. -- Abstract: Software as an integral part of the plant system I and C is crucial in the manufacturing and integrated operation of ITER plant systems. Software Quality Assurance is necessary to ensure the development and maintenance of consistently high quality I and C software throughout the lifetime of ITER. CODAC decided to follow IEEE 12207-2008 software lifecycle processes for Software Engineering and Software Quality Assurance. Software Development Plan, Software Configuration Management Plan and Software Verification and Validation Plan are the mainstay of Software Quality Assurance which is documented in the Software Quality Assurance Plan. This paper describes the Software Quality Assurance (SQA) activities performed by CODAC. The SQA includes development and maintenance of above plans, processes and resources. With the help of Verification and Validation Teams they gather evidence of process conformance and product conformance, and record process data for quality audits and perform process improvements

  18. A Reusable Software Architecture for Small Satellite AOCS Systems

    DEFF Research Database (Denmark)

    Alminde, Lars; Bendtsen, Jan Dimon; Laursen, Karl Kaas

    2006-01-01

    This paper concerns the software architecture called Sophy, which is an abbreviation for Simulation, Observation, and Planning in HYbrid systems. We present a framework that allows execution of hybrid dynamical systems in an on-line distributed computing environment, which includes interaction...... with both hardware and on-board software. Some of the key issues addressed by the framework are automatic translation of mathematical specifications of hybrid systems into executable software entities, management of execution of coupled models in a parallel distributed environment, as well as interaction...... with external components, hardware and/or software, through generic interfaces. Sophy is primarily intended as a tool for development of model based reusable software for the control and autonomous functions of satellites and/or satellite clusters....

  19. Computational Infrastructure for Nuclear Astrophysics

    International Nuclear Information System (INIS)

    Smith, Michael S.; Hix, W. Raphael; Bardayan, Daniel W.; Blackmon, Jeffery C.; Lingerfelt, Eric J.; Scott, Jason P.; Nesaraja, Caroline D.; Chae, Kyungyuk; Guidry, Michael W.; Koura, Hiroyuki; Meyer, Richard A.

    2006-01-01

    A Computational Infrastructure for Nuclear Astrophysics has been developed to streamline the inclusion of the latest nuclear physics data in astrophysics simulations. The infrastructure consists of a platform-independent suite of computer codes that is freely available online at nucastrodata.org. Features of, and future plans for, this software suite are given

  20. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is a powerful open-source numerical software package, but leaves much to be desired in terms of user friendliness. This thesis introduces the basic operation of OpenFOAM and culminates in a graphical user interface written in PyQt. The graphical user interface will make the use of OpenFOAM simpler, and hopefully make this powerful tool more available for the gene...

  1. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in the site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software. 2 refs

  2. The Effects of Computer-Aided Design Software on Engineering Students' Spatial Visualisation Skills

    Science.gov (United States)

    Kösa, Temel; Karakus, Fatih

    2018-01-01

    The purpose of this study was to determine the influence of computer-aided design (CAD) software-based instruction on the spatial visualisation skills of freshman engineering students in a computer-aided engineering drawing course. A quasi-experimental design was applied, using the Purdue Spatial Visualization Test-Visualization of Rotations…

  3. Computer-aided software understanding systems to enhance confidence of scientific codes

    International Nuclear Information System (INIS)

    Sheng, G.; Oeren, T.I.

    1991-01-01

    A unique characteristic of nuclear waste disposal is the very long time span over which the combined engineered and natural containment system must remain effective: hundreds of thousands of years. Since there is no precedent in human history for such an endeavour, simulation with the use of computers is the only means we have of forecasting possible future outcomes quantitatively. The need for reliable models and software to make such forecasts so far into the future is obvious. One of the critical elements necessary to ensure reliability is the degree of reviewability of the computer program. Among others, there are two very important reasons for this. Firstly, if there is to be any chance at all of validating the conceptual models as implemented by the computer code, peer reviewers must be able to see and understand what the program is doing. It is all but impossible to achieve this understanding by just looking at the code due to possible unfamiliarity with the language and often due as well to the length and complexity of the code. Secondly, a thorough understanding of the code is also necessary to carry out code maintenance activities which include, among others, error detection, error correction and code modification for purposes of enhancing its performance, functionality or to adapt it to a changed environment. The emerging concepts of computer-aided software understanding and reverse engineering can answer precisely these needs. This paper will discuss the role they can play in enhancing the confidence one has in computer codes, and several examples will be provided. Finally, a brief discussion of combining state-of-the-art forward engineering systems with reverse engineering systems will show how powerfully they can contribute to the overall quality assurance of a computer program. (13 refs., 7 figs.)

  4. Epistemic Opacity, Confirmation Holism and Technical Debt: Computer Simulation in the Light of Empirical Software Engineering

    OpenAIRE

    Newman , Julian

    2015-01-01

    Epistemic opacity vis a vis human agents has been presented as an essential, ineliminable characteristic of computer simulation models resulting from the characteristics of the human cognitive agent. This paper argues, on the contrary, that such epistemic opacity as does occur in computer simulations is not a consequence of human limitations but of a failure on the part of model developers to adopt good software engineering practice for managing human error and ensuring the software artefact ...

  5. Multimodality image registration with software: state-of-the-art

    International Nuclear Information System (INIS)

    Slomka, Piotr J.; Baum, Richard P.

    2009-01-01

    Multimodality image integration of functional and anatomical data can be performed by means of dedicated hybrid imaging systems or by software image co-registration techniques. Hybrid positron emission tomography (PET)/computed tomography (CT) systems have found wide acceptance in oncological imaging, while software registration techniques have a significant role in patient-specific, cost-effective, and radiation dose-effective application of integrated imaging. Software techniques allow accurate (2-3 mm) rigid image registration of brain PET with CT and MRI. Nonlinear techniques are used in whole-body image registration, and recent developments allow for significantly accelerated computing times. Nonlinear software registration of PET with CT or MRI is required for multimodality radiation planning. Difficulties remain in the validation of nonlinear registration of soft tissue organs. The utilization of software-based multimodality image integration in a clinical environment is sometimes hindered by the lack of appropriate picture archiving and communication systems (PACS) infrastructure needed to efficiently and automatically integrate all available images into one common database. In cardiology applications, multimodality PET/single photon emission computed tomography and coronary CT angiography imaging is typically not required unless the results of one of the tests are equivocal. Software image registration is likely to be used in a complementary fashion with hybrid PET/CT or PET/magnetic resonance imaging systems. Software registration of stand-alone scans "paved the way" for the clinical application of hybrid scanners, demonstrating practical benefits of image integration before the hybrid dual-modality devices were available. (orig.)
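
    As a generic illustration of the software-based rigid registration discussed above (brain PET to CT), the sketch below sets up a mutual-information-driven rigid registration with SimpleITK. It is not the specific algorithm of any system evaluated in the review; the file names, metric, and optimizer settings are assumptions chosen for brevity.

        import SimpleITK as sitk

        # Placeholder inputs: anatomical CT (fixed) and functional PET (moving)
        fixed = sitk.ReadImage("brain_ct.nii.gz", sitk.sitkFloat32)
        moving = sitk.ReadImage("brain_pet.nii.gz", sitk.sitkFloat32)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0, minStep=1e-4,
                                                     numberOfIterations=200)
        reg.SetInterpolator(sitk.sitkLinear)

        # Initialize with a rigid (6 degree-of-freedom) transform centred on the images
        initial = sitk.CenteredTransformInitializer(
            fixed, moving, sitk.Euler3DTransform(),
            sitk.CenteredTransformInitializerFilter.GEOMETRY)
        reg.SetInitialTransform(initial, inPlace=False)

        transform = reg.Execute(fixed, moving)

        # Resample the PET onto the CT grid for fused display or further analysis
        registered_pet = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
        sitk.WriteImage(registered_pet, "brain_pet_registered.nii.gz")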

  6. Planning and Analysis of the Company’s Financial Performances by Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Meri BOSHKOSKA

    2017-06-01

    Full Text Available Information technology includes a wide range of software solutions that help managers in decision-making processes in order to increase the company's business performance. Using software in financial analysis is a valuable tool for managers in the financial decision-making process. The objective of the study was accomplished by developing software that easily determines the financial performance of the company through integration of financial indicator analysis and the DuPont profitability analysis model. Through this software, managers are able to calculate the current financial state and visually analyze how their actions will affect the financial performance of the company. This enables them to identify the best ways to improve the financial performance of the company. The software can perform a financial analysis and give a clear, useful overview of current business performance and can also help in planning the growth of the company. The software can also be used for educational purposes for students and managers in the field of financial management.
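
    The DuPont model that the software integrates factors return on equity into profit margin, asset turnover, and financial leverage. A minimal sketch of that calculation is shown below; the field names and figures are illustrative and do not come from the software described in the record.

        from dataclasses import dataclass

        @dataclass
        class Financials:
            net_income: float
            revenue: float
            total_assets: float
            shareholders_equity: float

        def dupont_roe(f: Financials) -> dict:
            """Three-factor DuPont breakdown: ROE = margin x turnover x leverage."""
            net_profit_margin = f.net_income / f.revenue
            asset_turnover = f.revenue / f.total_assets
            equity_multiplier = f.total_assets / f.shareholders_equity
            return {
                "net_profit_margin": net_profit_margin,
                "asset_turnover": asset_turnover,
                "equity_multiplier": equity_multiplier,
                "roe": net_profit_margin * asset_turnover * equity_multiplier,
            }

        # Illustrative figures only
        print(dupont_roe(Financials(net_income=120.0, revenue=1500.0,
                                    total_assets=2000.0, shareholders_equity=800.0)))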

  7. Testing of the assisting software for radiologists analysing head CT images: lessons learned.

    Science.gov (United States)

    Martynov, Petr; Mitropolskii, Nikolai; Kukkola, Katri; Gretsch, Monika; Koivisto, Vesa-Matti; Lindgren, Ilkka; Saunavaara, Jani; Reponen, Jarmo; Mäkynen, Anssi

    2017-12-11

    The aim was to assess a plan for user testing and evaluation of assisting software developed for radiologists. The test plan was assessed in experimental testing, in which users reported on head computed tomography studies with the aid of the software developed. The user testing included usability tests, questionnaires, and interviews. In addition, search relevance was assessed on the basis of user opinions. The testing demonstrated weaknesses in the initial plan and enabled improvements. Results showed that the software has an acceptable usability level, but some minor fixes are needed before larger-scale pilot testing. The research also showed that it is possible even for radiologists with under a year's experience to report non-obvious cases when assisted by the software developed. Due to the small number of test users, it was impossible to assess effects on diagnosis quality. The results of the tests performed showed that the test plan designed is useful, and answers to the key research questions should be forthcoming after testing with more radiologists. The preliminary testing revealed opportunities to improve the test plan and flow, illustrating that arranging preliminary test sessions before any complex scenarios is beneficial.

  8. Current trends in hardware and software for brain-computer interfaces (BCIs).

    Science.gov (United States)

    Brunner, P; Bianchi, L; Guger, C; Cincotti, F; Schalk, G

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  9. Current trends in hardware and software for brain-computer interfaces (BCIs)

    Science.gov (United States)

    Brunner, P.; Bianchi, L.; Guger, C.; Cincotti, F.; Schalk, G.

    2011-04-01

    A brain-computer interface (BCI) provides a non-muscular communication channel to people with and without disabilities. BCI devices consist of hardware and software. BCI hardware records signals from the brain, either invasively or non-invasively, using a series of device components. BCI software then translates these signals into device output commands and provides feedback. One may categorize different types of BCI applications into the following four categories: basic research, clinical/translational research, consumer products, and emerging applications. These four categories use BCI hardware and software, but have different sets of requirements. For example, while basic research needs to explore a wide range of system configurations, and thus requires a wide range of hardware and software capabilities, applications in the other three categories may be designed for relatively narrow purposes and thus may only need a very limited subset of capabilities. This paper summarizes technical aspects for each of these four categories of BCI applications. The results indicate that BCI technology is in transition from isolated demonstrations to systematic research and commercial development. This process requires several multidisciplinary efforts, including the development of better integrated and more robust BCI hardware and software, the definition of standardized interfaces, and the development of certification, dissemination and reimbursement procedures.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  11. Software engineering and Ada (Trademark) training: An implementation model for NASA

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  12. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools on which a teacher can rely in the teaching of mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks on the basis of a limited number of tools, and fast automated checking are discussed. Some methodological comments on the application of computer tools and ...

  13. Computational mathematics and mathematical computer software. Vychislitel'naia matematika i matematicheskoe obespechenie EVM

    Energy Technology Data Exchange (ETDEWEB)

    Tikhonov, A.N.; Samarskii, A.A.

    1985-01-01

    Various aspects of mathematical modeling and problem-oriented computer software are examined with reference to numerical methods in mathematical physics, methods for solving inverse problems, development of automatic systems for experimental data processing, and mathematical modeling in plasma physics. Papers are presented on some properties of difference schemes in one-dimensional gas dynamics, an algorithm for processing signals reflected from multipoint targets, and the application of simplified Navier-Stokes equations for calculating flow of a viscous gas past long bodies.

  14. Software for muscle fibre type classification and analysis

    Czech Academy of Sciences Publication Activity Database

    Karen, Petr; Števanec, M.; Smerdu, V.; Cvetko, E.; Kubínová, Lucie; Eržen, I.

    2009-01-01

    Vol. 53, No. 2 (2009), pp. 87-95. ISSN 1121-760X. R&D Projects: GA MŠk(CZ) LC06063; GA MŠk(CZ) MEB090910. Institutional research plan: CEZ:AV0Z50110509. Keywords: muscle fiber types * myosin heavy chain isoforms * image processing. Subject RIV: JC - Computer Hardware; Software. Impact factor: 0.886, year: 2009

  15. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects

  16. ProjectQ: An Open Source Software Framework for Quantum Computing

    OpenAIRE

    Steiger, Damian S.; Häner, Thomas; Troyer, Matthias

    2016-01-01

    We introduce ProjectQ, an open source software effort for quantum computing. The first release features a compiler framework capable of targeting various types of hardware, a high-performance simulator with emulation capabilities, and compiler plug-ins for circuit drawing and resource estimation. We introduce our Python-embedded domain-specific language, present the features, and provide example implementations for quantum algorithms. The framework allows testing of quantum algorithms through...
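
    The abstract above describes a Python-embedded domain-specific language. As a rough illustration of that style (a minimal sketch following ProjectQ's basic published usage, not code taken from the paper itself), a single-qubit program can look like this:

        # Minimal ProjectQ-style program: allocate a qubit, apply a Hadamard,
        # measure, and read out the classical result.
        from projectq import MainEngine
        from projectq.ops import H, Measure

        eng = MainEngine()              # compiles to the default simulator backend
        qubit = eng.allocate_qubit()    # allocate one quantum register of size 1
        H | qubit                       # gate application uses the DSL's "|" syntax
        Measure | qubit                 # collapse the qubit to 0 or 1
        eng.flush()                     # send all queued operations to the backend
        print("Measured:", int(qubit))  # the measured qubit converts to an int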

  17. Program Helps Design Tests Of Developmental Software

    Science.gov (United States)

    Hops, Jonathan

    1994-01-01

    Computer program called "A Formal Test Representation Language and Tool for Functional Test Designs" (TRL) provides automatic software tool and formal language used to implement the category-partition method and produce specifications of test cases in the testing phase of software development. Category-partition method is useful in defining the inputs, outputs, and purpose of the test-design phase and combines the benefits of choosing normal cases with those of cases having error-exposing properties. Traceability maintained quite easily by creating a test design for each objective in the test plan. Effort to transform test cases into procedures simplified by use of the automatic software tool to create cases based on the test design. Method enables rapid elimination of undesired test cases from consideration and facilitates review of test designs by peer groups. Written in C language.
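
    As a rough, hypothetical illustration of the category-partition idea behind such a tool (not the TRL language itself, which is written in C), test frames can be generated as the cross-product of parameter categories and then pruned by constraints:

        # Hedged sketch of category-partition test design with invented categories
        # for a "file copy" command under test.
        from itertools import product

        categories = {
            "source":      ["existing file", "missing file", "directory"],
            "destination": ["new path", "existing file", "read-only location"],
            "size":        ["empty", "small", "very large"],
        }

        def violates_constraints(frame):
            """Drop frames that make no sense, e.g. a missing source file has no size."""
            return frame["source"] == "missing file" and frame["size"] != "empty"

        frames = [dict(zip(categories, choice)) for choice in product(*categories.values())]
        test_frames = [f for f in frames if not violates_constraints(f)]

        for i, frame in enumerate(test_frames, 1):
            print(f"Test case {i}: {frame}")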

  18. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    Science.gov (United States)

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
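
    As an illustration of the DICOM "scrubbing" step mentioned above, the following sketch assumes the pydicom library (not named in the article) and blanks a few obviously identifying header elements; a production tool would follow a full de-identification profile:

        # Hedged sketch: remove patient-identifying header fields before sharing.
        import pydicom

        def anonymize(in_path: str, out_path: str) -> None:
            ds = pydicom.dcmread(in_path)
            # Blank the most obviously identifying elements only.
            for tag in ("PatientName", "PatientID", "PatientBirthDate", "PatientAddress"):
                if tag in ds:
                    setattr(ds, tag, "")
            ds.remove_private_tags()      # drop vendor-specific private elements
            ds.save_as(out_path)

        # anonymize("ct_slice.dcm", "ct_slice_anon.dcm")  # hypothetical file names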

  19. SU-E-T-465: Implementation of An Automated Collision Detection Program Using Open Source Software for the Pinnacle Treatment Planning System

    Energy Technology Data Exchange (ETDEWEB)

    Tanny, S; Bogue, J; Parsai, E; Sperling, N [University of Toledo Medical Center, Toledo, OH (United States)

    2015-06-15

    Purpose: Potential collisions between the gantry head and the patient or table assembly are difficult to detect in most treatment planning systems. We have developed and implemented a novel software package for the representation of potential gantry collisions with the couch assembly at the time of treatment planning. Methods: Physical dimensions of the Varian Edge linear accelerator treatment head were measured and reproduced using the Visual Python display package. A script was developed for the Pinnacle treatment planning system to generate a file with the relevant couch, gantry, and isocenter positions for each beam in a planning trial. A python program was developed to parse the information from the TPS and produce a representative model of the couch/gantry system. Using the model and the Visual Python libraries, a rendering window is generated for each beam that allows the planner to evaluate the possibility of a collision. Results: Comparison against heuristic methods and direct verification on the machine validated the collision model generated by the software. Encounters of <1 cm between the gantry treatment head and table were visualized as collisions in our virtual model. Visual windows were created depicting the angle of collision for each beam, including the anticipated table coordinates. Visual rendering of a 6 arc trial with multiple couch positions was completed in under 1 minute, with network bandwidth being the primary bottleneck. Conclusion: The developed software allows for quick examination of possible collisions during the treatment planning process and helps to prevent major collisions prior to plan approval. The software can easily be implemented on future planning systems due to the versatility and platform independence of the Python programming language. Further integration of the software with the treatment planning system will allow the possibility of patient-gantry collision detection for a range of treatment machines.

  20. SU-E-T-465: Implementation of An Automated Collision Detection Program Using Open Source Software for the Pinnacle Treatment Planning System

    International Nuclear Information System (INIS)

    Tanny, S; Bogue, J; Parsai, E; Sperling, N

    2015-01-01

    Purpose: Potential collisions between the gantry head and the patient or table assembly are difficult to detect in most treatment planning systems. We have developed and implemented a novel software package for the representation of potential gantry collisions with the couch assembly at the time of treatment planning. Methods: Physical dimensions of the Varian Edge linear accelerator treatment head were measured and reproduced using the Visual Python display package. A script was developed for the Pinnacle treatment planning system to generate a file with the relevant couch, gantry, and isocenter positions for each beam in a planning trial. A python program was developed to parse the information from the TPS and produce a representative model of the couch/gantry system. Using the model and the Visual Python libraries, a rendering window is generated for each beam that allows the planner to evaluate the possibility of a collision. Results: Comparison against heuristic methods and direct verification on the machine validated the collision model generated by the software. Encounters of <1 cm between the gantry treatment head and table were visualized as collisions in our virtual model. Visual windows were created depicting the angle of collision for each beam, including the anticipated table coordinates. Visual rendering of a 6 arc trial with multiple couch positions was completed in under 1 minute, with network bandwidth being the primary bottleneck. Conclusion: The developed software allows for quick examination of possible collisions during the treatment planning process and helps to prevent major collisions prior to plan approval. The software can easily be implemented on future planning systems due to the versatility and platform independence of the Python programming language. Further integration of the software with the treatment planning system will allow the possibility of patient-gantry collision detection for a range of treatment machines
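
    As a simplified, hypothetical stand-in for the kind of geometric check these two records describe (not the authors' Visual Python model), a beam can be flagged when the modelled gantry head face comes within the clearance limit of the couch top:

        # Toy clearance check in the isocenter plane; the angle convention and the
        # dimensions are illustrative only.
        import math

        CLEARANCE_LIMIT_CM = 1.0  # the abstract treats encounters of <1 cm as collisions

        def gantry_head_position(gantry_deg, sad_cm=100.0, head_radius_cm=40.0):
            """Approximate the closest face of the gantry head (lateral, vertical) in cm."""
            r = sad_cm - head_radius_cm          # isocenter-to-head-face distance
            theta = math.radians(gantry_deg)
            return (r * math.sin(theta), -r * math.cos(theta))

        def beam_collides(gantry_deg, couch_lateral_cm, couch_height_cm):
            """Flag a beam whose head face comes within the clearance limit of the couch top."""
            head_x, head_z = gantry_head_position(gantry_deg)
            return math.hypot(head_x - couch_lateral_cm,
                              head_z - couch_height_cm) < CLEARANCE_LIMIT_CM

        # Example: a lateral beam with the couch shifted toward the gantry.
        print(beam_collides(gantry_deg=270, couch_lateral_cm=-59.5, couch_height_cm=0.0))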

  1. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
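
    A hedged sketch of the underlying architecture-based idea (a Cheung-style discrete-time Markov chain with hypothetical component reliabilities and transition probabilities, not the paper's COSMIC-FFP-based model) is:

        # Each component i has reliability R[i]; P[i][j] is the probability that
        # control transfers from component i to component j. Component 0 is the
        # entry point and the last component terminates the run.
        import numpy as np

        R = np.array([0.99, 0.97, 0.995])       # hypothetical component reliabilities
        P = np.array([[0.0, 0.7, 0.3],          # hypothetical control-flow transitions
                      [0.0, 0.0, 1.0],
                      [0.0, 0.0, 0.0]])          # last component exits the tool

        # Q[i][j] = R[i] * P[i][j]; N = (I - Q)^-1 accumulates reliable visits, and the
        # system reliability is the chance of reaching and correctly executing the exit.
        Q = R[:, None] * P
        N = np.linalg.inv(np.eye(len(R)) - Q)
        system_reliability = N[0, -1] * R[-1]
        print(f"Estimated tool reliability: {system_reliability:.4f}")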

  2. Effect of Software Designed by Computer Conceptual Map Method in Mobile Environment on Learning Level of Nursing Students

    Directory of Open Access Journals (Sweden)

    Salmani N

    2015-12-01

    Full Text Available Aims: In order to keep making progress, nursing training has to adopt new training methods, such that the teaching methods used by nursing instructors promote significant learning and prevent superficial learning in students. The conceptual map method is one of the new training strategies playing an important role in this field. The aim of this study was to investigate the effectiveness of software designed with the computer conceptual map method in a mobile phone environment on the learning level of nursing students. Materials & Methods: In this quasi-experimental study with a pretest-posttest design, 60 students in their 5th semester were studied during the 1st semester of 2015-16. The experimental group (n=30) from Meibod Nursing Faculty and the control group (n=30) from Yazd Shahid Sadoughi Nursing Faculty were trained during the first 4 weeks of the semester using the computer conceptual map method and the computer conceptual map method in a mobile phone environment. Data were collected using a researcher-made academic progress test covering "knowledge" and "significant learning", and were analyzed in SPSS 21 software using independent t, paired t, and Fisher tests. Findings: There were significant increases in the mean scores of knowledge and significant learning in both groups after the intervention (p<0.05). Nevertheless, the change in significant-learning scores between the groups was statistically significant (p<0.05). Conclusion: Presenting the course content as a conceptual map in a mobile phone environment positively affects the significant learning of nursing students.
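
    For readers unfamiliar with the analysis named above, the following sketch (assuming scipy.stats and entirely hypothetical scores) shows the shape of the paired and independent t-test comparisons; the Fisher test on categorical outcomes is omitted:

        # Hypothetical pre/post scores for two groups of 30 students each.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        pre_exp = rng.normal(12, 2, 30)
        post_exp = pre_exp + rng.normal(4.0, 1.5, 30)
        pre_ctrl = rng.normal(12, 2, 30)
        post_ctrl = pre_ctrl + rng.normal(2.5, 1.5, 30)

        print("within experimental group:", stats.ttest_rel(post_exp, pre_exp))
        print("within control group:     ", stats.ttest_rel(post_ctrl, pre_ctrl))
        print("between-group change:     ",
              stats.ttest_ind(post_exp - pre_exp, post_ctrl - pre_ctrl))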

  3. An expert system based software sizing tool, phase 2

    Science.gov (United States)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
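
    One common way to compute a "bias factor" and "fluctuation factor" from predicted versus actual sizes is via the geometric mean and geometric standard deviation of the prediction ratios; the sketch below uses that convention with hypothetical numbers, and the paper's exact definitions may differ:

        # Hedged sketch: bias and fluctuation of size predictions in log space.
        import math

        predicted = [1200, 5400, 800, 15000, 2300]   # hypothetical lines of code
        actual    = [1100, 6000, 650, 17000, 2500]

        log_ratios = [math.log(p / a) for p, a in zip(predicted, actual)]
        mean_log = sum(log_ratios) / len(log_ratios)
        var_log = sum((r - mean_log) ** 2 for r in log_ratios) / len(log_ratios)

        bias_factor = math.exp(mean_log)                    # >1 means over-prediction
        fluctuation_factor = math.exp(math.sqrt(var_log))   # spread around the bias
        print(f"bias ≈ {bias_factor:.2f}, fluctuation ≈ {fluctuation_factor:.2f}")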

  4. Development of software for computing forming information using a component based approach

    Directory of Open Access Journals (Sweden)

    Kwang Hee Ko

    2009-12-01

    Full Text Available In the shipbuilding industry, manufacturing technology has advanced at an unprecedented pace over the last decade. As a result, many automatic systems for cutting, welding, etc. have been developed and employed in the manufacturing process, and accordingly productivity has increased drastically. Despite such improvement in manufacturing technology, however, development of an automatic system for fabricating curved hull plates remains at an early stage, since the hardware and software for automating the curved hull fabrication process must be developed differently depending on the dimensions of the plates, the forming methods, and the manufacturing processes of each shipyard. To deal with this problem, it is necessary to create a "plug-in" framework, which can adopt various kinds of hardware and software to construct a fully automatic fabrication system. In this paper, a framework for automatic fabrication of curved hull plates is proposed, which consists of four components and related software. In particular, the software module for computing fabrication information is developed using the ooCBD development methodology, which can interface with other hardware and software with minimum effort. Examples of the proposed framework applied to medium and large shipyards are presented.

  5. Dosimetric verification of a software for planning of radio therapeutical treatments

    International Nuclear Information System (INIS)

    Alfonso, R.; Huerta, U.; Alfonso, J.L.; Torres, M.

    1995-01-01

    A software package for radiation treatment planning was recently developed by medical physicists at the Hermanos Ameijeiras Hospital in Havana. Selected locations in the head and neck region were used to evaluate the reliability of the calculated dose distributions in patients, taking as a reference the results of dosimetric measurements with TLD-700 powder in a RANDO-type phantom. The results for the different calculation options are shown. Causes of discrepancies are analyzed and recommendations are made for the use of the data acquisition options.

  6. Multimodality image registration with software: state-of-the-art

    Energy Technology Data Exchange (ETDEWEB)

    Slomka, Piotr J. [Cedars-Sinai Medical Center, AIM Program/Department of Imaging, Los Angeles, CA (United States); University of California, David Geffen School of Medicine, Los Angeles, CA (United States); Baum, Richard P. [Center for PET, Department of Nuclear Medicine, Bad Berka (Germany)

    2009-03-15

    Multimodality image integration of functional and anatomical data can be performed by means of dedicated hybrid imaging systems or by software image co-registration techniques. Hybrid positron emission tomography (PET)/computed tomography (CT) systems have found wide acceptance in oncological imaging, while software registration techniques have a significant role in patient-specific, cost-effective, and radiation dose-effective application of integrated imaging. Software techniques allow accurate (2-3 mm) rigid image registration of brain PET with CT and MRI. Nonlinear techniques are used in whole-body image registration, and recent developments allow for significantly accelerated computing times. Nonlinear software registration of PET with CT or MRI is required for multimodality radiation planning. Difficulties remain in the validation of nonlinear registration of soft tissue organs. The utilization of software-based multimodality image integration in a clinical environment is sometimes hindered by the lack of appropriate picture archiving and communication systems (PACS) infrastructure needed to efficiently and automatically integrate all available images into one common database. In cardiology applications, multimodality PET/single photon emission computed tomography and coronary CT angiography imaging is typically not required unless the results of one of the tests are equivocal. Software image registration is likely to be used in a complementary fashion with hybrid PET/CT or PET/magnetic resonance imaging systems. Software registration of stand-alone scans ''paved the way'' for the clinical application of hybrid scanners, demonstrating practical benefits of image integration before the hybrid dual-modality devices were available. (orig.)
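
    A minimal sketch of software-based rigid, mutual-information-driven registration of a PET volume to a CT volume, assuming the SimpleITK library and hypothetical file names (neither is mentioned in the article):

        # Hedged sketch: rigid PET-to-CT registration with Mattes mutual information.
        import SimpleITK as sitk

        fixed = sitk.ReadImage("ct_volume.nii", sitk.sitkFloat32)    # hypothetical files
        moving = sitk.ReadImage("pet_volume.nii", sitk.sitkFloat32)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsRegularStepGradientDescent(learningRate=2.0,
                                                     minStep=1e-4,
                                                     numberOfIterations=200)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(
            sitk.CenteredTransformInitializer(
                fixed, moving, sitk.Euler3DTransform(),
                sitk.CenteredTransformInitializerFilter.GEOMETRY))

        transform = reg.Execute(fixed, moving)                 # optimize the rigid transform
        resampled = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
        sitk.WriteImage(resampled, "pet_registered_to_ct.nii")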

  7. An IMRT dose distribution study using commercial verification software

    International Nuclear Information System (INIS)

    Grace, M.; Liu, G.; Fernando, W.; Rykers, K.

    2004-01-01

    Full text: The introduction of IMRT requires users to confirm that the isodose distributions and relative doses calculated by their planning system match the doses delivered by their linear accelerators. To this end, the commercially available software VeriSoft™ (PTW-Freiburg, Germany) was trialled to determine if the tools and functions it offered would be of benefit to this process. The CMS XiO (Computerized Medical Systems) treatment planning system was used to generate IMRT plans that were delivered with an upgraded Elekta SL15 linac. Kodak EDR2 film sandwiched in RW3 solid water (PTW-Freiburg, Germany) was used to measure the IMRT fields delivered with 6 MV photons. The isodoses and profiles measured with the film generally agreed with the planned doses to within ±3% or ±3 mm; in some regions (outside the IMRT field) the match fell to within ±5%. The isodose distributions of the planning system and the film could be compared on screen, and electronic records of the comparison can be kept if so desired. The features and versatility of this software have been of benefit to our IMRT QA program. Furthermore, the VeriSoft™ software allows for quick, accurate, automated planar film analysis. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
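
    The "±3% or ±3 mm" criterion quoted above combines a dose-difference test with a distance-to-agreement (DTA) test. A simplified 1D sketch of such a composite check (not the VeriSoft algorithm, and using a discrete approximation of DTA) is:

        # Hedged sketch: a measured profile point passes if the local dose difference
        # is within 3% of the planned dose, or a planned point within 3 mm agrees.
        import numpy as np

        def passes(measured, planned, positions_mm, dose_tol=0.03, dta_mm=3.0):
            """Return a boolean pass/fail array, one entry per measured point."""
            measured = np.asarray(measured, dtype=float)
            planned = np.asarray(planned, dtype=float)
            positions_mm = np.asarray(positions_mm, dtype=float)
            ok = np.zeros(measured.shape, dtype=bool)
            for i, (m, x) in enumerate(zip(measured, positions_mm)):
                dose_ok = abs(m - planned[i]) <= dose_tol * planned[i]
                nearby = np.abs(positions_mm - x) <= dta_mm
                dta_ok = np.any(np.abs(planned[nearby] - m) <= dose_tol * m)
                ok[i] = dose_ok or dta_ok
            return ok

        # Tiny worked example with hypothetical profile values (arbitrary dose units).
        pos = np.arange(0.0, 50.0, 2.0)                     # mm
        plan = 100.0 * np.exp(-((pos - 25.0) / 15.0) ** 2)  # smooth planned profile
        meas = plan * (1.0 + 0.02 * np.sin(pos / 5.0))      # measured, ~2% ripple
        print(f"pass rate: {passes(meas, plan, pos).mean():.1%}")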

  8. Computer Security Incident Response Planning at Nuclear Facilities

    International Nuclear Information System (INIS)

    2016-06-01

    The purpose of this publication is to assist Member States in developing comprehensive contingency plans for computer security incidents with the potential to impact nuclear security and/or nuclear safety. It provides an outline and recommendations for establishing a computer security incident response capability as part of a computer security programme, and considers the roles and responsibilities of the system owner, operator, competent authority, and national technical authority in responding to a computer security incident with possible nuclear security repercussions

  9. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    Science.gov (United States)

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  10. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006 (CSA06) started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project are also experimented with to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  11. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose, the Computing, Software and Analysis Challenge 2006 (CSA06) started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as prompt reconstruction, data streaming, iterative calibration and alignment executions, and data distribution to regional sites, up to end-user analysis. Grid tools provided by the LCG project are also experimented with to gain access to the data and the resources, by providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work

  12. The future of commodity computing and many-core versus the interests of HEP software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    As the mainstream computing world has shifted from multi-core to many-core platforms, the situation for software developers has changed as well. With the numerous hardware and software options available, choices balancing programmability and performance are becoming a significant challenge. The expanding multiplicative dimensions of performance offer a growing number of possibilities that need to be assessed and addressed on several levels of abstraction. This paper reviews the major tradeoffs forced upon the software domain by the changing landscape of parallel technologies – hardware and software alike. Recent developments, paradigms and techniques are considered with respect to their impact on the rather traditional HEP programming models. Other considerations addressed include aspects of efficiency and reasonably achievable targets for the parallelization of large scale HEP workloads.

  13. Estimating boiling water reactor decommissioning costs. A user's manual for the BWR Cost Estimating Computer Program (CECP) software: Draft report for comment

    International Nuclear Information System (INIS)

    Bierschbach, M.C.

    1994-12-01

    With the issuance of the Decommissioning Rule (July 27, 1988), nuclear power plant licensees are required to submit decommissioning plans and cost estimates to the U.S. Nuclear Regulatory Commission (NRC) for review. This user's manual and the accompanying Cost Estimating Computer Program (CECP) software provide a cost-calculating methodology to the NRC staff that will assist them in assessing the adequacy of the licensee submittals. The CECP, designed to be used on a personal computer, provides estimates for the cost of decommissioning BWR power stations to the point of license termination. Such cost estimates include component, piping, and equipment removal costs; packaging costs; decontamination costs; transportation costs; burial costs; and manpower costs. In addition to costs, the CECP also calculates burial volumes, person-hours, crew-hours, and exposure person-hours associated with decommissioning
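
    A greatly simplified, hypothetical sketch of the kind of per-component bookkeeping the abstract attributes to the CECP (all categories and numbers are invented for illustration):

        # Hedged sketch: roll per-component line items up into totals.
        from dataclasses import dataclass

        @dataclass
        class ComponentEstimate:
            name: str
            removal_cost: float        # component/piping/equipment removal, $
            packaging_cost: float
            decon_cost: float
            transport_cost: float
            burial_cost: float
            labour_cost: float
            burial_volume_m3: float
            crew_hours: float

            @property
            def total_cost(self) -> float:
                return (self.removal_cost + self.packaging_cost + self.decon_cost +
                        self.transport_cost + self.burial_cost + self.labour_cost)

        inventory = [
            ComponentEstimate("reactor vessel internals", 4.2e6, 8.0e5, 1.1e6,
                              3.0e5, 2.5e6, 1.8e6, 120.0, 9500.0),
            ComponentEstimate("main steam piping", 6.0e5, 1.2e5, 9.0e4,
                              7.0e4, 3.1e5, 4.0e5, 45.0, 2100.0),
        ]

        print(f"total cost:    ${sum(c.total_cost for c in inventory):,.0f}")
        print(f"burial volume: {sum(c.burial_volume_m3 for c in inventory):.0f} m^3")
        print(f"crew-hours:    {sum(c.crew_hours for c in inventory):,.0f}")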

  14. Overdose problem associated with treatment planning software for high energy photons in response of Panama's accident.

    Science.gov (United States)

    Attalla, Ehab M; Lotayef, Mohamed M; Khalil, Ehab M; El-Hosiny, Hesham A; Nazmy, Mohamed S

    2007-06-01

    The purpose of this study was to quantify dose distribution errors by comparing actual dose measurements with the values calculated by the software. To evaluate the outcome of the radiation overexposure related to Panama's accident, and to ensure that treatment planning systems (T.P.S.) are being operated in accordance with an appropriate quality assurance programme, we studied central-axis and peripheral depth dose data using complex fields shaped with blocks. Multidata T.P.S. software versions 2.35 and 2.40 and Helax T.P.S. software version 5.1B were assessed. The data calculated by the treatment planning software were verified by comparison with actual dose measurements for open and blocked high-energy photon fields (Co-60, 6 MV and 18 MV photons). Calculated and measured results agreed closely for both the 2-D (Multidata) and 3-D (TMS Helax) treatment planning systems: within 1% to 2% for open fields and 0.5% to 2.5% for peripheral blocked fields. Discrepancies between calculated and measured data ranged from 13% to 36% along the central axis of complex blocked fields when the normalisation point was selected at Dmax; when the normalisation point was selected near or under the blocks, the variation between the calculated and measured data was up to 500%. The present results emphasize the importance of the proper selection of the normalisation point in the radiation field, as this facilitates detection of aberrant dose distributions (overexposure or underexposure).
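
    A small hypothetical calculation illustrates why the choice of normalisation point changes the reported error so dramatically (the numbers below are invented, not the paper's data):

        # Hedged worked example: the same absolute doses give very different
        # percentage errors when normalised at Dmax versus under a block.
        calculated = {"dmax": 2.00, "central_axis": 1.80, "under_block": 0.10}  # Gy
        measured   = {"dmax": 2.02, "central_axis": 2.30, "under_block": 0.50}  # Gy

        def percent_error(point, norm_point):
            """Error at `point`, expressed relative to the calculated dose at `norm_point`."""
            return 100.0 * (measured[point] - calculated[point]) / calculated[norm_point]

        print(f"central axis, normalised at Dmax:    {percent_error('central_axis', 'dmax'):+.0f}%")
        print(f"under block, normalised at Dmax:     {percent_error('under_block', 'dmax'):+.0f}%")
        print(f"under block, normalised under block: {percent_error('under_block', 'under_block'):+.0f}%")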

  15. The Model of the Software Running on a Computer Equipment Hardware Included in the Grid network

    Directory of Open Access Journals (Sweden)

    T. A. Mityushkina

    2012-12-01

    Full Text Available A new approach to building a cloud computing environment using Grid networks is proposed in this paper. The authors describe the functional capabilities, the algorithm, and the model of the software running on computer equipment included in the Grid network that will allow a cloud computing environment to be implemented using Grid technologies.

  16. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    Science.gov (United States)

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which only differed with respect to the candidate predicting variables; (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, only including the self-reported factors and software-recorded computer usage patterns, that are relatively easy to assess (practical models). Prediction models were build using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical model only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were not that different for the two models, indicating that only for some
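
    A hedged sketch of building one such "practical" model and reporting the quoted quality measures, assuming scikit-learn and synthetic placeholder data (the study's actual predictors and outcomes are not reproduced here):

        # Hedged sketch: fit a simple regression, report R^2 and RMS error, and
        # check tertile (low/medium/high) classification agreement.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import r2_score, mean_squared_error
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 117                                    # the study's number of workers
        X = rng.normal(size=(n, 6))                # stand-ins for self-report/usage predictors
        y = X @ rng.normal(size=6) + rng.normal(scale=1.0, size=n)   # synthetic exposure

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = LinearRegression().fit(X_train, y_train)
        pred = model.predict(X_test)

        r2 = r2_score(y_test, pred)
        rms = np.sqrt(mean_squared_error(y_test, pred))
        print(f"R^2 = {r2:.2f}, RMS error = {rms:.2f}")

        bins = np.quantile(y_test, [1 / 3, 2 / 3])
        agreement = np.mean(np.digitize(pred, bins) == np.digitize(y_test, bins))
        print(f"low/medium/high classification agreement = {agreement:.0%}")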

  17. Dosimetric validation for an automatic brain metastases planning software using single-isocenter dynamic conformal arcs.

    Science.gov (United States)

    Liu, Haisong; Li, Jun; Pappas, Evangelos; Andrews, David; Evans, James; Werner-Wasik, Maria; Yu, Yan; Dicker, Adam; Shi, Wenyin

    2016-09-08

    An automatic brain-metastases planning (ABMP) software package has been installed in our institution. It is dedicated to treating multiple brain metastases with radiosurgery on linear accelerators (linacs) using a single setup isocenter with noncoplanar dynamic conformal arcs. This study validates the absolute dose and dose distribution calculated by ABMP. Three types of measurements were performed to validate the planning software: (1) dual micro ion chambers were used with an acrylic phantom to measure the absolute dose; (2) a 3D cylindrical phantom with a dual diode array was used to evaluate the 2D dose distribution and point dose for smaller targets; and (3) a 3D pseudo-in vivo patient-specific phantom filled with polymer gels was used to evaluate the accuracy of the 3D dose distribution and radiation delivery. Micro chamber measurement of two targets (volumes of 1.2 cc and 0.9 cc, respectively) showed that the percentage differences of the absolute dose at both targets were less than 1%. The averaged gamma index (GI) passing rate of five different plans measured with the diode array phantom was above 98%, using criteria of 3% dose difference, 1 mm distance to agreement (DTA), and 10% low-dose threshold. 3D gel phantom measurements demonstrated a 3D displacement of the nine targets of 0.7 ± 0.4 mm (range 0.2 ~ 1.1 mm). The averaged two-dimensional (2D) GI passing rate for several regions of interest (ROIs) on axial slices that encompass each of the nine targets was above 98% (5% dose difference, 2 mm DTA, and 10% low-dose threshold). The measured D95, the minimum dose that covers 95% of the target volume, of the nine targets was 0.7% less than the calculated D95. Three different types of dosimetric verification methods were used and showed that the dose calculation of the new automatic brain metastases planning (ABMP) software is clinically acceptable. The 3D pseudo-in vivo patient-specific gel phantom test also served as an end-to-end test for validating not only the dose calculation, but the
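
    The D95 metric quoted above can be read off a dose grid as the 5th percentile of dose inside the target mask. A small sketch with synthetic numbers (not the ABMP vendor's code):

        # Hedged sketch: D95 is the dose exceeded by 95% of the voxels in the target.
        import numpy as np

        def d95(dose_grid: np.ndarray, target_mask: np.ndarray) -> float:
            """Minimum dose covering 95% of the masked target volume (same units as dose_grid)."""
            return float(np.percentile(dose_grid[target_mask], 5))

        # Tiny synthetic example: a 20 Gy prescription with a slightly cold shell.
        dose = np.full((40, 40, 40), 20.0)
        dose[:2, :, :] = 18.5                         # hypothetical cold region
        mask = np.zeros_like(dose, dtype=bool)
        mask[:10, :10, :10] = True                    # hypothetical target volume
        print(f"D95 = {d95(dose, mask):.2f} Gy")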

  18. Goethe Gossips with Grass: Using Computer Chatting Software in an Introductory Literature Course.

    Science.gov (United States)

    Fraser, Catherine C.

    1999-01-01

    Students in a third-year introduction to German literature course chatted over networked computers, using "FirstClass" software. A brief description of the course design is provided with detailed information on how the three chat sessions were organized. (Author/VWL)

  19. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    ... information that are returned from the tools to the human user, and the forms in which these outputs are presented. STAGE OF DEVELOPMENT: ... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM, APPENDIX 2, INTRODUCTION: This document and the Automated Software Tool Monitoring Program (Appendix 1) are ... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types ...

  20. X-ray image processing software for computing object size and object location coordinates from acquired optical and x-ray images

    International Nuclear Information System (INIS)

    Tiwari, Akash; Tiwari, Shyam Sunder; Tiwari, Railesha; Panday, Lokesh; Panday, Jeet; Suri, Nitin

    2004-01-01

    X-ray and visible image data processing software has been developed in Visual Basic for real-time online and offline image information processing for NDT and medical applications. The software computes two-dimensional image size parameters from sharp boundary lines by raster scanning the image contrast data. The code accepts bitmap image data, hunts for multiple tumors of different sizes that may be present in the image, computes the size of each tumor, and locates its approximate center to register its location coordinates. Foreign objects such as metal and glass balls in industrial products such as chocolate and other food items, imaged using the x-ray technique, are detected by the software, and their size and position coordinates are computed. The paper discusses ways and means to compute the size and coordinates of air-bubble-like objects present in x-ray and optical images, including their multiple occurrences in the image of interest. (author)
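
    A hedged sketch of the core operation described (threshold, label connected objects, report size and centre), assuming scipy.ndimage rather than the authors' Visual Basic implementation:

        # Hedged sketch: connected-component labelling to find blob sizes and centres.
        import numpy as np
        from scipy import ndimage

        def find_blobs(image: np.ndarray, threshold: float):
            """Return (area_in_pixels, (row, col) centroid) for each blob above threshold."""
            mask = image > threshold
            labels, n = ndimage.label(mask)                       # label connected blobs
            index = np.arange(1, n + 1)
            areas = ndimage.sum(mask, labels, index=index)        # pixel count per blob
            centres = ndimage.center_of_mass(mask, labels, index) # centroid per blob
            return list(zip(areas, centres))

        # Synthetic 2D "x-ray": two bright blobs on a dark background.
        img = np.zeros((100, 100))
        img[20:30, 20:35] = 1.0
        img[60:75, 50:58] = 1.0
        for area, (r, c) in find_blobs(img, threshold=0.5):
            print(f"object: area = {area:.0f} px, centre ≈ ({r:.1f}, {c:.1f})")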