WorldWideScience

Sample records for general purpose software

  1. General-purpose software for science technology calculation

    International Nuclear Information System (INIS)

    Aikawa, Hiroshi

    1999-01-01

    We have developed many general-purpose software packages for parallel processing in science and technology calculations. This paper reports on six of them: the STA (Seamless Thinking Aid) basic software, a parallel numerical computation library, grid generation software for parallel computers, a real-time visualization system, a parallel benchmark test system and an object-oriented parallel programming method. STA is user interface software that provides a total environment for parallel programming, a network computing environment for various parallel computers and a desktop computing environment via the Web. Some examples using the above software are explained. One is a simultaneous parallel calculation of both the flow around and the structure of a supersonic transport for design purposes. The others are various kinds of parallel computer calculations for nuclear fusion research, such as a molecular dynamics calculation and a calculation of reactor structure and fluid. This software is open to the public via the home page {http://guide.tokai.jaeri.go.jp/ccse/}. (S.Y.)

  2. Software design of a general purpose data acquisition and control executive

    International Nuclear Information System (INIS)

    Labiak, W.G.; Minor, E.G.

    1981-01-01

    The software design of an executive which performs general-purpose data acquisition, monitoring, and control is presented. The executive runs on a memory-based mini- or micro-computer and communicates with a disk-based computer where data analysis and display are done. The executive design stresses reliability and versatility, and has yielded software which can provide control and monitoring for widely different hardware systems. Applications of this software on two major fusion energy experiments at Lawrence Livermore National Laboratory will be described.

  3. Survey of advanced general-purpose software for robot manipulators

    International Nuclear Information System (INIS)

    Latombe, J.C.

    1983-01-01

    Computer-controlled, sensor-based robots will become more and more common in industry. This paper attempts to survey the main trends in the development of advanced general-purpose software for robot manipulators. It is intended to make clear that robots are not only mechanical devices: they are truly programmable machines, and their programming, which occurs in an imperfectly modelled world, is somewhat different from conventional computer programming. (orig.)

  4. CLOUDCLOUD : general-purpose instrument monitoring and data managing software

    Science.gov (United States)

    Dias, António; Amorim, António; Tomé, António

    2016-04-01

    An effective experiment depends on the ability to store and deliver data and information to all participating parties, regardless of their degree of involvement in the specific parts that make the experiment a whole. Having fast, efficient and ubiquitous access to data will increase visibility and discussion, such that the outcome will have already been reviewed several times, strengthening the conclusions. The CLOUD project aims at providing users with a general-purpose data acquisition, management and instrument monitoring platform that is fast, easy to use, lightweight and accessible to all participants of an experiment. This work is now implemented in the CLOUD experiment at CERN and will be fully integrated with the experiment as of 2016. Despite being used in an experiment of the scale of CLOUD, this software can also be used in experiments or monitoring stations of any size, from single computers to large networks of computers, to monitor any sort of instrument output without influencing the individual instrument's DAQ. Instrument data and metadata are stored and accessed via a specially designed database architecture, and any type of instrument output is accepted using our continuously growing parsing application. Multiple databases can be used to separate different data-taking periods, or a single database can be used if, for instance, an experiment is continuous. A simple web-based application gives the user total control over the monitored instruments and their data, allowing data visualization and download, upload of processed data and the ability to edit existing instruments or add new instruments to the experiment. When in a network, new computers are immediately recognized and added to the system and are able to monitor instruments connected to them. Automatic computer integration is achieved by a locally running python-based parsing agent that communicates with a main server application guaranteeing that all instruments assigned to that computer are
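    The abstract above mentions a locally running Python-based parsing agent that forwards instrument readings to a central server. The following is a minimal sketch of that idea only; the log format, server URL, endpoint and file path are illustrative assumptions, not the CLOUD project's actual interfaces.

```python
# Minimal sketch of a locally running parsing agent, in the spirit of the
# CLOUD monitoring design described above. The file format, server URL and
# endpoint are illustrative assumptions, not the project's actual API.
import json
import time
import urllib.request

SERVER_URL = "http://monitoring.example.org/api/readings"  # hypothetical endpoint

def parse_line(line):
    """Parse one 'timestamp,channel,value' record from an instrument log."""
    timestamp, channel, value = line.strip().split(",")
    return {"timestamp": timestamp, "channel": channel, "value": float(value)}

def push_reading(reading):
    """POST one reading to the central server as JSON."""
    data = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(SERVER_URL, data=data,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=5)

def follow(path):
    """Tail an instrument output file without interfering with its own DAQ."""
    with open(path) as f:
        f.seek(0, 2)                      # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(1.0)
                continue
            push_reading(parse_line(line))

if __name__ == "__main__":
    follow("/data/instrument_01.log")     # hypothetical log path
```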

  5. Development of general-purpose software to analyze the static thermal characteristic of nuclear power plant

    International Nuclear Information System (INIS)

    Nakao, Yoshinobu; Koda, Eiichi; Takahashi, Toru

    2009-01-01

    We have developed general-purpose software that makes it easy to analyze the static thermal characteristics of power generation systems. The software has the following notable features. It has a new algorithm for solving the non-linear simultaneous equations that describe static thermal characteristics such as heat and mass balance, efficiencies, etc. of various power generation systems. It offers flexibility in setting calculation conditions. It can be executed on a personal computer easily and quickly. We confirmed that it can construct heat and mass balance diagrams of the main steam system of a nuclear power plant and calculate the power output and efficiencies of the system. Furthermore, we evaluated various heat recovery measures for steam generator blowdown water and found that this software could be a useful operational aid for planning effective changes in support of power stretch. (author)
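    As an illustration of the kind of non-linear simultaneous equations such software solves, the sketch below balances a toy steam-side energy equation with scipy.optimize.fsolve. The component model and numbers are invented for illustration and are not taken from the paper.

```python
# Toy illustration of solving a small non-linear heat-and-mass-balance system,
# in the spirit of the static thermal characteristic software described above.
from scipy.optimize import fsolve

m_steam = 1000.0               # steam mass flow [kg/s], illustrative
h_in, h_out = 2770.0, 2300.0   # enthalpies [kJ/kg], illustrative

def balance(x):
    power, efficiency = x
    eq1 = power - m_steam * (h_in - h_out) / 1000.0       # energy balance [MW]
    eq2 = efficiency - power / (m_steam * h_in / 1000.0)  # crude efficiency definition
    return [eq1, eq2]

power, efficiency = fsolve(balance, x0=[400.0, 0.3])
print(f"power = {power:.1f} MW, efficiency = {efficiency:.3f}")
```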

  6. General-Purpose Software For Computer Graphics

    Science.gov (United States)

    Rogers, Joseph E.

    1992-01-01

    NASA Device Independent Graphics Library (NASADIG) is a general-purpose computer-graphics package for computer-based engineering and management applications which provides the means to translate data into effective graphical displays for presentation. Features include two- and three-dimensional plotting, spline and polynomial interpolation, control of blanking of areas, multiple log and/or linear axes, control of legends and text, control of thicknesses of curves, and multiple text fonts. Included are subroutines for definition of areas and axes of plots; setup and display of text; blanking of areas; setup of style, interpolation, and plotting of lines; control of patterns and of shading of colors; control of legends, blocks of text, and characters; initialization of devices; and setting of mixed alphabets. Written in FORTRAN 77.

  7. DIDACTIC PRINCIPLES AND PSYCHOLOGICAL CHARACTERISTICS IN DEFINITION OF QUALITY OF SOFTWARE TOOLS FOR EDUCATIONAL PURPOSE IN THE GENERAL EDUCATIONAL ENVIRONMENT OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Maryna V. Pirko

    2011-02-01

    The fundamental feature of the economy of a postindustrial society is knowledge, which represents the basic source of competitive advantage. The article considers and describes the range of didactic and psychological indicators used in research on achieving a high degree of quality of education and educational services. Attention is paid to the pedagogical requirements of the current period, which serve as the normative basis for quality estimation of software tools for educational purposes in the general educational environment of Ukraine. The scheme of an internal model for maintaining the quality of software tools for educational purposes is considered, and the aspects integrated by this internal model are listed. The article describes the directions of research under the conditions of a forming global international educational environment and a uniform information space for the education system, taking into account the growing availability of educational services. The main principles for the organization of pedagogical software tools are specified.

  8. Open-source implementation of an ad-hoc IEEE802.11a/g/p software-defined radio on low-power and low-cost general purpose processors

    Directory of Open Access Journals (Sweden)

    S. Ciccia

    2017-12-01

    This work proposes a low-cost and low-power software-defined radio open-source platform with IEEE 802.11 a/g/p wireless communication capability. A state-of-the-art version of the IEEE 802.11 a/g/p software for GNU Radio (a free and open-source software development framework) is available online, but we show here that its computational complexity prevents operation on low-power general-purpose processors, even at throughputs below the standard. We therefore propose an evolution of this software that achieves a faster and lighter IEEE 802.11 a/g/p transmitter and receiver, suitable for low-power general-purpose processors, for which GNU Radio provides very limited support; we discuss and describe the software radio processing structuring that is necessary to achieve the goal, providing a review of signal processing techniques. In particular, we emphasize the advanced reduced-instruction set (RISC) machine (ARM) study case, for which we also optimize some of the processing libraries. The presented software will remain open-source.

  9. Generalized Fluid System Simulation Program (GFSSP) Version 6 - General Purpose Thermo-Fluid Network Analysis Software

    Science.gov (United States)

    Majumdar, Alok; Leclair, Andre; Moore, Ric; Schallhorn, Paul

    2011-01-01

    GFSSP stands for Generalized Fluid System Simulation Program. It is a general-purpose computer program to compute pressure, temperature and flow distribution in a flow network. GFSSP calculates pressure, temperature, and concentrations at nodes and calculates flow rates through branches. It was primarily developed for internal flow analysis of a turbopump and transient flow analysis of a propulsion system. GFSSP development started in 1994 with the objective of providing a generalized and easy-to-use flow analysis tool for thermo-fluid systems.
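    A minimal sketch of the node-and-branch idea behind a flow-network solver like GFSSP is shown below: boundary nodes hold fixed pressures, and the internal node pressure is solved so that mass is conserved. The linear branch conductances are an illustrative simplification, not GFSSP's actual friction models.

```python
# Minimal flow-network sketch: two boundary nodes, one internal node, two
# branches. Conductances and pressures are invented for illustration.
from scipy.optimize import fsolve

P_UP, P_DOWN = 500.0, 100.0     # boundary pressures [kPa], illustrative
C1, C2 = 0.8, 0.5               # branch conductances [kg/(s*kPa)], illustrative

def mass_balance(p_node):
    inflow = C1 * (P_UP - p_node[0])
    outflow = C2 * (p_node[0] - P_DOWN)
    return [inflow - outflow]

(p_internal,) = fsolve(mass_balance, x0=[300.0])
print(f"internal node pressure = {p_internal:.1f} kPa")
print(f"flow rate = {C1 * (P_UP - p_internal):.1f} kg/s")
```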

  10. Design of a general purpose (RS-232C) analog-to-digital data converter

    International Nuclear Information System (INIS)

    Ali, Q.

    1995-01-01

    The purpose of this project is to design general-purpose hardware that interfaces analog devices with any computer supporting the RS-232 interface. The hardware incorporates bidirectional data transmission at 1,200 bps, 2,400 bps, 4,800 bps, 9,600 bps, 19,200 bps and 38,400 bps. The communication/processing software has been written in the C language, reflecting the idea of portability of the software from one environment to another. (author)
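    A host-side counterpart to such an RS-232 converter could look like the hedged sketch below, which uses the pyserial package to read ASCII-framed samples. The port name, baud rate and framing are assumptions for illustration; the original host software was written in C.

```python
# Sketch of host-side communication with an RS-232 analog-to-digital converter
# like the one described above, using the pyserial package.
import serial  # pip install pyserial

def read_samples(port="/dev/ttyS0", baud=9600, n=10):
    """Read n newline-terminated ADC readings from the serial port."""
    with serial.Serial(port, baudrate=baud, timeout=1.0) as link:
        samples = []
        for _ in range(n):
            line = link.readline().decode("ascii", errors="replace").strip()
            if line:
                samples.append(int(line))   # assumes the device sends integer counts
        return samples

if __name__ == "__main__":
    print(read_samples())
```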

  11. General Purpose Crate (GPC) for control applications

    International Nuclear Information System (INIS)

    Singh, Kundan; Munda, Deepak K.; Jain, Mamta; Archunan, M.; Barua, P.; Ajith Kumar, B.P.

    2011-01-01

    A General Purpose Crate (GPC) capable of handling digital and analog input/output signals has been developed at the Inter University Accelerator Centre (IUAC), New Delhi, for accelerator control system applications. The system includes a back-plane bus with an on-board plugged-in single board computer with PC104 and Ethernet interfaces, running the Linux operating system. The bus control logic is designed on the back-plane PCB itself, making the system more rugged. The various types of digital and analog input/output modules can be plugged into the back-plane bus in any order with standard Euro connectors, which provide highly reliable and dust-free contacts. A maximum of eight modules can be inserted into the crate. The total power consumption for the various modules and the back-plane controller is approximately 50 watts. A multi-output DC power supply from COSEL is used in the crate. The general purpose crate is software compatible with the CAMAC crates used in the accelerator control system. (author)

  12. Design method of general-purpose driving circuit for CCD based on CPLD

    International Nuclear Information System (INIS)

    Zhang Yong; Tang Benqi; Xiao Zhigang; Wang Zujun; Huang Shaoyan

    2005-01-01

    Developing a general-purpose test platform is very important for systematically studying radiation damage effects and mechanisms in CCDs. The paper discusses the design method of a general-purpose driving circuit for CCDs based on CPLD and its realization. A main controller has been designed, based on the MAX7000S and developed with the MAX-PLUS II software, to read the data file from external memory, set up the relevant parameter registers and produce driving pulses that strictly follow the parameter settings. The basic driving circuit module has been completed based on this method. The output waveform of the module matches the simulation waveform. The result indicates that the design method is feasible. (authors)

  13. The ATLAS Trigger Algorithms for General Purpose Graphics Processor Units

    CERN Document Server

    Tavares Delgado, Ademar; The ATLAS collaboration

    2016-01-01

    We present the ATLAS Trigger algorithms developed to exploit General-Purpose Graphics Processor Units. ATLAS is a particle physics experiment located on the LHC collider at CERN. The ATLAS Trigger system has two levels, a hardware-based Level 1 and the High Level Trigger implemented in software running on a farm of commodity CPUs. Performing the trigger event selection within the available farm resources presents a significant challenge that will increase with future LHC upgrades. General-purpose GPUs are being evaluated as a potential solution for trigger algorithm acceleration. Key factors determining the potential benefit of this new technology are the relative execution speedup, the number of GPUs required and the relative financial cost of the selected GPU. We have developed a trigger demonstrator which includes algorithms for reconstructing tracks in the Inner Detector and Muon Spectrometer and clusters of energy deposited in the Cal...

  14. A Real-Time Programmer's Tour of General-Purpose L4 Microkernels

    OpenAIRE

    Ruocco Sergio

    2008-01-01

    L4-embedded is a microkernel successfully deployed in mobile devices with soft real-time requirements. It now faces the challenges of tightly integrated systems, in which user interface, multimedia, OS, wireless protocols, and even software-defined radios must run on a single CPU. In this paper we discuss the pros and cons of L4-embedded for real-time systems design, focusing on the issues caused by the extreme speed optimisations it inherited from its general-purpose ancestors. Sinc...

  15. A Real-Time Programmer's Tour of General-Purpose L4 Microkernels

    OpenAIRE

    Sergio Ruocco

    2008-01-01

    L4-embedded is a microkernel successfully deployed in mobile devices with soft real-time requirements. It now faces the challenges of tightly integrated systems, in which user interface, multimedia, OS, wireless protocols, and even software-defined radios must run on a single CPU. In this paper we discuss the pros and cons of L4-embedded for real-time systems design, focusing on the issues caused by the extreme speed optimisations it inherited from its general-purpose ancestors. Since these i...

  16. 10 CFR 205.350 - General purpose.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false General purpose. 205.350 Section 205.350 Energy DEPARTMENT OF ENERGY OIL ADMINISTRATIVE PROCEDURES AND SANCTIONS Electric Power System Permits and Reports....350 General purpose. The purpose of this rule is to establish a procedure for the Office of...

  17. 7 CFR 254.1 - General purpose.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose. 254.1 Section 254.1 Agriculture... GENERAL REGULATIONS AND POLICIES-FOOD DISTRIBUTION ADMINISTRATION OF THE FOOD DISTRIBUTION PROGRAM FOR INDIAN HOUSEHOLDS IN OKLAHOMA § 254.1 General purpose. This part sets the requirement under which...

  18. 12 CFR 1703.31 - General purposes.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false General purposes. 1703.31 Section 1703.31 Banks and Banking OFFICE OF FEDERAL HOUSING ENTERPRISE OVERSIGHT, DEPARTMENT OF HOUSING AND URBAN... Legal Proceedings in Which OFHEO Is Not a Named Party § 1703.31 General purposes. The purposes of this...

  19. 22 CFR 309.1 - General purpose.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true General purpose. 309.1 Section 309.1 Foreign Relations PEACE CORPS DEBT COLLECTION General Provisions § 309.1 General purpose. This part prescribes the procedures to be used by the United States Peace Corps (Peace Corps) in the collection and/or disposal of non...

  20. Developing wearable bio-feedback systems: a general-purpose platform.

    Science.gov (United States)

    Bianchi, Luigi; Babiloni, Fabio; Cincotti, Febo; Arrivas, Marco; Bollero, Patrizio; Marciani, Maria Grazia

    2003-06-01

    Microprocessors, even those in PocketPCs, have adequate power for many real-time biofeedback applications for disabled people. This power allows design of portable or wearable devices that are smaller and lighter, and that have longer battery life compared to notebook-based systems. In this paper, we discuss a general-purpose hardware/software solution based on industrial or consumer devices and a C++ framework. Its flexibility and modularity make it adaptable to a wide range of situations. Moreover, its design minimizes system requirements and programming effort, thus allowing efficient systems to be built quickly and easily. Our design has been used to build two brain computer interface systems that were easily ported from the Win32 platform.

  1. A General Water Resources Regulation Software System in China

    Science.gov (United States)

    LEI, X.

    2017-12-01

    To avoid iterative development of core modules in normal and emergency water resources regulation, and to improve the maintainability and upgradability of regulation models and business logic, a general water resources regulation software framework was developed based on the collection and analysis of common demands for water resources regulation and emergency management. It provides a customizable and extensible software framework, open to secondary development, for the three-level platform "MWR-Basin-Province". Meanwhile, this general software system can realize business collaboration and information sharing of water resources regulation schemes among the three-level platforms, so as to improve the decision-making ability of national water resources regulation. Four main modules are involved in the general software system: 1) a complete set of general water resources regulation modules that allows secondary developers to custom-develop water resources regulation decision-making systems; 2) a complete set of model bases and model computing software released in the form of cloud services; 3) a complete set of tools to build the concept map and model system of basin water resources regulation, as well as a model management system to calibrate and configure model parameters; 4) a database that satisfies the business functions and functional requirements of general water resources regulation software, which can finally provide technical support for building basin or regional water resources regulation models.

  2. Selecting a general-purpose data compression algorithm

    Science.gov (United States)

    Mathews, Gary Jason

    1995-01-01

    The National Space Science Data Center's Common Data Format (CDF) is capable of storing many types of data such as scalar data items, vectors, and multidimensional arrays of bytes, integers, or floating point values. However, regardless of the dimensionality and data type, the data break down into a sequence of bytes that can be fed into a data compression function to reduce the amount of data without losing data integrity, thus remaining fully reconstructible. Because of the diversity of data types and high performance speed requirements, a general-purpose, fast, simple data compression algorithm is required to incorporate data compression into CDF. The questions to ask are how to evaluate and compare compression algorithms, and which compression algorithm meets all requirements. The objective of this paper is to address these questions and determine the most appropriate compression algorithm to use within the CDF data management package that would be applicable to other software packages with similar data compression needs.
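    The abstract's central point, that any CDF variable reduces to a byte sequence which a general-purpose compressor can handle losslessly, can be illustrated with a few lines of Python; zlib stands in here for whichever algorithm the survey ultimately recommends.

```python
# Minimal sketch of the byte-oriented compression hook described above: a
# variable of any type or dimensionality is reduced to bytes and passed
# through a general-purpose lossless compressor.
import struct
import zlib

# A small array of floating-point values, packed into a raw byte sequence.
values = [0.0, 1.5, 3.0, 4.5] * 256
raw = struct.pack(f"{len(values)}d", *values)

compressed = zlib.compress(raw, 6)
restored = struct.unpack(f"{len(values)}d", zlib.decompress(compressed))

assert list(restored) == values            # fully reconstructible
print(f"{len(raw)} bytes -> {len(compressed)} bytes")
```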

  3. Multi-Threaded Algorithms for General purpose Graphics Processor Units in the ATLAS High Level Trigger

    CERN Document Server

    Conde Muiño, Patricia; The ATLAS collaboration

    2016-01-01

    General purpose Graphics Processor Units (GPGPU) are being evaluated for possible future inclusion in an upgraded ATLAS High Level Trigger farm. We have developed a demonstrator including GPGPU implementations of Inner Detector and Muon tracking and Calorimeter clustering within the ATLAS software framework. ATLAS is a general purpose particle physics experiment located on the LHC collider at CERN. The ATLAS Trigger system consists of two levels, with level 1 implemented in hardware and the High Level Trigger implemented in software running on a farm of commodity CPUs. The High Level Trigger reduces the trigger rate from the 100 kHz level 1 acceptance rate to 1 kHz for recording, requiring an average per-event processing time of ~250 ms for this task. The selection in the high level trigger is based on reconstructing tracks in the Inner Detector and Muon Spectrometer and clusters of energy deposited in the Calorimeter. Performing this reconstruction within the available farm resources presents a significant ...

  4. General Purpose (office) Network reorganisation

    CERN Multimedia

    IT Department

    2016-01-01

    On Saturday 27 August, the IT Department’s Communication Systems group will perform a major reorganisation of CERN’s General Purpose Network.   This reorganisation will cause network interruptions on Saturday 27 August (and possibly Sunday 28 August) and will be followed by a change to the IP addresses of connected systems that will come into effect on Monday 3 October. For further details and information about the actions you may need to take, please see: https://information-technology.web.cern.ch/news/general-purpose-office-network-reorganisation.

  5. 7 CFR 2902.48 - General purpose household cleaners.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false General purpose household cleaners. 2902.48 Section... PROCUREMENT Designated Items § 2902.48 General purpose household cleaners. (a) Definition. Products designed... procurement preference for qualifying biobased general purpose household cleaners. By that date, Federal...

  6. 47 CFR 32.6124 - General purpose computers expense.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false General purpose computers expense. 32.6124... General purpose computers expense. This account shall include the costs of personnel whose principal job is the physical operation of general purpose computers and the maintenance of operating systems. This...

  7. General software design for multisensor data fusion

    Science.gov (United States)

    Zhang, Junliang; Zhao, Yuming

    1999-03-01

    In this paper a general method of software design for multisensor data fusion is discussed in detail, which adopts object-oriented technology under the UNIX operating system. The software for multisensor data fusion is divided into six functional modules: data collection, database management, GIS, target display and alarming, data simulation, etc. Furthermore, the primary function, the components and some realization methods of each module are given. The interfaces and relations among these functional modules are discussed. Data exchange among the functional modules is performed by interprocess communication (IPC), including message queues, semaphores and shared memory. Thus, each functional module executes independently, which reduces the dependence among functional modules and helps software programming and testing. The software for multisensor data fusion is designed as a hierarchical structure using the inheritance character of classes. Each functional module is abstracted and encapsulated through a class structure, which avoids software redundancy and enhances readability.
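    The message-queue style of inter-process data exchange described above can be sketched as follows, with Python's multiprocessing primitives standing in for the UNIX IPC facilities (message queues, semaphores, shared memory) used by the original design.

```python
# Sketch of the producer/consumer exchange between a data-collection module
# and a fusion module, using a message queue between separate processes.
import multiprocessing as mp
import random
import time

def sensor_process(queue, sensor_id):
    """Simulated data-collection module: pushes readings onto the queue."""
    for _ in range(5):
        queue.put({"sensor": sensor_id, "value": random.random()})
        time.sleep(0.1)
    queue.put(None)                      # sentinel: this sensor is done

def fusion_process(queue, n_sensors):
    """Simulated fusion module: consumes readings until all sensors finish."""
    finished = 0
    while finished < n_sensors:
        item = queue.get()
        if item is None:
            finished += 1
        else:
            print(f"fusing reading from sensor {item['sensor']}: {item['value']:.3f}")

if __name__ == "__main__":
    q = mp.Queue()
    sensors = [mp.Process(target=sensor_process, args=(q, i)) for i in range(2)]
    fusion = mp.Process(target=fusion_process, args=(q, len(sensors)))
    for p in sensors + [fusion]:
        p.start()
    for p in sensors + [fusion]:
        p.join()
```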

  8. 7 CFR 225.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 225.1 Section 225.1... AGRICULTURE CHILD NUTRITION PROGRAMS SUMMER FOOD SERVICE PROGRAM General § 225.1 General purpose and scope... primary purpose of the Program is to provide food service to children from needy areas during periods when...

  9. 7 CFR 2902.37 - General purpose de-icers.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false General purpose de-icers. 2902.37 Section 2902.37... Items § 2902.37 General purpose de-icers. (a) Definition. Chemical products (e.g., salt, fluids) that... preference for qualifying biobased general purpose de-icers. By that date, Federal agencies that have the...

  10. 7 CFR 285.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 285.1 Section 285.1... COMMONWEALTH OF PUERTO RICO § 285.1 General purpose and scope. This part describes the general terms and... government of the Commonwealth of Puerto Rico for the purpose of designing and conducting a nutrition...

  11. 7 CFR 246.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 246.1 Section 246.1... General § 246.1 General purpose and scope. This part announces regulations under which the Secretary of... health by reason of inadequate nutrition or health care, or both. The purpose of the Program is to...

  12. Development of a General Purpose Gamification Framework

    OpenAIRE

    Vea, Eivind

    2016-01-01

    This report describes the design and implementation of a general purpose gamification framework developed in JavaScript on the Meteor platform. Gamification is described as the use of game elements in non-game contexts. The purpose is to encourage and change user behaviour. Examples of existing gamification use cases and frameworks are described. A demo game shows how a general purpose framework can be used.

  13. General-purpose radiographic and fluoroscopic table

    International Nuclear Information System (INIS)

    Ishizaki, Noritaka

    1982-01-01

    A new series of diagnostic tables, Model DT-KEL, was developed for general-purpose radiographic and fluoroscopic systems. Following several investigations, the table was constructed so that the basic techniques are general radiography and GI examination, with other techniques optionally added. The diagnostic tables cover the full series of types for various purposes and are systematized with the surrounding equipment. A retractable grid mechanism was adopted for the first time for general use. The fine grid with a density of 57 lines per cm, adopted in the KEL-2, reduced X-ray doses by 16 percent. (author)

  14. 47 CFR 32.2124 - General purpose computers.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false General purpose computers. 32.2124 Section 32... General purpose computers. (a) This account shall include the original cost of computers and peripheral... financial, statistical, or other business analytical reports; preparation of payroll, customer bills, and...

  15. Applications for a general purpose optical beam propagation code

    International Nuclear Information System (INIS)

    Munroe, J.L.; Wallace, N.W.

    1987-01-01

    Real-world beam propagation and diffraction problems can rarely be solved by the analytical expressions commonly found in optics and lasers textbooks. These equations are typically valid only for paraxial geometries, for specific boundary conditions (e.g., infinite apertures), or for special assumptions (e.g., at focus). Numerical techniques must be used to solve the equations for the general case. LOTS, a public domain numerical beam propagation software package developed for this purpose, is a widely used and proven tool. The graphical presentation of results combined with a well designed command language make LOTS particularly user-friendly, and the recent implementation of LOTS on the IBM PC/XT family of desktop computers will make this capability available to a much larger group of users. This paper surveys several applications demonstrating the need for such a capability

  16. 21 CFR 864.4010 - General purpose reagent.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false General purpose reagent. 864.4010 Section 864.4010 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Specimen Preparation Reagents § 864.4010 General purpose...

  17. 7 CFR 250.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 250.1 Section 250.1... AGRICULTURE GENERAL REGULATIONS AND POLICIES-FOOD DISTRIBUTION DONATION OF FOODS FOR USE IN THE UNITED STATES, ITS TERRITORIES AND POSSESSIONS AND AREAS UNDER ITS JURISDICTION General § 250.1 General purpose and...

  18. General Method of Using Bayesian Nets for a Software Reliability Assessment in Varying SW Development Life cycle

    International Nuclear Information System (INIS)

    Eom, Heung Seop; Chang, Seung Cheol

    2008-01-01

    Bayesian Nets (BN) have been used in many studies to predict software defects, because they allow all the evidence to be taken into account. However, one of the serious difficulties in earlier works was that the user had to build a different BN for each software development life cycle. This limits the practical use of BN in the field. One way to solve this problem is the use of general BN templates which are not restricted to a particular software life cycle. This paper describes a method for this purpose on the strength of Object-Oriented BN (OOBN) and Dynamic BN (DBN) techniques
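    The core of the approach, updating a belief about residual defects as evidence arrives, can be illustrated with a hand-rolled two-node example; the probabilities below are invented, and a real OOBN/DBN template would span many more nodes and life-cycle phases.

```python
# Two-node Bayesian updating illustration: a prior belief about a defective
# module is revised by successive test outcomes (Bayes' rule).
def posterior_defective(prior, p_fail_given_defective, p_fail_given_ok, test_failed):
    """Update P(module defective) after observing one test outcome."""
    if test_failed:
        like_def, like_ok = p_fail_given_defective, p_fail_given_ok
    else:
        like_def, like_ok = 1 - p_fail_given_defective, 1 - p_fail_given_ok
    numerator = like_def * prior
    evidence = numerator + like_ok * (1 - prior)
    return numerator / evidence

belief = 0.10                                    # prior from an earlier life-cycle phase
for outcome in [False, False, True]:             # two passing tests, then one failure
    belief = posterior_defective(belief, 0.7, 0.05, outcome)
    print(f"updated P(defective) = {belief:.3f}")
```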

  19. 7 CFR 277.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 277.1 Section 277.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... AGENCIES § 277.1 General purpose and scope. (a) Purpose. This part establishes uniform requirements for the...

  20. General purpose computers in real time

    International Nuclear Information System (INIS)

    Biel, J.R.

    1989-01-01

    I see three main trends in the use of general purpose computers in real time. The first is more processing power. The second is the use of higher speed interconnects between computers (allowing more data to be delivered to the processors). The third is the use of larger programs running in the computers. Although there is still work that needs to be done, I believe that all indications are that the general purpose computers needed for online use should be available for the SCC and LHC machines. 2 figs

  1. 7 CFR 253.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 253.1 Section 253.1... AGRICULTURE GENERAL REGULATIONS AND POLICIES-FOOD DISTRIBUTION ADMINISTRATION OF THE FOOD DISTRIBUTION PROGRAM FOR HOUSEHOLDS ON INDIAN RESERVATIONS § 253.1 General purpose and scope. This part describes the terms...

  2. 7 CFR 248.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 248.1 Section 248.1... AGRICULTURE CHILD NUTRITION PROGRAMS WIC FARMERS' MARKET NUTRITION PROGRAM (FMNP) General § 248.1 General purpose and scope. This part announces regulations under which the Secretary of Agriculture shall carry...

  3. 7 CFR 251.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 251.1 Section 251.1... AGRICULTURE GENERAL REGULATIONS AND POLICIES-FOOD DISTRIBUTION THE EMERGENCY FOOD ASSISTANCE PROGRAM § 251.1 General purpose and scope. This part announces the policies and prescribes the regulations necessary to...

  4. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  5. 7 CFR 1485.10 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false General purpose and scope. 1485.10 Section 1485.10... FOREIGN MARKETS FOR AGRICULTURAL COMMODITIES Market Access Program § 1485.10 General purpose and scope. (a.../Market Access Program (EIP/MAP). It also establishes the general terms and conditions applicable to MAP...

  6. 7 CFR 215.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 215.1 Section 215.1... AGRICULTURE CHILD NUTRITION PROGRAMS SPECIAL MILK PROGRAM FOR CHILDREN § 215.1 General purpose and scope. This part announces the policies and prescribes the general regulations with respect to the Special Milk...

  7. 7 CFR 226.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 226.1 Section 226.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS CHILD AND ADULT CARE FOOD PROGRAM General § 226.1 General purpose and...

  8. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
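    The load-prediction dynamic scheduling mentioned for setting (c) can be sketched roughly as below: work for the next step is split between the CPU cores and the GPU in proportion to the throughput each achieved previously. This is an illustrative reconstruction, not the authors' actual scheduler.

```python
# Rough sketch of load-prediction dynamic scheduling between CPU and GPU:
# the split for the next step is proportional to each device's measured rate.
def split_work(n_units, cpu_rate, gpu_rate):
    """Return (cpu_share, gpu_share) proportional to predicted throughputs."""
    gpu_share = round(n_units * gpu_rate / (cpu_rate + gpu_rate))
    return n_units - gpu_share, gpu_share

cpu_rate, gpu_rate = 1.0, 4.3          # units of work per ms, illustrative measurements
for step in range(3):
    cpu_units, gpu_units = split_work(2000, cpu_rate, gpu_rate)
    print(f"step {step}: CPU gets {cpu_units} units, GPU gets {gpu_units} units")
    # ... run the step, re-measure cpu_rate and gpu_rate, and repeat ...
```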

  9. Software architecture for a multi-purpose real-time control unit for research purposes

    Science.gov (United States)

    Epple, S.; Jung, R.; Jalba, K.; Nasui, V.

    2017-05-01

    A new, freely programmable, scalable control system for academic research purposes was developed. The intention was to have a control unit capable of handling multiple PT1000 temperature sensors at reasonable accuracy and temperature range, as well as digital input signals, while providing powerful output signals. To take full advantage of the system, control loops are run in real time. The whole eight-bit system, with very limited memory, runs independently of a personal computer. The two on-board RS232 connectors allow further units or other equipment to be connected, as required, in real time. This paper describes the software architecture for the third prototype, which now provides stable measurements and an improvement in accuracy compared to the previous designs. As a test case, a thermal solar system to produce hot tap water and assist heating in a single-family house was implemented. The solar fluid pump was power-controlled, and several temperatures at different points in the hydraulic system were measured and used in the control algorithms. The software architecture proved suitable for testing several different control strategies and their corresponding algorithms for the thermal solar system.
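    A small piece of the measurement chain described above, converting a PT1000 resistance reading to a temperature, might look like the sketch below. The linearised R = R0(1 + alpha*T) relation is the standard first-order platinum-RTD approximation; the actual firmware and its coefficients are not given in the abstract.

```python
# PT1000 resistance-to-temperature conversion using the linearised platinum
# RTD model. Higher accuracy would use the Callendar-Van Dusen equation.
R0 = 1000.0          # PT1000 resistance at 0 degC [ohm]
ALPHA = 3.85e-3      # mean temperature coefficient for platinum [1/K]

def pt1000_temperature(resistance_ohm):
    """Convert a measured PT1000 resistance to temperature in degC (linear model)."""
    return (resistance_ohm / R0 - 1.0) / ALPHA

for r in (1000.0, 1193.0, 1385.0):    # illustrative readings
    print(f"{r:.0f} ohm -> {pt1000_temperature(r):.1f} degC")
```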

  10. A Real-Time Programmer's Tour of General-Purpose L4 Microkernels

    Directory of Open Access Journals (Sweden)

    Ruocco Sergio

    2008-01-01

    L4-embedded is a microkernel successfully deployed in mobile devices with soft real-time requirements. It now faces the challenges of tightly integrated systems, in which user interface, multimedia, OS, wireless protocols, and even software-defined radios must run on a single CPU. In this paper we discuss the pros and cons of L4-embedded for real-time systems design, focusing on the issues caused by the extreme speed optimisations it inherited from its general-purpose ancestors. Since these issues can be addressed with a minimal performance loss, we conclude that, overall, the design of real-time systems based on L4-embedded is possible, and facilitated by a number of design features unique to microkernels and the L4 family.

  11. A Real-Time Programmer's Tour of General-Purpose L4 Microkernels

    Directory of Open Access Journals (Sweden)

    Sergio Ruocco

    2008-02-01

    L4-embedded is a microkernel successfully deployed in mobile devices with soft real-time requirements. It now faces the challenges of tightly integrated systems, in which user interface, multimedia, OS, wireless protocols, and even software-defined radios must run on a single CPU. In this paper we discuss the pros and cons of L4-embedded for real-time systems design, focusing on the issues caused by the extreme speed optimisations it inherited from its general-purpose ancestors. Since these issues can be addressed with a minimal performance loss, we conclude that, overall, the design of real-time systems based on L4-embedded is possible, and facilitated by a number of design features unique to microkernels and the L4 family.

  12. Radiobiology software for educational purpose

    International Nuclear Information System (INIS)

    Pandey, A.K.; Sharma, S.K.; Kumar, R.; Bal, C.S.; Nair, O.; Haresh, K.P.; Julka, P.K.

    2014-01-01

    To understand radionuclide therapy and the basis of radiation protection, it is essential to understand radiobiology. With limited time for classroom teaching and limited time and resources for radiobiology experiments, students do not acquire a firm grasp of the theoretical mathematical models or experimental knowledge of target theory and linear-quadratic models that explain the nature of cell survival curves. We believe that this issue might be addressed with numerical simulation of cell survival curves using mathematical models. Existing classroom teaching can be reoriented to cover the subject using the concepts of modeling, simulation and virtual experiments. After completion of the lecture, students can practice with the simulation tool at a convenient time. In this study we have developed software that can help students acquire a firm grasp of theoretical and experimental radiobiology. The software was developed using FreeMat ver. 4.0, an open-source package. Target theory, the linear-quadratic model, and cell killing based on the Poisson model have been included. The program structure displays a menu from which the user makes a choice, and program flow then depends on that choice. The program is executed by typing 'Radiobiology' on the command line interface. Students can interactively investigate the effect of radiation dose on cells. They can practice drawing the cell survival curve based on the input and output data, and they can also compare their hand-drawn graphs with graphs generated automatically by the program. This software is in an early stage of development and will evolve based on user feedback. We feel this simulation software will be quite useful for students entering the nuclear medicine, radiology and radiotherapy disciplines. (author)
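    The linear-quadratic model the software teaches, S(D) = exp(-(alpha*D + beta*D^2)), is easy to reproduce numerically; the sketch below uses illustrative alpha and beta values that are not taken from the paper.

```python
# Linear-quadratic cell-survival model: surviving fraction after a single dose.
import math

def survival_fraction(dose_gy, alpha=0.3, beta=0.03):
    """Surviving fraction S(D) = exp(-(alpha*D + beta*D^2)) for dose D in Gy."""
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

for d in (0, 2, 4, 6, 8):
    print(f"D = {d} Gy -> S = {survival_fraction(d):.3f}")
```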

  13. Non-intrusive Instance Level Software Composition

    NARCIS (Netherlands)

    Hatun, Kardelen

    2014-01-01

    A software system is composed of parts which interact through shared interfaces. Certain qualities of integration, such as loose coupling, requiring minimal changes to the software and fine-grained localisation of dependencies, have an impact on the overall software quality. Current general-purpose

  14. 7 CFR 1728.10 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 11 2010-01-01 2010-01-01 false General purpose and scope. 1728.10 Section 1728.10 Agriculture Regulations of the Department of Agriculture (Continued) RURAL UTILITIES SERVICE, DEPARTMENT OF AGRICULTURE ELECTRIC STANDARDS AND SPECIFICATIONS FOR MATERIALS AND CONSTRUCTION § 1728.10 General purpose and...

  15. Prospect on general software of Monte Carlo method

    International Nuclear Information System (INIS)

    Pei Lucheng

    1992-01-01

    This is a short paper on the prospects for general Monte Carlo software. The content covers the cluster sampling method, the zero-variance technique, the self-improved method, and the vectorized Monte Carlo method.

  16. General purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.

    1983-01-01

    A general-purpose computer code called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general purpose computer, operating on a vector describing the time dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations
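    The organisation described, a state vector operated on by a user-defined instruction set with conditional and iterative execution, can be illustrated with the toy example below; the decay process and state variables are invented for illustration and bear no relation to MONTHY's actual instruction set.

```python
# Toy "simulation computer": the state is a vector, and the program is a
# user-defined list of operations applied to it, repeated until a condition holds.
import random

state = {"t": 0.0, "n_particles": 1000}          # time-dependent state vector

def step_time(s):            # user-defined instruction: advance the clock
    s["t"] += 1.0

def decay(s):                # user-defined instruction: random decays this step
    survivors = sum(1 for _ in range(s["n_particles"]) if random.random() > 0.1)
    s["n_particles"] = survivors

program = [step_time, decay]                     # the user-defined instruction set

while state["n_particles"] > 100:                # conditional / iterative execution
    for instruction in program:
        instruction(state)

print(f"t = {state['t']:.0f}, particles left = {state['n_particles']}")
```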

  17. 7 CFR 245.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 245.1 Section 245.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... SCHOOLS § 245.1 General purpose and scope. (a) This part established the responsibilities of State...

  18. 7 CFR 220.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 220.1 Section 220.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS SCHOOL BREAKFAST PROGRAM § 220.1 General purpose and scope. This part...

  19. 7 CFR 235.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 235.1 Section 235.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS STATE ADMINISTRATIVE EXPENSE FUNDS § 235.1 General purpose and scope...

  20. 7 CFR 281.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false General purpose and scope. 281.1 Section 281.1 Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF... RESERVATIONS § 281.1 General purpose and scope. (a) These regulations govern the operation of the Food Stamp...

  1. Standalone General Purpose Data Logger Design and Implementation

    African Journals Online (AJOL)

    This paper describes the design of a general purpose data logger that is compatible with a variety of transducers, potentially permitting the measurement and recording of a wide range of phenomena. The recorded data can be retrieved to a PC via an RS-232 serial port. The standalone general purpose data logger ...

  2. Lenstronomy: Multi-purpose gravitational lens modeling software package

    Science.gov (United States)

    Birrer, Simon; Amara, Adam

    2018-04-01

    Lenstronomy is a multi-purpose open-source gravitational lens modeling python package. Lenstronomy reconstructs the lens mass and surface brightness distributions of strong lensing systems using forward modelling and supports a wide range of analytic lens and light models in arbitrary combination. The software is also able to reconstruct complex extended sources as well as point sources. Lenstronomy is flexible and numerically accurate, with a clear user interface that could be deployed across different platforms. Lenstronomy has been used to derive constraints on dark matter properties in strong lenses, measure the expansion history of the universe with time-delay cosmography, measure cosmic shear with Einstein rings, and decompose quasar and host galaxy light.

  3. Simrank: Rapid and sensitive general-purpose k-mer search tool

    Energy Technology Data Exchange (ETDEWEB)

    DeSantis, T.Z.; Keller, K.; Karaoz, U.; Alekseyenko, A.V; Singh, N.N.S.; Brodie, E.L; Pei, Z.; Andersen, G.L; Larsen, N.

    2011-04-01

    Terabyte-scale collections of string-encoded data are expected from consortia efforts such as the Human Microbiome Project (http://nihroadmap.nih.gov/hmp). Intra- and inter-project data similarity searches are enabled by rapid k-mer matching strategies. Software applications for sequence database partitioning, guide tree estimation, molecular classification and alignment acceleration have benefited from embedded k-mer searches as sub-routines. However, a rapid, general-purpose, open-source, flexible, stand-alone k-mer tool has not been available. Here we present a stand-alone utility, Simrank, which allows users to rapidly identify database strings the most similar to query strings. Performance testing of Simrank and related tools against DNA, RNA, protein and human-languages found Simrank 10X to 928X faster depending on the dataset. Simrank provides molecular ecologists with a high-throughput, open source choice for comparing large sequence sets to find similarity.
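    The k-mer matching strategy that Simrank embodies can be sketched in a few lines: query and database strings are reduced to sets of overlapping k-mers and ranked by the fraction of shared k-mers. Simrank's real data structures and scoring are considerably more elaborate.

```python
# Minimal k-mer matching sketch: rank database strings by shared k-mer fraction.
def kmers(seq, k=7):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def rank_matches(query, database, k=7):
    """Return database entries sorted by the fraction of query k-mers they share."""
    q = kmers(query, k)
    scores = {name: len(q & kmers(seq, k)) / len(q) for name, seq in database.items()}
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

db = {"seq_a": "ACGTACGTACGTTTGACA", "seq_b": "TTTTTTTTTTTTTTTTTT"}  # toy sequences
print(rank_matches("ACGTACGTACGTAAGACA", db))
```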

  4. General Purpose Data-Driven Online System Health Monitoring with Applications to Space Operations

    Science.gov (United States)

    Iverson, David L.; Spirkovska, Lilly; Schwabacher, Mark

    2010-01-01

    Modern space transportation and ground support system designs are becoming increasingly sophisticated and complex. Determining the health state of these systems using traditional parameter limit checking, or model-based or rule-based methods is becoming more difficult as the number of sensors and component interactions grows. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults, failures, or precursors of significant failures. The Inductive Monitoring System (IMS) is a general purpose, data-driven system health monitoring software tool that has been successfully applied to several aerospace applications and is under evaluation for anomaly detection in vehicle and ground equipment for next generation launch systems. After an introduction to IMS application development, we discuss these NASA online monitoring applications, including the integration of IMS with complementary model-based and rule-based methods. Although the examples presented in this paper are from space operations applications, IMS is a general-purpose health-monitoring tool that is also applicable to power generation and transmission system monitoring.
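    The data-driven monitoring idea behind IMS, comparing live data against characterizations of nominal behaviour, can be sketched as below; the clustering, distance measure and threshold are simplified stand-ins for IMS's actual algorithm.

```python
# Anomaly flagging against nominal clusters learned from training data.
nominal_clusters = [(20.0, 1.0), (55.0, 3.0)]     # (centre, radius) from nominal data
THRESHOLD = 2.0                                    # allowed distance beyond a radius

def is_anomalous(sample):
    """Flag a 1-D sample that lies outside every nominal cluster by more than THRESHOLD."""
    distances = [max(0.0, abs(sample - centre) - radius)
                 for centre, radius in nominal_clusters]
    return min(distances) > THRESHOLD

for value in (20.5, 57.0, 80.0):
    print(f"sample {value}: {'ANOMALY' if is_anomalous(value) else 'nominal'}")
```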

  5. 46 CFR 7.1 - General purpose of boundary lines.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false General purpose of boundary lines. 7.1 Section 7.1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY PROCEDURES APPLICABLE TO THE PUBLIC BOUNDARY LINES General § 7.1 General purpose of boundary lines. The lines in this part delineate the application of the...

  6. General-Purpose Data Containers for Science and Engineering

    International Nuclear Information System (INIS)

    2015-01-01

    In 2012 the SG38 international committee was formed to develop a modern structure to replace the ENDF-6 format for storing evaluated nuclear reaction data on a computer system. This committee divided the project into seven tasks. One of these tasks, the design of General-Purpose Data Containers (GPDCs), is described in this article. What type of data does SG38 need to store, and why is the task called General-Purpose Data Containers? The most common types of data in an evaluated nuclear reaction database are representations of physical functions in tabulated forms. There is also a need to store 1-dimensional functions using truncated Legendre, polynomial or other expansions. The phrase General-Purpose implies that the containers are to be designed to store generic forms of tabulated data rather than one form for each physical function. Also, where possible, it would be beneficial to design containers that can store data forms not currently used in evaluated nuclear databases, or that can at least be easily extended. In addition to containers for storing physical functions as tabulated data, other types of containers are needed. There is a desire within SG38 to support the storage of documentation at various levels within an evaluated file. Containers for storing non-functional data (e.g., a list of numbers) as well as units and labels for axes are also needed. Herein, containers for storing physical functions are called functional containers. One of the goals for the general-purpose data containers task is to design containers that will be useful to other scientific and engineering applications. To meet this goal, task members should think outside of the immediate needs of evaluated nuclear data to ensure that the containers are general-purpose rather than simply repackaged versions of existing containers. While the examples in this article may be specific to nuclear reaction data, it is hoped that the end product will be useful for other applications. To this end, some
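    A general-purpose tabulated-function container of the kind discussed might be sketched as follows: generic (x, y) points plus axis labels, units and documentation, independent of the physical quantity stored. The field names and interpolation rule are illustrative assumptions, not SG38's design.

```python
# Generic tabulated-function container: (x, y) pairs with units, labels and
# documentation, usable for any physical quantity.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class XYsContainer:
    x_label: str
    x_unit: str
    y_label: str
    y_unit: str
    points: List[Tuple[float, float]] = field(default_factory=list)
    documentation: str = ""

    def evaluate(self, x: float) -> float:
        """Linear interpolation between tabulated points (assumes sorted x)."""
        for (x0, y0), (x1, y1) in zip(self.points, self.points[1:]):
            if x0 <= x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        raise ValueError("x outside tabulated range")

cross_section = XYsContainer("incident energy", "eV", "cross section", "b",
                             points=[(1.0, 4.2), (10.0, 3.6), (100.0, 2.9)],
                             documentation="illustrative numbers only")
print(cross_section.evaluate(5.0))
```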

  7. EFOM 12C software: general overview. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Jadot, P; Fuchsova, J; Vankelecom, E; Van der Voort, E; Thonet, C

    1981-01-01

    A logic manual defining the general philosophy of the EC-12C software is presented. The guidelines used to develop the Energy Data Base and the programs of the energy flow models are: within the frame of some basic conventions, the data base structure and software should be as independent as possible from the energy system representation; utilization of the models should be user-friendly; as data have to be collected and manipulated by various national expert teams, extensive data consistency checks and appropriate error messages should be included in the software to support the data validation process; the various energy flow models should be integrated; the outputs of the programs should be user-controlled; and the scope of the study is under the user's control. As a result of those guidelines, an integrated set of software composed of DAMOCLES (Data Base Management System), SIML (simulation program suitable for data analysis and description study), and ORESTE EDISON (LP matrix generator and report writer) was developed. These are described.

  8. Inventory of safeguards software

    International Nuclear Information System (INIS)

    Suzuki, Mitsutoshi; Horino, Koichi

    2009-03-01

    This survey activity serves as a basis for determining what needs may exist in this arena for the development of next-generation safeguards systems and approaches. Twenty-three software tools were surveyed by JAEA and NMCC. Exchanging information on existing software tools for safeguards, and discussing a future R and D program for developing a general-purpose safeguards tool, should be beneficial to safeguards system design and is indispensable for evaluating a safeguards system for future nuclear fuel facilities. (author)

  9. General purpose programmable accelerator board

    Science.gov (United States)

    Robertson, Perry J.; Witzke, Edward L.

    2001-01-01

    A general purpose accelerator board and acceleration method comprising use of: one or more programmable logic devices; a plurality of memory blocks; bus interface for communicating data between the memory blocks and devices external to the board; and dynamic programming capabilities for providing logic to the programmable logic device to be executed on data in the memory blocks.

  10. The influence of human factor on security of software intended for educational purposes

    Directory of Open Access Journals (Sweden)

    Valeriy Valentinovich Gurov

    2016-06-01

    The report considers the construction and analysis of an attack tree for software tools intended for educational purposes, taking into account different groups of attackers. A security criterion for such tools is introduced.

  11. Accuracy of Surface Plate Measurements - General Purpose Software for Flatness Measurement

    NARCIS (Netherlands)

    Meijer, J.; Heuvelman, C.J.

    1990-01-01

    Flatness departures of surface plates are generally obtained from straightness measurements of lines on the surface. A computer program has been developed for on-line measurement and evaluation, based on the simultaneous coupling of measurements in all grid points. Statistical methods are used to
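    One common statistical treatment of such grid measurements is a least-squares reference plane with the flatness departure taken as the peak-to-valley residual; the sketch below shows that calculation with invented numbers, and is not necessarily the coupling method the program itself uses.

```python
# Least-squares reference plane through grid-point heights; flatness departure
# taken as the peak-to-valley residual. Data are invented for illustration.
import numpy as np

# (x, y, z) grid measurements in mm; z values are illustrative.
pts = np.array([[0, 0, 0.000], [100, 0, 0.004], [200, 0, 0.006],
                [0, 100, 0.002], [100, 100, 0.007], [200, 100, 0.009]])

A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)   # best-fit plane z = ax + by + c
residuals = pts[:, 2] - A @ coeffs
print(f"flatness departure (peak-to-valley) = {residuals.max() - residuals.min():.4f} mm")
```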

  12. Free software and free culture in Higher Education: pipelines and workflows for creative purposes

    OpenAIRE

    Gonçalves, Nelson; Figueiredo, Maria Pacheco

    2015-01-01

    OpenLab ESEV is a project of the School of Education of the Polytechnic Institute of Viseu (ESEV), Portugal, that aims to promote, foster and support the use of Free/Libre Software and Open Source Software, Open Educational Resources, Free Culture, Free file formats and more flexible copyright licenses for creative and educational purposes in the ESEV's domains of activity (education, arts, media). Most of the OpenLab ESEV activities are related to the teacher education and arts a...

  13. A general purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.; Rochester Univ., NY

    1984-01-01

    A general-purpose computer code MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility the code is organized like a general purpose computer, operating on a vector describing the time dependent state of the system under simulation. The instruction set of the 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations. (orig.)
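
    The organization described, a simulation engine that behaves like a general-purpose computer whose user-defined "instruction set" operates on a state vector, can be imagined with the following toy Python sketch; the names and the decay example are invented for illustration and this is not MONTHY itself:

      # Toy sketch: the engine only sequences user-defined "instructions" over a
      # state, supporting iterative execution until a stop condition holds.
      import random

      def decay_step(state):
          # user-defined instruction: stochastically decrement a population
          if state["n"] > 0 and random.random() < 0.1:
              state["n"] -= 1
          state["t"] += 1.0
          return state

      def run(state, instructions, until):
          """Apply the user-defined instruction set repeatedly until `until` is true."""
          while not until(state):
              for op in instructions:
                  state = op(state)
          return state

      final = run({"n": 100, "t": 0.0}, [decay_step], until=lambda s: s["n"] == 0)
      print(final["t"])   # elapsed simulated time when the population reached zero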

  14. Summary of JENDL-2 general purpose file

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [ed.

    1984-06-15

    The general purpose file of the second version of Japanese Evaluated Nuclear Data Library (JENDL-2) was released in December 1982. Recently, descriptive data were added to JENDL-2 and at the same time the first revision of numerical data was performed. JENDL-2 (Rev.1) consists of the data for 89 nuclides and about 211,000 records in the ENDF/B-IV format. In this report, full listings of presently added descriptive data are given to summarize the JENDL-2 general purpose file. The 2200-m/sec and 14-MeV cross sections, resonance integrals, Maxwellian and fission spectrum averaged cross sections are given in a table. Average cross sections were also calculated in suitable energy intervals.

  15. Summary of JENDL-2 general purpose file

    International Nuclear Information System (INIS)

    Nakagawa, Tsuneo

    1984-06-01

    The general purpose file of the second version of Japanese Evaluated Nuclear Data Library (JENDL-2) was released in December 1982. Recently, descriptive data were added to JENDL-2 and at the same time the first revision of numerical data was performed. JENDL-2 (Rev1) consists of the data for 89 nuclides and about 211,000 records in the ENDF/B-IV format. In this report, full listings of presently added descriptive data are given to summarize the JENDL-2 general purpose file. The 2200-m/sec and 14-MeV cross sections, resonance integrals, Maxwellian and fission spectrum averaged cross sections are given in a table. Average cross sections were also calculated in suitable energy intervals. (author)

  16. Analysis and optimization techniques for real-time streaming image processing software on general purpose systems

    NARCIS (Netherlands)

    Westmijze, Mark

    2018-01-01

    Commercial Off The Shelf (COTS) Chip Multi-Processor (CMP) systems are for cost reasons often used in industry for soft real-time stream processing. COTS CMP systems typically have a low timing predictability, which makes it difficult to develop software applications for these systems with tight

  17. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    Directory of Open Access Journals (Sweden)

    S. H. Oh

    2007-12-01

    We present software developed for the multi-purpose CCD camera. The software can be used with all three CCD types made by KODAK Co.: KAF-0401E (768×512), KAF-1602E (1536×1024) and KAF-3200E (2184×1472). For efficient CCD camera control, the software runs as two independent processes, a CCD control program and a temperature/shutter operation program. It is designed for fully automatic as well as manual operation under LINUX and is controlled through the LINUX user signal mechanism. We plan to use this software for an all-sky survey system and also for night sky monitoring and sky observation. The read-out times of the CCDs are about 15 s, 64 s and 134 s for the KAF-0401E, KAF-1602E and KAF-3200E respectively, because they are limited by the data transmission speed of the parallel port. Larger-format CCDs require higher-speed data transmission, so we are considering porting the control software to the USB port for high-speed data transfer.

  18. A General Purpose Feature Extractor for Light Detection and Ranging Data

    Directory of Open Access Journals (Sweden)

    Edwin B. Olson

    2010-11-01

    Feature extraction is a central step of processing Light Detection and Ranging (LIDAR) data. Existing detectors tend to exploit characteristics of specific environments: corners and lines from indoor (rectilinear) environments, and trees from outdoor environments. While these detectors work well in their intended environments, their performance in different environments can be poor. We describe a general purpose feature detector for both 2D and 3D LIDAR data that is applicable to virtually any environment. Our method adapts classic feature detection methods from the image processing literature, specifically the multi-scale Kanade-Tomasi corner detector. The resulting method is capable of identifying highly stable and repeatable features at a variety of spatial scales without knowledge of the environment, and produces principled uncertainty estimates and corner descriptors at the same time. We present results on both software simulation and standard datasets, including the 2D Victoria Park and Intel Research Center datasets, and the 3D MIT DARPA Urban Challenge dataset.

  19. A general purpose feature extractor for light detection and ranging data.

    Science.gov (United States)

    Li, Yangming; Olson, Edwin B

    2010-01-01

    Feature extraction is a central step of processing Light Detection and Ranging (LIDAR) data. Existing detectors tend to exploit characteristics of specific environments: corners and lines from indoor (rectilinear) environments, and trees from outdoor environments. While these detectors work well in their intended environments, their performance in different environments can be poor. We describe a general purpose feature detector for both 2D and 3D LIDAR data that is applicable to virtually any environment. Our method adapts classic feature detection methods from the image processing literature, specifically the multi-scale Kanade-Tomasi corner detector. The resulting method is capable of identifying highly stable and repeatable features at a variety of spatial scales without knowledge of the environment, and produces principled uncertainty estimates and corner descriptors at the same time. We present results on both software simulation and standard datasets, including the 2D Victoria Park and Intel Research Center datasets, and the 3D MIT DARPA Urban Challenge dataset.
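
    The multi-scale corner idea described in the two records above can be approximated, purely as an illustration and not as the authors' implementation, by rasterizing a 2D scan into an occupancy image and running OpenCV's Shi-Tomasi (Kanade-Tomasi) detector on a Gaussian pyramid. The rasterization parameters below are arbitrary, and the uncertainty estimates and descriptors of the original method are omitted:

      # Sketch of multi-scale Kanade-Tomasi corners on a rasterized 2D LIDAR scan.
      import numpy as np
      import cv2

      def lidar_corners(points_xy, resolution=0.05, levels=3):
          """points_xy: (N, 2) array of scan points in metres."""
          pts = (points_xy - points_xy.min(axis=0)) / resolution
          h, w = int(pts[:, 1].max()) + 1, int(pts[:, 0].max()) + 1
          img = np.zeros((h, w), dtype=np.uint8)
          img[pts[:, 1].astype(int), pts[:, 0].astype(int)] = 255

          corners = []
          for level in range(levels):
              feats = cv2.goodFeaturesToTrack(img, maxCorners=50,
                                              qualityLevel=0.05, minDistance=3)
              if feats is not None:
                  # undo pyramid scaling and rasterization to get coordinates in
                  # metres, relative to the scan's minimum corner
                  scale = resolution * (2 ** level)
                  corners += [(x * scale, y * scale) for [[x, y]] in feats]
              img = cv2.pyrDown(img)   # next (coarser) spatial scale
          return corners

      # Usage on synthetic data: an L-shaped wall produces corner features.
      wall = np.array([[x * 0.05, 0.0] for x in range(80)] +
                      [[0.0, y * 0.05] for y in range(80)])
      print(len(lidar_corners(wall)))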

  20. 7 CFR 227.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    ... management training of school foodservice personnel, and (c) the conduct of nutrition education activities in... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE CHILD NUTRITION PROGRAMS NUTRITION EDUCATION AND TRAINING PROGRAM General § 227.1 General purpose...

  1. 24 CFR 902.1 - Purpose and general description.

    Science.gov (United States)

    2010-04-01

    ... assessments. The Real Estate Assessment Center (REAC) is responsible for assessing and scoring the performance... uniform and objective protocols for the physical inspection of properties and the financial assessment of... URBAN DEVELOPMENT PUBLIC HOUSING ASSESSMENT SYSTEM General Provisions § 902.1 Purpose and general...

  2. DNA Processing and Reassembly on General Purpose FPGA-based Development Boards

    Directory of Open Access Journals (Sweden)

    SZÁSZ Csaba

    2017-05-01

    The great majority of researchers involved in microelectronics generally agree that many scientific challenges in the life sciences carry with them a powerful computational requirement that must be met before scientific progress can be made. The current trend in Deoxyribonucleic Acid (DNA) computing technologies is to develop special hardware platforms capable of providing the needed processing performance at lower cost. In this endeavor, FPGA-based (Field Programmable Gate Array) configurations aimed at accelerating genome sequencing and reassembly play a leading role. This paper emphasizes the benefits and advantages of using general purpose FPGA-based development boards in DNA reassembly applications, besides the special hardware architecture solutions. An original approach is unfolded which outlines the versatility of high-performance, ready-to-use manufacturer development platforms endowed with powerful hardware resources fully optimized for high-speed processing applications. The theoretical arguments are supported by an intuitive implementation example in which the designer is relieved of any hardware development effort and can concentrate exclusively on software design issues, providing greatly reduced application development cycles. The experiments prove that such boards available on the market are suitable to fulfil a wide range of DNA sequencing and reassembly applications.

  3. ''Sheiva'' : a general purpose multi-parameter data acquisition and processing system at VECC

    International Nuclear Information System (INIS)

    Viyogi, Y.P.; Ganguly, N.K.

    1982-01-01

    A general-purpose interactive software package, to be used with the PDP-15/76 on-line computer at VEC Centre for the acquisition and processing of data in nuclear physics experiments, is described. The program can accommodate a maximum of thirty-two inputs, although the present hardware limits the number of inputs to eight. Particular emphasis is given to the problems of flexibility and ease of operation, memory optimisation, and techniques dealing with experimenter-computer interaction. Various graphical methods for one- and two-dimensional data presentation are discussed. Specific problems of particle identification using detector telescopes have been dealt with carefully to handle experiments using several detector telescopes and those involving light particle-heavy particle coincidence studies. Steps needed to tailor this program towards utilisation for special experiments are also described. (author)

  4. Re-purposing commercial entertainment software for military use

    OpenAIRE

    DeBrine, Jeffrey D.; Morrow, Donald E.

    2000-01-01

    Approved for public release; distribution is unlimited Virtual environments have achieved widespread use in the military in applications such as theater planning, training, and architectural walkthroughs. These applications are generally expensive and inflexible in design and implementation. Re-purposing these applications to meet the dynamic modeling and simulation needs of the military can be awkward or impossible. Video games are designed to be both technologically advanced and flexible...

  5. General-purpose RFQ design program

    International Nuclear Information System (INIS)

    Wadlinger, E.A.

    1984-01-01

    We have written a general-purpose, radio-frequency quadrupole (RFQ) design program that allows maximum flexibility in picking design algorithms. This program optimizes the RFQ on any combination of design parameters while simultaneously satisfying mutually compatible, physically required constraint equations. It can be very useful for deriving various scaling laws for RFQs. This program has a friendly user interface in addition to checking the consistency of the user-defined requirements and is written to minimize the effort needed to incorporate additional constraint equations. We describe the program and present some examples
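
    The combination described, optimizing over a user-chosen set of design parameters while simultaneously satisfying constraint equations, corresponds to a standard constrained-optimization setup. The Python sketch below only shows the general shape of such a problem with SciPy; the objective, the constraint and the parameter names are placeholders, not the actual RFQ physics used by the program:

      # Generic sketch of "optimize on chosen design parameters subject to
      # physically required constraint equations"; relations are invented.
      import numpy as np
      from scipy.optimize import minimize

      def objective(p):
          vane_voltage, aperture = p
          return vane_voltage**2 + 10.0 / aperture      # placeholder figure of merit

      def constraint(p):
          vane_voltage, aperture = p
          return 2.0 * aperture - 0.01 * vane_voltage   # >= 0 enforces a made-up limit

      result = minimize(objective,
                        x0=np.array([50.0, 1.0]),
                        constraints=[{"type": "ineq", "fun": constraint}],
                        bounds=[(1.0, 100.0), (0.1, 5.0)])
      print(result.x)   # parameter set satisfying the constraint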

  6. A Small Acoustic Goniometer for General Purpose Research.

    Science.gov (United States)

    Pook, Michael L; Loo, Sin Ming

    2016-04-29

    Understanding acoustic events and monitoring their occurrence is a useful aspect of many research projects. In particular, acoustic goniometry allows researchers to determine the source of an event based solely on the sound it produces. The vast majority of acoustic goniometry research projects used custom hardware targeted to the specific application under test. Unfortunately, due to the wide range of sensing applications, a flexible general purpose hardware/firmware system does not exist for this purpose. This article focuses on the development of such a system which encourages the continued exploration of general purpose hardware/firmware and lowers barriers to research in projects requiring the use of acoustic goniometry. Simulations have been employed to verify system feasibility, and a complete hardware implementation of the acoustic goniometer has been designed and field tested. The results are reported, and suggested areas for improvement and further exploration are discussed.

  7. Component-based development of software language engineering tools

    NARCIS (Netherlands)

    Ssanyu, J.; Hemerik, C.

    2011-01-01

    In this paper we outline how Software Language Engineering (SLE) could benefit from Component-based Software Development (CBSD) techniques and present an architecture aimed at developing a coherent set of lightweight SLE components, fitting into a general-purpose component framework. In order to

  8. Integrating and Managing Bim in GIS, Software Review

    Science.gov (United States)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have increasingly leveraged these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS packages that provide tools for integrating building information in a geospatial context has risen sharply. More and more GIS software adds tools for this purpose, and other software projects regularly extend these tools. However, each package has its own strengths, weaknesses and intended use. This paper provides a thorough review to investigate these software capabilities and clarify their purpose. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format and the GIS software was then used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, which formats are supported (import/export), and the way building information is imported.

  9. VoxelMages: a general-purpose graphical interface for designing geometries and processing DICOM images for PENELOPE.

    Science.gov (United States)

    Giménez-Alventosa, V; Ballester, F; Vijande, J

    2016-12-01

    The design and construction of geometries for Monte Carlo calculations is an error-prone, time-consuming, and complex step in simulations describing particle interactions and transport in the field of medical physics. The software VoxelMages has been developed to help the user in this task. It allows the user to design complex geometries and to process DICOM image files for simulations with the general-purpose Monte Carlo code PENELOPE in an easy and straightforward way. VoxelMages also allows importing DICOM-RT structure contour information as delivered by a treatment planning system. Its main characteristics, usage and performance benchmarking are described in detail. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Efficient probabilistic model checking on general purpose graphic processors

    NARCIS (Netherlands)

    Bosnacki, D.; Edelkamp, S.; Sulewski, D.; Pasareanu, C.S.

    2009-01-01

    We present algorithms for parallel probabilistic model checking on general purpose graphic processing units (GPGPUs). For this purpose we exploit the fact that some of the basic algorithms for probabilistic model checking rely on matrix vector multiplication. Since this kind of linear algebraic
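
    The abstract is truncated, but the computational kernel it refers to, repeated matrix-vector multiplication, is easy to illustrate: transient-state probabilities of a discrete-time Markov chain after k steps are obtained by multiplying the initial distribution by the transition matrix k times. The NumPy sketch below only shows that structure on the CPU; running the same loop on GPGPU device arrays (for example with a GPU array library) is an assumption about the general approach, not the paper's code:

      # Minimal illustration of the matrix-vector kernel: pi_k = pi_0 * P^k for a
      # tiny 3-state discrete-time Markov chain with one absorbing state.
      import numpy as np

      P = np.array([[0.9, 0.1, 0.0],     # transition matrix
                    [0.0, 0.5, 0.5],
                    [0.0, 0.0, 1.0]])
      pi = np.array([1.0, 0.0, 0.0])      # start in state 0

      for _ in range(20):                  # 20 steps
          pi = pi @ P                      # the matrix-vector multiplication kernel

      print(pi[2])   # probability of having reached the absorbing state by step 20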

  11. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
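
    As a toy illustration of the kind of timeline such a simulator produces (the model and every parameter below are invented and far simpler than the generalized software reliability process described), one can Monte Carlo simulate anomaly injection and discovery week by week:

      # Toy reliability-timeline sketch: anomalies are injected as work products
      # are produced, and testing discovers and repairs them at a given rate.
      import random

      def simulate(weeks=52, inject_rate=3.0, find_prob=0.25, seed=1):
          random.seed(seed)
          open_faults, timeline = 0, []
          for week in range(weeks):
              injected = sum(1 for _ in range(10) if random.random() < inject_rate / 10)
              found = sum(1 for _ in range(open_faults) if random.random() < find_prob)
              open_faults += injected - found
              timeline.append((week, injected, found, open_faults))
          return timeline

      for week, injected, found, remaining in simulate()[:5]:
          print(week, injected, found, remaining)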

  12. TACO: a general-purpose tool for predicting cell-type-specific transcription factor dimers.

    Science.gov (United States)

    Jankowski, Aleksander; Prabhakar, Shyam; Tiuryn, Jerzy

    2014-03-19

    Cooperative binding of transcription factor (TF) dimers to DNA is increasingly recognized as a major contributor to binding specificity. However, it is likely that the set of known TF dimers is highly incomplete, given that they were discovered using ad hoc approaches, or through computational analyses of limited datasets. Here, we present TACO (Transcription factor Association from Complex Overrepresentation), a general-purpose standalone software tool that takes as input any genome-wide set of regulatory elements and predicts cell-type-specific TF dimers based on enrichment of motif complexes. TACO is the first tool that can accommodate motif complexes composed of overlapping motifs, a characteristic feature of many known TF dimers. Our method comprehensively outperforms existing tools when benchmarked on a reference set of 29 known dimers. We demonstrate the utility and consistency of TACO by applying it to 152 DNase-seq datasets and 94 ChIP-seq datasets. Based on these results, we uncover a general principle governing the structure of TF-TF-DNA ternary complexes, namely that the flexibility of the complex is correlated with, and most likely a consequence of, inter-motif spacing.

  13. Development of General Purpose Data Acquisition Shell (GPDAS)

    International Nuclear Information System (INIS)

    Chung, Y.; Kim, K.

    1995-01-01

    This note is intended as an abbreviated introduction to the concept and structure of the General Purpose Data Acquisition Shell (GPDAS) and assumes the reader has a certain level of familiarity with programming in general. The following sections consist of brief explanations of the concepts and commands of GPDAS, followed by several examples. Some of these are tabulated in the appendices at the end of this note.

  14. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  15. Evaluation of features to support safety and quality in general practice clinical software

    Science.gov (United States)

    2011-01-01

    Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  16. Towards a General Software Engineering Methodology for the Internet of Things

    OpenAIRE

    Zambonelli, Franco

    2016-01-01

    As research in the Internet of Things area progresses, and a multitude of proposals exist to solve a variety of problems, the need for a general principled software engineering approach for the systematic development of IoT systems and applications arises. In this paper, by synthesizing from the state of the art in the area, we attempt at framing the key concepts and abstractions that revolve around the design and development of IoT systems and applications, and draft a software engineering me...

  17. The effect of advertising in clinical software on general practitioners' prescribing behaviour.

    Science.gov (United States)

    Henderson, Joan; Miller, Graeme; Pan, Ying; Britt, Helena

    2008-01-07

    To assess the effect of pharmaceutical advertising embedded in clinical software on the prescribing behaviour of general practitioners. Secondary analysis of data from a random sample of 1336 Australian GPs who participated in Bettering the Evaluation and Care of Health, a national continuous cross-sectional survey of general practice activity, between November 2003 and March 2005. The prescribing behaviour of participants who used the advertising software was compared with that of participants who did not, for seven pharmaceutical products advertised continually throughout the study period. Prescriptions for an advertised product as a proportion (%) of prescriptions for all pharmaceutical products in the same generic class or group. GP age, practice location, accreditation status, patient bulk-billing status and hours worked were significantly associated with use of the advertising software. We found no significant differences, either before or after adjustment for these confounders, in the prescribing rate of Lipitor (adjusted odds ratio [AOR], 0.90; P = 0.26); Micardis (AOR, 0.98; P = 0.91); Mobic (AOR, 1.02; P = 0.89); Norvasc (AOR, 1.02; P = 0.91); Natrilix (AOR, 0.80; P = 0.32); or Zanidip (AOR, 0.88; P = 0.47). GPs using advertising software prescribed Nexium significantly less often than those not using advertising software (AOR, 0.78; P = 0.02). When all advertised products were combined and compared with products that were not advertised, no difference in the overall prescribing behaviour was demonstrated (AOR, 0.96; P = 0.42). Exposure to advertisements in clinical software has little influence on the prescribing behaviour of GPs.

  18. Basic guidelines to introduce electric circuit simulation software in a general physics course

    Science.gov (United States)

    Moya, A. A.

    2018-05-01

    The introduction of electric circuit simulation software for undergraduate students in a general physics course is proposed in order to contribute to the constructive learning of electric circuit theory. This work focuses on the lab exercises based on dc, transient and ac analysis of electric circuits found in introductory physics courses, and shows how students can use the simulation software to do simple activities associated with a lab exercise itself and with related topics. By introducing electric circuit simulation programs in a general physics course as a brief activity complementing a lab exercise, students develop basic skills in using simulation software, improve their knowledge of the topology of electric circuits and perceive that the technology contributes to their learning, all without reducing the time spent on the actual content of the course.
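
    A concrete example of the kind of brief complementary activity suggested above (component values chosen arbitrarily here) is to have students script the analytic RC charging transient and compare it with the simulator's transient analysis:

      # RC charging transient V(t) = V0 * (1 - exp(-t / (R * C))), to be compared
      # with the transient analysis of a circuit simulator. Values are arbitrary.
      import math

      V0, R, C = 5.0, 1e3, 1e-6            # 5 V source, 1 kOhm, 1 uF  ->  tau = 1 ms
      for t_ms in range(0, 6):
          t = t_ms * 1e-3
          v = V0 * (1 - math.exp(-t / (R * C)))
          print(f"t = {t_ms} ms  V = {v:.3f} V")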

  19. Perbandingan Kemampuan Embedded Computer dengan General Purpose Computer untuk Pengolahan Citra

    Directory of Open Access Journals (Sweden)

    Herryawan Pujiharsono

    2017-08-01

    Advances in computer technology have led to image processing being widely developed to assist people in many fields of work. However, not every field of work can be supported by image processing, because some do not allow the use of a computer; this has encouraged the development of image processing on microcontrollers or special-purpose microprocessors. Progress in microcontrollers and microprocessors now makes it possible to implement image processing on an embedded computer or single board computer (SBC). This study aims to test the image processing capability of an embedded computer and to compare the results with a general purpose computer. The tests measure the execution time of four image processing operations applied to ten image sizes. The results obtained in this study show that the execution-time optimization of the embedded computer is better than that of the general purpose computer, with the average execution time of the embedded computer being 4-5 times that of the general purpose computer, and the largest image size that does not overload the CPU being 256x256 pixels for the embedded computer and 400x300 pixels for the general purpose computer.

  20. Software qualification of selected TOUGH2 modules

    International Nuclear Information System (INIS)

    Wu, Y.S.; Ahlers, C.F.; Fraser, P.; Simmons, A.; Pruess, K.

    1996-10-01

    The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of the single-phase Gas (EOS1G), Effective Continuum Method (ECM), Saturated/Unsaturated Flow (EOS9), and Radionuclide Transport (T2R3D) modules of TOUGH2, a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. This report contains the following sections: (1) Requirements Specification, (2) Design Description, (3) Software Validation Test Plan and Report, (4) Software User Documentation, and (5) Appendices. These sections comprise sequential parts of the Software Life Cycle, and are not intended to stand alone but should be used in conjunction with the TOUGH User's Guide (Pruess, 1987), TOUGH2--A General Purpose Numerical Simulator for Multiphase Fluid and Heat Flow (Pruess, 1991), and the above-referenced TOUGH2 software qualification document. The qualification package is complete with the attached Software Identification Form and executable source code for the single-phase Gas, Effective Continuum method, Saturated/Unsaturated Flow, and Radionuclide Transport modules of TOUGH2

  1. Speech pattern recognition for forensic acoustic purposes

    OpenAIRE

    Herrera Martínez, Marcelo; Aldana Blanco, Andrea Lorena; Guzmán Palacios, Ana María

    2014-01-01

    The present paper describes the development of software for the analysis of acoustic voice parameters (APAVOIX), which can be used for forensic acoustic purposes based on speaker recognition and identification. The software makes it possible to observe clearly the parameters that are sufficient and necessary when comparing two voice signals, the suspect one and the original one. These parameters are used according to the classic method, generally used by state entit...

  2. Applications of artificial intelligence to space station: General purpose intelligent sensor interface

    Science.gov (United States)

    Mckee, James W.

    1988-01-01

    This final report describes the accomplishments of the General Purpose Intelligent Sensor Interface task of the Applications of Artificial Intelligence to Space Station grant for the period from October 1, 1987 through September 30, 1988. Portions of the First Biannual Report not revised will not be included but only referenced. The goal is to develop an intelligent sensor system that will simplify the design and development of expert systems using sensors of the physical phenomena as a source of data. This research will concentrate on the integration of image processing sensors and voice processing sensors with a computer designed for expert system development. The result of this research will be the design and documentation of a system in which the user will not need to be an expert in such areas as image processing algorithms, local area networks, image processor hardware selection or interfacing, television camera selection, voice recognition hardware selection, or analog signal processing. The user will be able to access data from video or voice sensors through standard LISP statements without any need to know about the sensor hardware or software.

  3. How General-Purpose can a GPU be?

    Directory of Open Access Journals (Sweden)

    Philip Machanick

    2015-12-01

    The use of graphics processing units (GPUs) in general-purpose computation (GPGPU) is a growing field. GPU instruction sets, while implementing a graphics pipeline, draw from a range of single instruction multiple datastream (SIMD) architectures characteristic of the heyday of supercomputers. Yet only one of these SIMD instruction sets has been applicable to a wide enough range of problems to survive the era when the full range of supercomputer design variants was being explored: vector instructions. This paper proposes a reconceptualization of the GPU as a multicore design with minimal exotic modes of parallelism so as to make GPGPU truly general.

  4. The purpose of the general practice consultation from the patients perspective - theoretical aspects

    DEFF Research Database (Denmark)

    Thorsen, Hanne; Witt, Klaus; Malterud, Kirsti

    2001-01-01

    Consultation purposes, general practice, patients' expectations, patient satisfaction, patient-centeredness.

  5. General Purpose Multimedia Dataset - GarageBand 2008

    DEFF Research Database (Denmark)

    Meng, Anders

    This document describes a general purpose multimedia data-set to be used in cross-media machine learning problems. In more detail, we describe the genre taxonomy applied at http://www.garageband.com, from where the data-set was collected, and how that taxonomy has been fused into a more human-understandable taxonomy. Finally, a description of various features extracted from both the audio and the text is presented.

  6. ENDF/B-4 General Purpose File 1974

    International Nuclear Information System (INIS)

    Schwerer, O.

    1980-04-01

    This document summarizes contents and documentation of the 1974 version of the General Purpose File of the ENDF/B Library maintained by the National Nuclear Data Center (NNDC) at the Brookhaven National Laboratory, USA. The Library contains numerical neutron reaction data for 90 isotopes or elements. The entire Library or selective retrievals from it can be obtained on magnetic tape from the IAEA Nuclear Data Section. (author)

  7. Modelling of a general purpose irradiation chamber using a Monte Carlo particle transport code

    International Nuclear Information System (INIS)

    Dhiyauddin Ahmad Fauzi; Sheik, F.O.A.; Nurul Fadzlin Hasbullah

    2013-01-01

    The aim of this research is to simulate the effective use of a general purpose irradiation chamber to contain pure neutron particles obtained from a research reactor. The secondary neutron and gamma particle doses discharged from the chamber layers will be used as a platform to estimate the safe dimensions of the chamber. The chamber, made up of layers of lead (Pb) shielding, polyethylene (PE) moderator and commercial grade aluminium (Al) cladding, is proposed for interacting samples with pure neutron particles in a nuclear reactor environment. The estimation was accomplished through simulation based on the general Monte Carlo N-Particle transport code, using the Los Alamos MCNPX software. Simulations were performed on a model of the chamber subjected to high neutron flux radiation and its gamma radiation product. The neutron source model is based on the neutron source found in the PUSPATI TRIGA MARK II research reactor, which holds a maximum flux value of 1 × 10¹² neutrons/cm² s. The expected outcomes of this research are zero gamma dose in the core of the chamber and a neutron dose rate of less than 10 μSv/day discharged from the chamber system. (author)

  8. Lipsey's Quest for the Micro-foundations of GPT - the General Purpose Engine

    NARCIS (Netherlands)

    Van der Kooij, B.J.G.

    2016-01-01

    The construct of the General Purpose Technology lacks a micro-foundation (as observed by Richard Lipsey). We present a possible solution in the General Purpose Engines: the basic innovations and the clusters of contributing and derived innovations that appear in a Schumpeterian 'cluster

  9. Software important to safety in nuclear power plants

    International Nuclear Information System (INIS)

    1994-01-01

    The report provides guidance on current practices, documenting their strengths and weaknesses for dealing with the important issues of software engineering that nuclear power plant system designers, software producers and regulators are facing. The focus of the report is on safety critical applications of general purpose processors controlled by custom developed software; however, it should also have application in safety related applications and for other types of computers. In addition to system designers, software producers and regulators, the intended readership of this report includes users of software based systems, who should be aware of the relevant issues in specifying and obtaining software for systems important to safety. Refs, 1 fig., tabs

  10. General Purpose Heat Source Simulator

    Science.gov (United States)

    Emrich, Bill

    2008-01-01

    The General Purpose Heat Source (GPHS) simulator project is designed to replicate, through the use of electrical heaters, the form, fit, and function of actual GPHS modules, which generate heat through the radioactive decay of Pu238. The use of electrically heated modules rather than modules containing Pu238 facilitates the testing of spacecraft subsystems and systems without sacrificing the quantity and quality of the test data gathered. Previous GPHS activities were centered around developing robust heater designs with sizes and weights that closely matched those of actual Pu238-fueled GPHS blocks. These efforts were successful, although their maximum temperature capabilities were limited to around 850 C. New designs are being pursued which also replicate the sizes and weights of actual Pu238-fueled GPHS blocks but will allow operation up to 1100 C.

  11. Computerized accounting for the dental office. Using horizontal applications general ledger software.

    Science.gov (United States)

    Garsson, B

    1988-01-01

    Remember that computer software is designed for accrual accounting, whereas your business operates and reports income on a cash basis. The rules of tax law stipulate that professional practices may use the cash method of accounting, but if accrual accounting is ever used to report taxable income the government may not permit a switch back to cash accounting. Therefore, always consider the computer as a bookkeeper, not a substitute for a qualified accountant. (Your accountant will have readily accessible payroll and general ledger data available for analysis and tax reports, thanks to the magic of computer processing.) Accounts Payable reports are interfaced with the general ledger and are of interest for transaction detail, open invoice and cash flow analysis, and for a record of payments by vendor. Payroll reports, including check register and withholding detail are provided and interfaced with the general ledger. The use of accounting software expands the use of in-office computers to areas beyond professional billing and insurance form generation. It simplifies payroll recordkeeping; maintains payables details; integrates payables, receivables, and payroll with general ledger files; provides instantaneous information on all aspects of the business office; and creates a continuous "audit-trail" following the entering of data. The availability of packaged accounting software allows the professional business office an array of choices. The person(s) responsible for bookkeeping and accounting should choose carefully, ensuring that any system is easy to use, has been thoroughly tested, and provides at least as much control over office records as has been outlined in this article.

  12. Increasing Open Source Software Integration on the Department of Defense Unclassified Desktop

    National Research Council Canada - National Science Library

    Schearer, Steven A

    2008-01-01

    .... While some of this expenditure goes to fund special-purpose military software, much of it is absorbed by license fees for computer operating systems and general-purpose office automation applications...

  13. Multiphysics software and the challenge to validating physical models

    International Nuclear Information System (INIS)

    Luxat, J.C.

    2008-01-01

    This paper discusses multiphysics software and the validation of physical models in the nuclear industry. The major challenge is to convert a general purpose software package into a robust application-specific solution. This requires greater knowledge of the underlying solution techniques and the limitations of the packages. Good user interfaces and neat graphics do not compensate for any deficiencies

  14. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software system to predict the fatigue lives of mechanical components and structures was developed. The software has several characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist any user in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the software, fatigue analyses for a SAE keyhole specimen and an automobile knuckle were carried out. The results were observed to agree well with those from commercial packages

  15. Geographical parthenogenesis: General purpose genotypes and frozen niche variation

    DEFF Research Database (Denmark)

    Vrijenhoek, Robert C.; Parker, Dave

    2009-01-01

    hypotheses concerning the evolution of niche breadth in asexual species - the "general-purpose genotype" (GPG) and "frozen niche-variation" (FNV) models. The two models are often portrayed as mutually exclusive, respectively viewing clonal lineages as generalists versus specialists. Nonetheless...

  16. Hybrid molecular–continuum methods: From prototypes to coupling software

    KAUST Repository

    Neumann, Philipp

    2014-02-01

    In this contribution, we review software requirements in hybrid molecular-continuum simulations. For this purpose, we analyze a prototype implementation which combines two frameworks-the Molecular Dynamics framework MarDyn and the framework Peano for spatially adaptive mesh-based simulations-and point out particular challenges of a general coupling software. Based on this analysis, we discuss the software design of our recently published coupling tool. We explain details on its overall structure and show how the challenges that arise in respective couplings are resolved by the software. © 2013 Elsevier Ltd. All rights reserved.

  17. Software-based Microarchitectural Attacks

    OpenAIRE

    Gruss, Daniel

    2017-01-01

    Modern processors are highly optimized systems where every single cycle of computation time matters. Many optimizations depend on the data that is being processed. Software-based microarchitectural attacks exploit effects of these optimizations. Microarchitectural side-channel attacks leak secrets from cryptographic computations, from general purpose computations, or from the kernel. This leakage even persists across all common isolation boundaries, such as processes, containers, and virtual ...

  18. Digital image processing software system using an array processor

    International Nuclear Information System (INIS)

    Sherwood, R.J.; Portnoff, M.R.; Journeay, C.H.; Twogood, R.E.

    1981-01-01

    A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table

  19. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Brun, R.; Couet, O.; Vandoni, C.E.; Zanarini, P.

    1990-01-01

    Visualization of scientific data, although a fashionable term in the world of computer graphics, is not a new invention; it is hundreds of years old. With the advent of computer graphics, the visualization of scientific data has become a well understood and widely used technology, with hundreds of applications in the most diverse fields, ranging from media applications to real scientific ones. In the present paper, we discuss the design concepts of systems for the visualization of scientific data, in particular in the specific field of High Energy Physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to High Energy Physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period of time, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, in which man-machine interaction and graphics play a key role (PAW, the Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  20. Speed Control of General Purpose Engine with Electronic Governor

    Science.gov (United States)

    Sawut, Umerujan; Tohti, Gheyret; Takigawa, Buso; Tsuji, Teruo

    This paper presents a general purpose engine speed control system with an electronic governor, intended to improve on the current system with a mechanical governor, which shows unstable characteristics as mechanical friction or the A/F ratio (Air/Fuel ratio) changes. Two problems arise for this control system: for cost reasons the only feedback signal is the crank angle, and the controlled object is a general purpose engine which is strongly nonlinear. In order to overcome these problems, a system model is presented for dynamic estimation of the amount of air flow, and a robust controller is designed. That is, the proposed system includes a robust sliding-mode controller using only crank-angle feedback, with a Genetic Algorithm applied to the controller design. Simulations and experiments with MATLAB/Simulink are performed to show the effectiveness of our proposal.
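
    The control law outlined above, a sliding-mode controller driven by the speed error reconstructed from crank-angle feedback, can be sketched generically as follows; the gains, the sign-function switching and the returned throttle correction are illustrative only and are unrelated to the paper's engine model or its GA-tuned values:

      # Generic sliding-mode speed controller sketch (not the paper's design):
      # the sliding surface combines the speed error and its derivative, and the
      # control input switches on the sign of the surface.
      def sliding_mode_throttle(speed_ref, speed, speed_prev, dt, lam=2.0, K=0.5):
          e = speed_ref - speed                       # current speed error
          e_prev = speed_ref - speed_prev             # previous speed error
          e_dot = (e - e_prev) / dt                   # error derivative (finite difference)
          s = e_dot + lam * e                         # sliding surface
          sign = 1.0 if s > 0 else (-1.0 if s < 0 else 0.0)
          return K * sign                             # switching throttle correction

      # Example step: engine at 2900 rpm, previously 2880 rpm, target 3000 rpm.
      print(sliding_mode_throttle(3000.0, 2900.0, 2880.0, dt=0.01))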

  1. Cafe Variome: general-purpose software for making genotype-phenotype data discoverable in restricted or open access contexts.

    Science.gov (United States)

    Lancaster, Owen; Beck, Tim; Atlan, David; Swertz, Morris; Thangavelu, Dhiwagaran; Veal, Colin; Dalgleish, Raymond; Brookes, Anthony J

    2015-10-01

    Biomedical data sharing is desirable, but problematic. Data "discovery" approaches-which establish the existence rather than the substance of data-precisely connect data owners with data seekers, and thereby promote data sharing. Cafe Variome (http://www.cafevariome.org) was therefore designed to provide a general-purpose, Web-based, data discovery tool that can be quickly installed by any genotype-phenotype data owner, or network of data owners, to make safe or sensitive content appropriately discoverable. Data fields or content of any type can be accommodated, from simple ID and label fields through to extensive genotype and phenotype details based on ontologies. The system provides a "shop window" in front of data, with main interfaces being a simple search box and a powerful "query-builder" that enable very elaborate queries to be formulated. After a successful search, counts of records are reported grouped by "openAccess" (data may be directly accessed), "linkedAccess" (a source link is provided), and "restrictedAccess" (facilitated data requests and subsequent provision of approved records). An administrator interface provides a wide range of options for system configuration, enabling highly customized single-site or federated networks to be established. Current uses include rare disease data discovery, patient matchmaking, and a Beacon Web service. © 2015 WILEY PERIODICALS, INC.

  2. Sense and purpose of bilateral agreements - a general survey

    International Nuclear Information System (INIS)

    Buehler, O.

    1999-01-01

    Switzerland has concluded with its neighbouring States several bilateral agreements on nuclear information and on mutual help in case of catastrophes. The following general survey explains the sense and purpose of these agreements, which complement the IAEA Conventions referring to the same matters. (orig.) [de]

  3. Weldability of general purpose heat source new-process iridium

    International Nuclear Information System (INIS)

    Kanne, W.R.

    1987-01-01

    Weldability tests on General Purpose Heat Source (GPHS) iridium capsules showed that a new iridium fabrication process reduced susceptibility to underbead cracking. Seventeen capsules were welded (a total of 255 welds) in four categories and the number of cracks in each weld was measured

  4. Transforming the ASDEX Upgrade discharge control system to a general-purpose plasma control platform

    International Nuclear Information System (INIS)

    Treutterer, Wolfgang; Cole, Richard; Gräter, Alexander; Lüddecke, Klaus; Neu, Gregor; Rapson, Christopher; Raupp, Gerhard; Zasche, Dieter; Zehetbauer, Thomas

    2015-01-01

    Highlights: • Control framework split in core and custom part. • Core framework deployable in other fusion device environments. • Adaptable through customizable modules, plug-in support and generic interfaces. - Abstract: The ASDEX Upgrade Discharge Control System DCS is a modern and mature product, originally designed to regulate and supervise ASDEX Upgrade Tokamak plasma operation. At its core, DCS is based on a generic, versatile real-time software framework with a plugin architecture that allows one to easily combine, modify and extend control function modules in order to tailor the system to required features and let it continuously evolve with the progress of an experimental fusion device. Due to these properties other fusion experiments like the WEST project have expressed interest in adopting DCS. For this purpose, essential parts of DCS must be unpinned from the ASDEX Upgrade environment by exposure or introduction of generalised interfaces. Re-organisation of DCS modules allows distinguishing between intrinsic framework core functions and device-specific applications. In particular, DCS must be prepared for deployment in different system environments with their own realisations for user interface, pulse schedule preparation, parameter server, time and event distribution, diagnostic and actuator systems, network communication and data archiving. The article explains the principles of the revised DCS structure, derives the necessary interface definitions and describes major steps to achieve the separation between the general-purpose framework and fusion-device-specific components.

  5. Transforming the ASDEX Upgrade discharge control system to a general-purpose plasma control platform

    Energy Technology Data Exchange (ETDEWEB)

    Treutterer, Wolfgang, E-mail: Wolfgang.Treutterer@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, 85748 Garching (Germany); Cole, Richard [Unlimited Computer Systems, Seeshaupter Str. 15, 82393 Iffeldorf (Germany); Gräter, Alexander [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, 85748 Garching (Germany); Lüddecke, Klaus [Unlimited Computer Systems, Seeshaupter Str. 15, 82393 Iffeldorf (Germany); Neu, Gregor; Rapson, Christopher; Raupp, Gerhard; Zasche, Dieter; Zehetbauer, Thomas [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, 85748 Garching (Germany)

    2015-10-15

    Highlights: • Control framework split in core and custom part. • Core framework deployable in other fusion device environments. • Adaptable through customizable modules, plug-in support and generic interfaces. - Abstract: The ASDEX Upgrade Discharge Control System DCS is a modern and mature product, originally designed to regulate and supervise ASDEX Upgrade Tokamak plasma operation. At its core, DCS is based on a generic, versatile real-time software framework with a plugin architecture that allows one to easily combine, modify and extend control function modules in order to tailor the system to required features and let it continuously evolve with the progress of an experimental fusion device. Due to these properties other fusion experiments like the WEST project have expressed interest in adopting DCS. For this purpose, essential parts of DCS must be unpinned from the ASDEX Upgrade environment by exposure or introduction of generalised interfaces. Re-organisation of DCS modules allows distinguishing between intrinsic framework core functions and device-specific applications. In particular, DCS must be prepared for deployment in different system environments with their own realisations for user interface, pulse schedule preparation, parameter server, time and event distribution, diagnostic and actuator systems, network communication and data archiving. The article explains the principles of the revised DCS structure, derives the necessary interface definitions and describes major steps to achieve the separation between the general-purpose framework and fusion-device-specific components.

  6. Report of the general purpose detector group

    International Nuclear Information System (INIS)

    Barbaro-Galtieri, A.; Bartel, W.; Bulos, F.; Cool, R.; Hanson, G.; Koetz, U.; Kottahaus, R.; Loken, S.; Luke, D.; Rothenberg, A.

    1975-01-01

    A general purpose detector for PEP is described. The main components of this detector are a 1 meter radius, 15 kilogauss superconducting solenoidal magnet with drift chambers to detect and measure the momentum of charged particles, a liquid argon neutral detector and hadron calorimeter, and a system of Cherenkov and time-of-flight counters for identification of charged hadrons. A major consideration in the design of this detector was that it be flexible: the magnet coil and drift chambers form a core around which various apparatus for specialized detection can be placed

  7. Discrete-Event Execution Alternatives on General Purpose Graphical Processing Units

    International Nuclear Information System (INIS)

    Perumalla, Kalyan S.

    2006-01-01

    Graphics cards, traditionally designed as accelerators for computer graphics, have evolved to support more general-purpose computation. General Purpose Graphical Processing Units (GPGPUs) are now being used as highly efficient, cost-effective platforms for executing certain simulation applications. While most of these applications belong to the category of time-stepped simulations, little is known about the applicability of GPGPUs to discrete event simulation (DES). Here, we identify some of the issues and challenges that the GPGPU stream-based interface raises for DES, and present some possible approaches to moving DES to GPGPUs. Initial performance results on simulation of a diffusion process show that DES-style execution on GPGPU runs faster than DES on CPU and also significantly faster than time-stepped simulations on either CPU or GPGPU.

  8. Cross-Language Support Mechanisms Significantly Aid Software Development

    DEFF Research Database (Denmark)

    Pfeiffer, Rolf-Helge; Wasowski, Andrzej

    2012-01-01

    Contemporary software systems combine many artifacts specified in various modeling and programming languages, domainspecific and general purpose as well. Since multi-language systems are so widespread, working on them calls for tools with cross-language support mechanisms such as (1) visualizatio...

  9. The Software Bus, an Object-Oriented Data Exchange System

    International Nuclear Information System (INIS)

    Akerbaek, T.; Louka, M.

    1996-01-01

    This document describes the Software Bus System, developed for object-oriented task-to-task communication in a TCP/IP based network. The Software Bus is a set of library functions, developed to be used with the Picasso-3 UIMS and as a general purpose tool for dynamically interfacing programs at run-time. The Software Bus offers a high-level, object-oriented data exchange mechanism that relieves the application programmer of low-level TCP/IP programming and communication protocol handling. The Software Bus is currently available on several UNIX platforms and a version for Windows NT is planned for late 1996. (author)

  10. General-purpose heat source development. Phase I: design requirements

    International Nuclear Information System (INIS)

    Snow, E.C.; Zocher, R.W.

    1978-09-01

    Studies have been performed to determine the necessary design requirements for a ²³⁸PuO₂ General-Purpose Heat Source (GPHS). Systems and missions applications, as well as accident conditions, were considered. The results of these studies, along with the recommended GPHS design requirements, are given in this report

  11. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  12. Some practical aspects of computer processing of uranium exploration data for environmental purposes

    International Nuclear Information System (INIS)

    Strumberger, V.; Miilojevic, M.; Strumberger, A.

    1997-01-01

    During a period of over 40 years an enormous amount of U exploration data has been accumulated. If specific requirements are met, this data can be reprocessed and used very efficiently for environmental purposes. Many IAEA Member States, where U exploration was carried out, are interested in using the data they possess for such purposes. The major difference is that the data is now intended for institutions that are engaged in environmental studies and not in uranium exploration. Moreover, the general interest of the public cannot be neglected. Therefore the data has to be presented with great care where different types of maps are probably one of the most significant forms. An important segment of the whole process is certainly computer data processing. Many countries have already carried out this process with the use of specialized software and modern hardware. Unfortunately many IAEA Member States - government institutions engaged in uranium exploration - are not equipped with the adequate (expensive) hardware and software and very often do not have the funds for this. The presented paper deals with some practical aspects of computer data processing from the initial data input (database) phase to the production of maps but with ''general purpose'' software that can be acquired with a minimum of expenses. It is worth mentioning that the IAEA has supplied many Member States with software and hardware that can be used immediately for this purpose. Preliminary processing and presentation of uranium exploration data for environmental purposes, with the available hardware and software, would certainly be of great benefit to the corresponding institutions and the whole country. (author)

  13. Design of low-cost general purpose microcontroller based neuromuscular stimulator.

    Science.gov (United States)

    Koçer, S; Rahmi Canal, M; Güler, I

    2000-04-01

    In this study, a general purpose, low-cost, programmable, portable and high performance stimulator is designed and implemented. For this purpose, a microcontroller is used in the design of the stimulator. The duty cycle and amplitude of the designed system can be controlled using a keyboard. The performance test of the system has shown that the results are reliable. The overall system can be used as the neuromuscular stimulator under safe conditions.

  14. On the characterization and software implementation of general protein lattice models.

    Directory of Open Access Journals (Sweden)

    Alessio Bechini

    Full Text Available Lattice models of proteins have been widely used as a practical means to computationally investigate general properties of the system. In lattice models any sterically feasible conformation is represented as a self-avoiding walk on a lattice, and residue types are limited in number. So far, only two- or three-dimensional lattices have been used. The inspection of the neighborhood of alpha carbons in the core of real proteins reveals that lattices with higher coordination numbers, possibly in higher dimensional spaces, can also be adopted. In this paper, a new general parametric lattice model for simplified protein conformations is proposed and investigated. It is shown how the supporting software can be consistently designed to let algorithms that operate on protein structures be implemented in a lattice-agnostic way. The necessary theoretical foundations are developed and organically presented, pinpointing the role of the concept of main directions in lattice-agnostic model handling. Subsequently, the model features across dimensions and lattice types are explored in tests performed on benchmark protein sequences, using a Python implementation. Simulations give insights on the use of square and triangular lattices in a range of dimensions. The trend of potential minimum for sequences of different lengths, varying the lattice dimension, is uncovered. Moreover, an extensive quantitative characterization of the usage of the so-called "move types" is reported for the first time. The proposed general framework for the development of lattice models is simple yet complete, and an object-oriented architecture can be proficiently employed for the supporting software, by designing ad-hoc classes. The proposed framework represents a new general viewpoint that potentially subsumes a number of solutions previously studied. The adoption of the described model pushes to look at protein structure issues from a more general and essential perspective, making
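
    The following toy sketch (not the paper's software) illustrates the lattice-agnostic idea: a conformation is grown as a self-avoiding walk, and the lattice is described only by its set of main directions, so the same code covers square lattices in any dimension.

```python
# Toy illustration of a lattice-agnostic self-avoiding walk: the lattice is
# described only by its set of "main directions", so the same code covers
# square lattices in any dimension. Not the software described in the paper.
import random

def main_directions(dim):
    """Unit steps of a dim-dimensional square lattice (2*dim neighbours)."""
    dirs = []
    for axis in range(dim):
        for sign in (+1, -1):
            step = [0] * dim
            step[axis] = sign
            dirs.append(tuple(step))
    return dirs

def random_conformation(n_residues, dim=2, max_tries=1000):
    """Grow a self-avoiding walk of n_residues sites, restarting on dead ends."""
    dirs = main_directions(dim)
    for _ in range(max_tries):
        walk = [tuple([0] * dim)]
        occupied = {walk[0]}
        while len(walk) < n_residues:
            moves = [tuple(a + b for a, b in zip(walk[-1], d)) for d in dirs]
            moves = [m for m in moves if m not in occupied]
            if not moves:          # dead end: abandon this attempt
                break
            site = random.choice(moves)
            walk.append(site)
            occupied.add(site)
        if len(walk) == n_residues:
            return walk
    raise RuntimeError("no self-avoiding walk found")

print(random_conformation(20, dim=3))
```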

  15. Low Overhead Real-Time Computing With General Purpose Operating Systems

    National Research Council Canada - National Science Library

    Raymond, Michael

    2004-01-01

    .... In larger systems and more recently, general-purpose operating systems such as SGI IRIX and Linux are used for new projects because they already have multiprocessor and device driver support as well as a large user base...

  16. Incremental and developmental perspectives for general-purpose learning systems

    Directory of Open Access Journals (Sweden)

    Fernando Martínez-Plumed

    2017-02-01

    Full Text Available The stupefying success of Artificial Intelligence (AI) for specific problems, from recommender systems to self-driving cars, has not yet been matched by similar progress in general AI systems that cope with a variety of different problems. This dissertation deals with the long-standing problem of creating more general AI systems, through the analysis of their development and the evaluation of their cognitive abilities. It presents a declarative general-purpose learning system and a developmental and lifelong approach for knowledge acquisition, consolidation and forgetting. It also analyses the use of more ability-oriented evaluation techniques for AI evaluation and provides further insight into the concepts of development and incremental learning in AI systems.

  17. 21 CFR 862.2050 - General purpose laboratory equipment labeled or promoted for a specific medical use.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false General purpose laboratory equipment labeled or... TOXICOLOGY DEVICES Clinical Laboratory Instruments § 862.2050 General purpose laboratory equipment labeled or promoted for a specific medical use. (a) Identification. General purpose laboratory equipment labeled or...

  18. DAE Tools: equation-based object-oriented modelling, simulation and optimisation software

    Directory of Open Access Journals (Sweden)

    Dragan D. Nikolić

    2016-04-01

    Full Text Available In this work, DAE Tools modelling, simulation and optimisation software, its programming paradigms and main features are presented. The current approaches to mathematical modelling such as the use of modelling languages and general-purpose programming languages are analysed. The common set of capabilities required by the typical simulation software are discussed, and the shortcomings of the current approaches recognised. A new hybrid approach is introduced, and the modelling languages and the hybrid approach are compared in terms of the grammar, compiler, parser and interpreter requirements, maintainability and portability. The most important characteristics of the new approach are discussed, such as: (1) support for the runtime model generation; (2) support for the runtime simulation set-up; (3) support for complex runtime operating procedures; (4) interoperability with third party software packages (i.e. NumPy/SciPy); (5) suitability for embedding and use as a web application or software as a service; and (6) code-generation, model exchange and co-simulation capabilities. The benefits of an equation-based approach to modelling, implemented in a fourth generation object-oriented general purpose programming language such as Python, are discussed. The architecture and the software implementation details as well as the type of problems that can be solved using DAE Tools software are described. Finally, some applications of the software at different levels of abstraction are presented, and its embedding capabilities and suitability for use as software as a service are demonstrated.
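
    As an illustration of the appeal of embedding equation-based models in a general-purpose language, the minimal sketch below builds a small model at run time in plain Python with SciPy; it uses an ordinary draining-tank equation and is not the DAE Tools API.

```python
# Minimal sketch (plain SciPy, not the DAE Tools API) of building an
# equation-based model at run time in a general-purpose language:
# a draining-tank model assembled from parameters supplied at run time.
import numpy as np
from scipy.integrate import solve_ivp

def make_tank_model(area, outflow_coeff):
    """Return dh/dt for a tank of cross-section `area` with a free outflow."""
    def rhs(t, y, inflow):
        h = max(y[0], 0.0)
        return [(inflow - outflow_coeff * np.sqrt(h)) / area]
    return rhs

model = make_tank_model(area=1.5, outflow_coeff=0.3)      # runtime set-up
sol = solve_ivp(model, (0.0, 200.0), [2.0], args=(0.1,), max_step=1.0)
print(f"final level: {sol.y[0, -1]:.3f} m")
```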

  19. The Application Of Open-Source And Free Photogrammetric Software For The Purposes Of Cultural Heritage Documentation

    Directory of Open Access Journals (Sweden)

    Bartoš Karol

    2014-07-01

    Full Text Available The documentation of cultural heritage is an essential part of the appropriate care of historical monuments, which represent a part of our history. At present it is a topical issue for which considerable funds are being spent, including the documentation of immovable historical monuments in the form of castle ruins, among others. Non-contact surveying technologies - terrestrial laser scanning and digital photogrammetry - are among the most commonly used technologies by which suitable documentation can be obtained; however, their use may be very costly. In recent years, various types of software products and web services based on the SfM (or MVS) method, developed as open-source software or offered as freely available services and relying on the basic principles of photogrammetry and computer vision, have started to come into the spotlight. Using these services and software, acquired digital images of a given object can be processed into a point cloud, serving directly as a final output or as a basis for further processing. The aim of this paper, based on images of various objects of the Slanec castle ruins obtained with a Pentax K5 DSLR, is to assess the suitability of different types of open-source and free software and free web services, and their reliability in terms of surface reconstruction and photo-texture quality, for the purposes of castle ruins documentation.

  20. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Sidky, Hythem [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Colón, Yamil J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Helfferich, Julian [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Steinbuch Center for Computing, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen, Germany; Sikora, Benjamin J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Bezik, Cody [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Chu, Weiwei [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Giberti, Federico [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Guo, Ashley Z. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Jiang, Xikai [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Lequieu, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Li, Jiyuan [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Moller, Joshua [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Quevillon, Michael J. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Rahimi, Mohammad [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Ramezani-Dakhel, Hadi [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Department of Biochemistry and Molecular Biology, University of Chicago, Chicago, Illinois 60637, USA; Rathee, Vikramjit S. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; Reid, Daniel R. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Sevgen, Emre [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Thapar, Vikram [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Webb, Michael A. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA; Whitmer, Jonathan K. [Department of Chemical and Biomolecular Engineering, University of Notre Dame, Notre Dame, Indiana 46556, USA; de Pablo, Juan J. [Institute for Molecular Engineering, University of Chicago, Chicago, Illinois 60637, USA; Institute for Molecular Engineering and Materials Science Division, Argonne National Laboratory, Lemont, Illinois 60439, USA

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  1. The Generalized Support Software (GSS) Domain Engineering Process: An Object-Oriented Implementation and Reuse Success at Goddard Space Flight Center

    Science.gov (United States)

    Condon, Steven; Hendrick, Robert; Stark, Michael E.; Steger, Warren

    1997-01-01

    The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions

  2. On the System and Engineering Design of the General Purpose ...

    Indian Academy of Sciences (India)

    On the System and Engineering Design of the General Purpose Electronic Digital Computer at TIFR. Rangaswamy Narasimhan. Resonance – Journal of Science Education, Classics, Volume 13, Issue 5, May 2008, pp. 490-501 ...

  3. New Generation General Purpose Computer (GPC) compact IBM unit

    Science.gov (United States)

    1991-01-01

    New Generation General Purpose Computer (GPC) compact IBM unit replaces a two-unit earlier generation computer. The new IBM unit is documented in table top views alone (S91-26867, S91-26868), with the onboard equipment it supports including the flight deck CRT screen and keypad (S91-26866), and next to the two earlier versions it replaces (S91-26869).

  4. An ontology based trust verification of software license agreement

    Science.gov (United States)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When users install or download software, they are presented with a large document stating rights and obligations, which many people lack the patience to read or understand. This may make users distrust the software. In this paper, we propose an ontology-based verification of Software License Agreements. First, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed with the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and can also serve as a visualization of software licenses. Based on this ontology, a software-license-oriented text summarization approach is proposed, and its performance shows that it can improve the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  5. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  6. SNAP: A General Purpose Network Analysis and Graph Mining Library.

    Science.gov (United States)

    Leskovec, Jure; Sosič, Rok

    2016-10-01

    Large networks are becoming a widely used abstraction for studying complex systems in a broad set of disciplines, ranging from social network analysis to molecular biology and neuroscience. Despite an increasing need to analyze and manipulate large networks, only a limited number of tools are available for this task. Here, we describe Stanford Network Analysis Platform (SNAP), a general-purpose, high-performance system that provides easy to use, high-level operations for analysis and manipulation of large networks. We present SNAP functionality, describe its implementational details, and give performance benchmarks. SNAP has been developed for single big-memory machines and it balances the trade-off between maximum performance, compact in-memory graph representation, and the ability to handle dynamic graphs where nodes and edges are being added or removed over time. SNAP can process massive networks with hundreds of millions of nodes and billions of edges. SNAP offers over 140 different graph algorithms that can efficiently manipulate large graphs, calculate structural properties, generate regular and random graphs, and handle attributes and meta-data on nodes and edges. Besides being able to handle large graphs, an additional strength of SNAP is that networks and their attributes are fully dynamic, they can be modified during the computation at low cost. SNAP is provided as an open source library in C++ as well as a module in Python. We also describe the Stanford Large Network Dataset, a set of social and information real-world networks and datasets, which we make publicly available. The collection is a complementary resource to our SNAP software and is widely used for development and benchmarking of graph analytics algorithms.
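
    A typical Snap.py session looks roughly like the sketch below; the function names follow the commonly documented Python API, but exact names and signatures may differ between releases, so treat it as illustrative.

```python
# Sketch of typical Snap.py usage; the calls follow the commonly documented
# API but may differ between releases, so treat this as illustrative only.
import snap

# Generate a random directed graph with 1000 nodes and 5000 edges.
graph = snap.GenRndGnm(snap.PNGraph, 1000, 5000)

# A structural property: the (sampled) clustering coefficient.
print("clustering coefficient:", snap.GetClustCf(graph, -1))

# Graphs are dynamic: nodes and edges can be added or removed at low cost.
new_id = graph.AddNode(-1)        # -1 asks SNAP to pick an unused node id
graph.AddEdge(new_id, 0)
```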

  7. Towards a general object-oriented software development methodology

    Science.gov (United States)

    Seidewitz, ED; Stark, Mike

    1986-01-01

    Object diagrams were used to design a 5000 statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000 statement Ada system and a personal computer based system that will be written in Modula II. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life-cycle, such as specification and testing, is being studied concurrently.

  8. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  9. UFMulti: A new parallel processing software system for HEP

    Science.gov (United States)

    Avery, Paul; White, Andrew

    1989-12-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.

  10. UFMULTI: A new parallel processing software system for HEP

    International Nuclear Information System (INIS)

    Avery, P.; White, A.

    1989-01-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future. (orig.)

  11. Usability Testing for Developing Effective Interactive Multimedia Software: Concepts, Dimensions, and Procedures

    Directory of Open Access Journals (Sweden)

    Sung Heum Lee

    1999-04-01

    Full Text Available Usability testing is a dynamic process that can be used throughout the development of interactive multimedia software. The purpose of usability testing is to find problems and make recommendations to improve the utility of a product during its design and development. For developing effective interactive multimedia software, dimensions of usability testing were classified into the general categories of: learnability; performance effectiveness; flexibility; error tolerance and system integrity; and user satisfaction. In the process of usability testing, evaluation experts consider the nature of users and tasks, tradeoffs supported by the iterative design paradigm, and real world constraints to effectively evaluate and improve interactive multimedia software. Different methods address different purposes and involve a combination of user and usability testing; however, usability practitioners follow seven general procedures of usability testing for effective multimedia development. As the knowledge about usability testing grows, evaluation experts will be able to choose more effective and efficient methods and techniques that are appropriate to their goals.

  12. SSAGES: Software Suite for Advanced General Ensemble Simulations

    Science.gov (United States)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  13. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  14. Using the general-purpose reactivity indicator: challenging examples.

    Science.gov (United States)

    Anderson, James S M; Melin, Junia; Ayers, Paul W

    2016-03-01

    We elucidate the regioselectivity of nucleophilic attack on substituted benzenesulfonates, quinolines, and pyridines using a general-purpose reactivity indicator (GPRI) for electrophiles. We observe that the GPRI is most accurate when the incoming nucleophile resembles a point charge. We further observe that the GPRI often chooses reactive "dead ends" as the most reactive sites as well as sterically hindered reactive sites. This means that care must be taken to remove sites that are inherently unreactive. Generally, among sites where reactions actually occur, the GPRI identifies the sites in the molecule that lead to the kinetically favored product(s). Furthermore, the GPRI can discern which sites react with hard reagents and which sites react with soft reagents. Because it is currently impossible to use the mathematical framework of conceptual DFT to identify sterically inaccessible sites and reactive dead ends, the GPRI is primarily useful as an interpretative, not a predictive, tool.

  15. Software environment and configuration for the DSP controlled NSLS booster power supplies

    International Nuclear Information System (INIS)

    Olsen, R.; Dabrowski, J.; Murray, J.

    1993-01-01

    The booster at the NSLS is being upgraded from 0.75 to 2 pulses per second by means of the installation of new dipole, quadrupole, and sextupole power supplies. The control system of these power supplies employs general purpose digital signal processing modules, and therefore, software support is required. This paper outlines the development system configuration, and the software environment

  16. SRAC95; general purpose neutronics code system

    International Nuclear Information System (INIS)

    Okumura, Keisuke; Tsuchihashi, Keichiro; Kaneko, Kunio.

    1996-03-01

    SRAC is a general purpose neutronics code system applicable to core analyses of various types of reactors. Since the publication of JAERI-1302 for the revised SRAC in 1986, a number of additions and modifications have been made for nuclear data libraries and programs. Thus, the new version SRAC95 has been completed. The system consists of six kinds of nuclear data libraries(ENDF/B-IV, -V, -VI, JENDL-2, -3.1, -3.2), five modular codes integrated into SRAC95; collision probability calculation module (PIJ) for 16 types of lattice geometries, Sn transport calculation modules(ANISN, TWOTRAN), diffusion calculation modules(TUD, CITATION) and two optional codes for fuel assembly and core burn-up calculations(newly developed ASMBURN, revised COREBN). In this version, many new functions and data are implemented to support nuclear design studies of advanced reactors, especially for burn-up calculations. SRAC95 is available not only on conventional IBM-compatible computers but also on scalar or vector computers with the UNIX operating system. This report is the SRAC95 users manual which contains general description, contents of revisions, input data requirements, detail information on usage, sample input data and list of available libraries. (author)

  17. SRAC95; general purpose neutronics code system

    Energy Technology Data Exchange (ETDEWEB)

    Okumura, Keisuke; Tsuchihashi, Keichiro [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Kaneko, Kunio

    1996-03-01

    SRAC is a general purpose neutronics code system applicable to core analyses of various types of reactors. Since the publication of JAERI-1302 for the revised SRAC in 1986, a number of additions and modifications have been made for nuclear data libraries and programs. Thus, the new version SRAC95 has been completed. The system consists of six kinds of nuclear data libraries(ENDF/B-IV, -V, -VI, JENDL-2, -3.1, -3.2), five modular codes integrated into SRAC95; collision probability calculation module (PIJ) for 16 types of lattice geometries, Sn transport calculation modules(ANISN, TWOTRAN), diffusion calculation modules(TUD, CITATION) and two optional codes for fuel assembly and core burn-up calculations(newly developed ASMBURN, revised COREBN). In this version, many new functions and data are implemented to support nuclear design studies of advanced reactors, especially for burn-up calculations. SRAC95 is available not only on conventional IBM-compatible computers but also on scalar or vector computers with the UNIX operating system. This report is the SRAC95 users manual which contains general description, contents of revisions, input data requirements, detail information on usage, sample input data and list of available libraries. (author).

  18. A general-purpose development environment for intelligent computer-aided training systems

    Science.gov (United States)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  19. 78 FR 65300 - Notice of Availability (NOA) for General Purpose Warehouse and Information Technology Center...

    Science.gov (United States)

    2013-10-31

    ... (NOA) for General Purpose Warehouse and Information Technology Center Construction (GPW/IT)--Tracy Site... proposed action to construct a General Purpose Warehouse and Information Technology Center at Defense..., Suite 02G09, Alexandria, VA 22350- 3100. FOR FURTHER INFORMATION CONTACT: Ann Engelberger at (703) 767...

  20. Generalized Maintenance Trainer Simulator: Development of Hardware and Software. Final Report.

    Science.gov (United States)

    Towne, Douglas M.; Munro, Allen

    A general purpose maintenance trainer, which has the potential to simulate a wide variety of electronic equipments without hardware changes or new computer programs, has been developed and field tested by the Navy. Based on a previous laboratory model, the Generalized Maintenance Trainer Simulator (GMTS) is a relatively low cost trainer that…

  1. Basic Guidelines to Introduce Electric Circuit Simulation Software in a General Physics Course

    Science.gov (United States)

    Moya, A. A.

    2018-01-01

    The introduction of electric circuit simulation software for undergraduate students in a general physics course is proposed in order to contribute to the constructive learning of electric circuit theory. This work focuses on the lab exercises based on dc, transient and ac analysis in electric circuits found in introductory physics courses, and…

  2. Unix Philosophy and the Real World: Control Software for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Neil Thomas Dantam

    2016-03-01

    Full Text Available Robot software combines the challenges of general purpose and real-time software, requiring complex logic and bounded resource use. Physical safety, particularly for dynamic systems such as humanoid robots, depends on correct software. General purpose computation has converged on unix-like operating systems -- standardized as POSIX, the Portable Operating System Interface -- for devices from cellular phones to supercomputers. The modular, multi-process design typical of POSIX applications is effective for building complex and reliable software. Absent from POSIX, however, is an interprocess communication mechanism that prioritizes newer data as typically desired for control of physical systems. We address this need in the Ach communication library which provides suitable semantics and performance for real-time robot control. Although initially designed for humanoid robots, Ach has broader applicability to complex mechatronic devices -- humanoid and otherwise -- that require real-time coupling of sensors, control, planning, and actuation. The initial user space implementation of Ach was limited in the ability to receive data from multiple sources. We remove this limitation by implementing Ach as a Linux kernel module, enabling Ach's high-performance and latest-message-favored semantics within conventional POSIX communication pipelines. We discuss how these POSIX interfaces and design principles apply to robot software, and we present a case study using the Ach kernel module for communication on the Baxter robot.
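
    The latest-message-favored semantics can be pictured with the small Python sketch below; it is a conceptual illustration of a channel in which a slow reader always sees the newest sample, and is not the Ach C API itself.

```python
# Conceptual sketch (not the Ach C API) of latest-message-favoured semantics:
# a slow consumer always sees the newest sample instead of a growing backlog,
# which is what a controller coupling sensors and actuators usually wants.
import threading

class LatestChannel:
    def __init__(self):
        self._cond = threading.Condition()
        self._seq = 0
        self._msg = None

    def put(self, msg):
        with self._cond:
            self._msg = msg          # overwrite: old, unread data is dropped
            self._seq += 1
            self._cond.notify_all()

    def get_latest(self, last_seen, timeout=None):
        """Block until a message newer than `last_seen` exists; return (seq, msg)."""
        with self._cond:
            self._cond.wait_for(lambda: self._seq > last_seen, timeout)
            return self._seq, self._msg
```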

  3. A randomised comparison between an inexpensive, general-purpose headlight and a purpose-built surgical headlight on users' visual acuity and colour vision.

    Science.gov (United States)

    Street, I; Sayles, M; Nistor, M; McRae, A R

    2014-02-01

    To determine if there are any differences in near visual acuity and colour vision between an inexpensive general-purpose light emitting diode (LED) headlight and a purpose-built surgical LED headlight. A prospective study was conducted sequentially comparing near visual acuity and colour vision, the headlights being tested in random order, in a testing room with a constant minimal amount of background light. The participants were NHS employee volunteers, with self-declared normal (or corrected) vision, working in occupations requiring full literacy. For visual acuity, outcome was measured by recording the smallest font legible when using each headlight when the subject read a near visual acuity test card. For colour vision, the outcome was passing or failing the Ishihara test. There was no statistically significant difference between the general-purpose and the purpose-built headlights in users' near visual acuity or colour vision.

  4. Software for relativistic atomic structure theory: The grasp project at oxford

    International Nuclear Information System (INIS)

    Parpia, F.A.; Grant, I.P.

    1991-01-01

    GRASP is an acronym for General-purpose Relativistic Atomic Structure Program. The objective of the GRASP project at Oxford is to produce user-friendly state-of-the-art multiconfiguration Dirac-Fock (MCDF) software packages for relativistic atomic structure theory.

  5. General-purpose parallel simulator for quantum computing

    International Nuclear Information System (INIS)

    Niwa, Jumpei; Matsumoto, Keiji; Imai, Hiroshi

    2002-01-01

    With current technologies, it seems to be very difficult to implement quantum computers with many qubits. It is therefore of importance to simulate quantum algorithms and circuits on the existing computers. However, for a large-size problem, the simulation often requires more computational power than is available from sequential processing. Therefore, simulation methods for parallel processors are required. We have developed a general-purpose simulator for quantum algorithms/circuits on the parallel computer (Sun Enterprise4500). It can simulate algorithms/circuits with up to 30 qubits. In order to test efficiency of our proposed methods, we have simulated Shor's factorization algorithm and Grover's database search, and we have analyzed robustness of the corresponding quantum circuits in the presence of both decoherence and operational errors. The corresponding results, statistics, and analyses are presented in this paper
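
    The core of such a simulator is state-vector arithmetic of the kind sketched below (an illustrative NumPy fragment, not the authors' code): applying a single-qubit gate to an n-qubit register stored as a vector of 2^n complex amplitudes, which is why memory limits simulations to roughly 30 qubits.

```python
# Minimal state-vector sketch of what such a simulator does: apply a
# single-qubit gate to an n-qubit register stored as a 2**n complex vector.
# Memory grows as 2**n, which is why ~30 qubits is a practical ceiling.
import numpy as np

def apply_1q_gate(state, gate, target, n_qubits):
    """Apply a 2x2 unitary `gate` to qubit `target` of an n-qubit state vector."""
    psi = state.reshape([2] * n_qubits)
    psi = np.moveaxis(psi, target, 0)
    psi = np.tensordot(gate, psi, axes=([1], [0]))
    psi = np.moveaxis(psi, 0, target)
    return psi.reshape(-1)

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                   # |000>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = apply_1q_gate(state, hadamard, target=0, n_qubits=n)
print(np.abs(state)**2)                          # equal weight on |000> and |100>
```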

  6. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  7. REVEAL - A tool for rule driven analysis of safety critical software

    International Nuclear Information System (INIS)

    Miedl, H.; Kersken, M.

    1998-01-01

    As the determination of ultrahigh reliability figures for safety critical software is hardly possible, national and international guidelines and standards mainly give requirements for the qualitative evaluation of software. An analysis of whether all these requirements are fulfilled is time- and effort-consuming and prone to errors if performed manually by analysts, and should instead be delegated to tools as far as possible. There are many "general-purpose" software analysis tools, both static and dynamic, which help in analyzing source code. However, they are not designed to assess the adherence to specific requirements of guidelines and standards in the nuclear field. Against the background of the development of I and C systems in the nuclear field which are based on digital techniques and implemented in high level language, it is essential that the assessor or licenser has a tool with which he can automatically and uniformly qualify as many aspects as possible of the high level language software. For this purpose the software analysis tool REVEAL has been developed at ISTec and the Halden Reactor Project. (author)
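
    The flavor of a rule-driven check can be conveyed by the toy fragment below, which flags constructs often restricted by safety-critical coding guidelines; it uses Python's ast module purely for illustration and is not REVEAL, which targets other implementation languages.

```python
# Toy illustration (not REVEAL itself) of a rule-driven static check:
# flag constructs often restricted by safety-critical coding guidelines,
# here recursion and bare `except:` clauses, using Python's ast module.
import ast

RULES = {
    "no-bare-except": "bare 'except:' clauses are not allowed",
    "no-recursion": "directly recursive functions are not allowed",
}

def check_source(source):
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.ExceptHandler) and node.type is None:
            findings.append((node.lineno, RULES["no-bare-except"]))
        if isinstance(node, ast.FunctionDef):
            calls = [n.func.id for n in ast.walk(node)
                     if isinstance(n, ast.Call) and isinstance(n.func, ast.Name)]
            if node.name in calls:
                findings.append((node.lineno, RULES["no-recursion"]))
    return findings

code = "def f(n):\n    try:\n        return f(n - 1)\n    except:\n        return 0\n"
for line, message in check_source(code):
    print(f"line {line}: {message}")
```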

  8. TOUGH2 software qualification

    Energy Technology Data Exchange (ETDEWEB)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2.

  9. TOUGH2 software qualification

    International Nuclear Information System (INIS)

    Pruess, K.; Simmons, A.; Wu, Y.S.; Moridis, G.

    1996-02-01

    TOUGH2 is a numerical simulation code for multi-dimensional coupled fluid and heat flow of multiphase, multicomponent fluid mixtures in porous and fractured media. It belongs to the MULKOM ("MULti-KOMponent") family of codes and is a more general version of the TOUGH simulator. The MULKOM family of codes was originally developed with a focus on geothermal reservoir simulation. They are suited to modeling systems which contain different fluid mixtures, with applications to flow problems arising in the context of high-level nuclear waste isolation, oil and gas recovery and storage, and groundwater resource protection. TOUGH2 is essentially a subset of MULKOM, consisting of a selection of the better tested and documented MULKOM program modules. The purpose of this package of reports is to provide all software baseline documents necessary for the software qualification of TOUGH2

  10. 24 CFR 990.310 - Purpose-General policy on financial management, monitoring and reporting.

    Science.gov (United States)

    2010-04-01

    ... Management Systems, Monitoring, and Reporting § 990.310 Purpose—General policy on financial management, monitoring and reporting. All PHA financial management systems, reporting, and monitoring of program... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Purpose-General policy on financial...

  11. The architecture of Newton, a general-purpose dynamics simulator

    Science.gov (United States)

    Cremer, James F.; Stewart, A. James

    1989-01-01

    The architecture for Newton, a general-purpose system for simulating the dynamics of complex physical objects, is described. The system automatically formulates and analyzes equations of motion, and performs automatic modification of the system equations when necessitated by changes in kinematic relationships between objects. Impact and temporary contact are handled, although only using simple models. User-directed influence of simulations is achieved using Newton's module, which can be used to experiment with the control of many-degree-of-freedom articulated objects.

  12. Foam: A general purpose Monte Carlo cellular algorithm

    International Nuclear Information System (INIS)

    Jadach, S.

    2003-01-01

    A general-purpose, self-adapting Monte Carlo (MC) algorithm implemented in the program Foam is described. The high efficiency of the MC, that is, a small maximum weight or variance of the MC weight, is achieved by dividing the integration domain into small cells. The cells can be n-dimensional simplices or hyperrectangular cells. The next cell to be divided and the position/direction of the division hyperplane are chosen by the algorithm, which optimizes the ratio of the maximum weight to the average weight or (optionally) the total variance. The algorithm is able to deal, in principle, with an arbitrary pattern of the singularities in the distribution.
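
    A much simplified one-dimensional illustration of the cellular idea is sketched below (not the Foam code): the cell that dominates the variance is split repeatedly, and cells are then sampled in proportion to their estimated integrals, which reduces the spread of the MC weight.

```python
# Simplified 1D illustration (not the Foam code) of the cellular idea:
# repeatedly split the cell that dominates the variance, then sample cells
# in proportion to their estimated integrals.
import random

def explore(f, a, b, n=200):
    """Crude per-cell estimates of integral and variance by uniform sampling."""
    xs = [random.uniform(a, b) for _ in range(n)]
    vals = [f(x) for x in xs]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / n
    width = b - a
    return {"a": a, "b": b, "integral": mean * width, "var": var * width**2}

def build_cells(f, a, b, n_cells=16):
    cells = [explore(f, a, b)]
    while len(cells) < n_cells:
        worst = max(cells, key=lambda c: c["var"])   # cell driving the variance
        cells.remove(worst)
        mid = 0.5 * (worst["a"] + worst["b"])
        cells += [explore(f, worst["a"], mid), explore(f, mid, worst["b"])]
    return cells

def cellular_integral(f, cells, n=20000):
    total = sum(c["integral"] for c in cells)
    estimate = 0.0
    for c in cells:
        n_c = max(1, int(round(n * c["integral"] / total)))   # proportional sampling
        s = sum(f(random.uniform(c["a"], c["b"])) for _ in range(n_c))
        estimate += (c["b"] - c["a"]) * s / n_c
    return estimate

f = lambda x: 1.0 / (1e-3 + x * x)      # sharply peaked near x = 0
cells = build_cells(f, 0.0, 1.0)
print(cellular_integral(f, cells))      # compare with the analytic ~48.7
```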

  13. Using general-purpose compression algorithms for music analysis

    DEFF Research Database (Denmark)

    Louboutin, Corentin; Meredith, David

    2016-01-01

    General-purpose compression algorithms encode files as dictionaries of substrings with the positions of these strings’ occurrences. We hypothesized that such algorithms could be used for pattern discovery in music. We compared LZ77, LZ78, Burrows–Wheeler and COSIATEC on classifying folk song...... in the input data, COSIATEC outperformed LZ77 with a mean F1 score of 0.123, compared with 0.053 for LZ77. However, when the music was processed a voice at a time, the F1 score for LZ77 more than doubled to 0.124. We also discovered a significant correlation between compression factor and F1 score for all...
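
    The kind of measure involved can be illustrated with the toy LZ77-style factorisation below, applied to a melody encoded as pitch intervals; it is a sketch of the general technique, not the implementations compared in the study.

```python
# Toy LZ77-style factorisation of a symbol sequence (e.g. a melody encoded
# as pitch intervals), illustrating the kind of compression factor that the
# study correlates with classification performance.
def lz77_factors(seq):
    """Greedy factorisation into (offset, length, next_symbol) triples."""
    factors, i = [], 0
    while i < len(seq):
        best_len, best_off = 0, 0
        for j in range(max(0, i - 255), i):          # bounded search window
            k = 0
            while i + k < len(seq) - 1 and seq[j + k] == seq[i + k]:
                k += 1
            if k > best_len:
                best_len, best_off = k, i - j
        nxt = seq[i + best_len]
        factors.append((best_off, best_len, nxt))
        i += best_len + 1
    return factors

melody = [2, 2, -4, 2, 2, -4, 2, 2, -4, 1, 2, 2, -4]   # repeated figure
factors = lz77_factors(melody)
print(factors)
print("compression factor:", len(melody) / len(factors))
```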

  14. Software for FASTBUS and Motorola 68K based readout controllers for data acquisition

    International Nuclear Information System (INIS)

    Pordes, R.; Bernett, M.; Dorries, T.; Haire, M.; Moore, C.; Oleynik, G.; Votava, M.

    1989-01-01

    Many High Energy Physics experiments at Fermilab are now including FASTBUS front-ends in their data acquisition systems. The requirements on controllers to read out and control these FASTBUS systems are increasing in complexity and speed. The Data Acquisition Software Group has designed general software for front-end 68K processor boards housed in FASTBUS or VME to meet these needs. The first implementation has been developed for the General Purpose FASTBUS Master (GPM). This software is being ported to the FASTBUS Smart Crate Controller under development at Fermilab. The software is designed, using structured analysis tools and coding in C, to be easily portable in the future to new processor boards. As part of their extended support for FASTBUS, they have enhanced their software for the intelligent LeCroy 1821 FASTBUS interface and implemented the FASTBUS standard routines for the VAX/VMS operating system.

  15. Application of a general purpose finite element program system in pressure vessel technology

    International Nuclear Information System (INIS)

    Aamodt, B.; Sandsmark, N.; Medonos, S.

    1977-01-01

    The main advantages of using general purpose finite element program systems in structural analysis are summarized. Several illustrative applications of the program system SESAM-69 to pressure vessel problems are described. The first example is a dynamic analysis of the motor housing of the internal main circulation pump of a BWR nuclear reactor. The next example is a transient heat conduction and stress analysis of the deflector of a feeding nozzle of a PWR nuclear reactor. Then, numerical calculations of stress intensity factors and fatigue crack growth of semi-elliptical surface cracks are discussed. Finally, an elasto-plastic analysis of a thick plate with edge cracks is considered. It is concluded that, because general purpose finite element program systems are general and user-orientated, they will gain increasing popularity in the years ahead.

  16. Dependability modeling and assessment in UML-based software development.

    Science.gov (United States)

    Bernardi, Simona; Merseguer, José; Petriu, Dorina C

    2012-01-01

    Assessment of software nonfunctional properties (NFP) is an important problem in software development. In the context of model-driven development, an emerging approach for the analysis of different NFPs consists of the following steps: (a) to extend the software models with annotations describing the NFP of interest; (b) to transform automatically the annotated software model to the formalism chosen for NFP analysis; (c) to analyze the formal model using existing solvers; (d) to assess the software based on the results and give feedback to designers. Such a modeling→analysis→assessment approach can be applied to any software modeling language, be it general purpose or domain specific. In this paper, we focus on UML-based development and on the dependability NFP, which encompasses reliability, availability, safety, integrity, and maintainability. The paper presents the profile used to extend UML with dependability information, the model transformation to generate a DSPN formal model, and the assessment of the system properties based on the DSPN results.

  17. ProtDCal: A program to compute general-purpose-numerical descriptors for sequences and 3D-structures of proteins.

    Science.gov (United States)

    Ruiz-Blanco, Yasser B; Paz, Waldo; Green, James; Marrero-Ponce, Yovani

    2015-05-16

    The software is intended to provide a useful tool for general-purpose encoding of protein sequences and structures for applications in protein classification, similarity analyses and function prediction.

  18. General-purpose heat source development. Phase II: conceptual designs

    International Nuclear Information System (INIS)

    Snow, E.C.; Zocher, R.W.; Grinberg, I.M.; Hulbert, L.E.

    1978-11-01

    Basic geometric module shapes and fuel arrays were studied to determine how well they could be expected to meet the General Purpose Heat Source (GPHS) design requirements. Seven conceptual designs were selected, detailed drawings produced, and these seven concepts analyzed. Three of these design concepts were selected as GPHS Trial Designs to be reanalyzed in more detail and tested. The geometric studies leading to the selection of the seven conceptual designs, the analyses of these designs, and the selection of the three trial designs are discussed

  19. Space shuttle general purpose computers (GPCs) (current and future versions)

    Science.gov (United States)

    1988-01-01

    Current and future versions of general purpose computers (GPCs) for space shuttle orbiters are represented in this frame. The two boxes on the left (AP101B) represent the current GPC configuration, with the input-output processor at far left and the central processing unit (CPU) at its side. The upgraded version combines both elements in a single unit (far right, AP101S).

  20. ABAQUS-EPGEN: a general-purpose finite-element code. Volume 1. User's manual

    International Nuclear Information System (INIS)

    Hibbitt, H.D.; Karlsson, B.I.; Sorensen, E.P.

    1982-10-01

    This document is the User's Manual for ABAQUS/EPGEN, a general purpose finite element computer program, designed specifically to serve advanced structural analysis needs. The program contains very general libraries of elements, materials and analysis procedures, and is highly modular, so that complex combinations of features can be put together to model physical problems. The program is aimed at production analysis needs, and for this purpose aspects such as ease-of-use, reliability, flexibility and efficiency have received maximum attention. The input language is designed to make it straightforward to describe complicated models; the analysis procedures are highly automated with the program choosing time or load increments based on user supplied tolerances and controls; and the program offers a wide range of post-processing options for display of the analysis results

  1. Catalog of physical protection equipment. Book 3: Volume VII. General purpose display components

    International Nuclear Information System (INIS)

    1977-06-01

    A catalog of commercially available physical protection equipment has been prepared under MITRE contract AT(49-24)-0376 for use by the U. S. Nuclear Regulatory Commission (NRC). Included is information on barrier structures and equipment, interior and exterior intrusion detection sensors, entry (access) control devices, surveillance and alarm assessment equipment, contraband detection sensors, automated response equipment, general purpose displays and general purpose communications, with one volume devoted to each of these eight areas. For each item of equipment the information included consists of performance, physical, cost and supply/logistics data. The entire catalog is contained in three notebooks for ease in its use by licensing and inspection staff at NRC

  2. An Aerodynamic Database for the Mk 82 General Purpose Low Drag Bomb

    National Research Council Canada - National Science Library

    Krishnamoorthy, L

    1997-01-01

    The drag database of the Mk 82 General Purpose Low Drag bomb, the primary gravity weapon in the RAAF inventory, has some shortcomings in the quality and traceability of data, and in the variations due...

  3. Introductory Molecular Orbital Theory: An Honors General Chemistry Computational Lab as Implemented Using Three-Dimensional Modeling Software

    Science.gov (United States)

    Ruddick, Kristie R.; Parrill, Abby L.; Petersen, Richard L.

    2012-01-01

    In this study, a computational molecular orbital theory experiment was implemented in a first-semester honors general chemistry course. Students used the GAMESS (General Atomic and Molecular Electronic Structure System) quantum mechanical software (as implemented in ChemBio3D) to optimize the geometry for various small molecules. Extended Huckel…

  4. A New Effort for Atmospherical Forecast: Meteorological Image Processing Software (MIPS) for Astronomical Observations

    Science.gov (United States)

    Shameoni Niaei, M.; Kilic, Y.; Yildiran, B. E.; Yüzlükoglu, F.; Yesilyaprak, C.

    2016-12-01

    We describe new software (MIPS) for the analysis and image processing of meteorological satellite (Meteosat) data for an astronomical observatory. This software helps produce atmospheric forecasts (cloud, humidity, rain) from Meteosat data for robotic telescopes. MIPS uses a Python library for EUMETSAT data that aims to be completely open source and is licensed under the GNU General Public Licence (GPL). MIPS is platform independent and uses h5py, numpy, and PIL with the general-purpose, high-level programming language Python and the Qt framework.
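
    The abstract names h5py and numpy as the building blocks for handling the satellite data. The following is a minimal sketch, not part of MIPS, of the kind of processing such a tool performs: the file name, dataset path, and brightness-temperature threshold are illustrative assumptions only.

        # Minimal sketch (not MIPS itself): estimate a crude cloudiness fraction
        # from one Meteosat-like channel stored in an HDF5 file. The file name,
        # dataset path and threshold are illustrative assumptions.
        import h5py
        import numpy as np

        def cloud_fraction(path, dataset="IR_108", threshold=260.0):
            """Fraction of pixels whose brightness temperature (K) falls below
            a fixed threshold -- a very rough proxy for cloud cover."""
            with h5py.File(path, "r") as f:
                bt = np.asarray(f[dataset], dtype=float)
            valid = np.isfinite(bt)
            return float(np.mean(bt[valid] < threshold))

        if __name__ == "__main__":
            print(f"cloud fraction: {cloud_fraction('meteosat_scene.h5'):.2f}")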

  5. Computer-assisted analyses of (/sup 14/C)2-DG autoradiographs employing a general purpose image processing system

    Energy Technology Data Exchange (ETDEWEB)

    Porro, C; Biral, G P [Modena Univ. (Italy). Ist. di Fisiologia Umana; Fonda, S; Baraldi, P [Modena Univ. (Italy). Lab. di Bioingegneria della Clinica Oculistica; Cavazzuti, M [Modena Univ. (Italy). Clinica Neurologica

    1984-09-01

    A general-purpose image processing system is described, including a B/W TV camera, a high-resolution image processor and display system (TESAK VDC 501), a computer (DEC PDP 11/23), and monochrome and color monitors. Images may be acquired from a microscope equipped with a TV camera or with the TV camera in direct viewing; the A/D converter and the image processor provide fast (40 ms) and precise (512x512 data points) digitization of the TV signal with a maximum resolution of 256 gray levels. Computer programs, written in FORTRAN and MACRO-11 assembly language, have been developed to perform qualitative and quantitative analyses of autoradiographs obtained with the 2-DG method. They include: (1) procedures designed to recognize errors in acquisition due to possible image shading and correct them via software; (2) routines suitable for qualitative analyses of the whole image or selected regions of it, providing the opportunity for pseudocolor coding, statistics, and graphic overlays; (3) programs permitting the conversion of gray levels into metabolic rates of glucose utilization and the display of gray- or color-coded metabolic maps.
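
    Item (3), converting gray levels to metabolic rates, is typically done through a calibration curve measured from co-exposed radioactive standards. The sketch below illustrates that step only; the original programs were written in FORTRAN and MACRO-11, and the calibration values here are made up for illustration.

        # Illustrative sketch only: convert 8-bit autoradiograph gray levels to
        # metabolic rates through a calibration curve from co-exposed standards.
        import numpy as np

        # Hypothetical calibration: gray level of each standard and its known
        # metabolic rate (umol/100 g/min). The numbers are invented.
        std_gray = np.array([30, 60, 100, 150, 200, 240], dtype=float)
        std_rate = np.array([5.0, 12.0, 25.0, 45.0, 70.0, 95.0])

        def gray_to_rate(image_u8):
            """Map every pixel's gray level to a metabolic rate by monotone
            interpolation along the calibration curve."""
            return np.interp(image_u8.astype(float), std_gray, std_rate)

        # Example: a synthetic 512x512 "autoradiograph"
        image = np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)
        rates = gray_to_rate(image)
        print(rates.min(), rates.max())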

  6. 15 CFR 744.17 - Restrictions on certain exports and reexports of general purpose microprocessors for “military...

    Science.gov (United States)

    2010-01-01

    ... reexports of general purpose microprocessors for “military end-uses” and to “military end-users.” 744.17...: END-USER AND END-USE BASED § 744.17 Restrictions on certain exports and reexports of general purpose microprocessors for “military end-uses” and to “military end-users.” (a) General prohibition. In addition to the...

  7. An FPGA- Based General-Purpose Data Acquisition Controller

    Science.gov (United States)

    Robson, C. C. W.; Bousselham, A.; Bohm

    2006-08-01

    System development in advanced FPGAs allows considerable flexibility, both during development and in production use. A mixed firmware/software solution allows the developer to choose what shall be done in firmware or software, and to make that decision late in the process. However, this flexibility comes at the cost of increased complexity. We have designed a modular development framework to help overcome this increased complexity. This framework comprises a generic controller that can be adapted for different systems by simply changing the software or firmware parts. The controller can use both soft and hard processors, with or without an RTOS, based on the demands of the system to be developed. The resulting system uses the Internet for both control and data acquisition. In our studies we developed the embedded system in a Xilinx Virtex-II Pro FPGA, using both PowerPC and MicroBlaze cores, HTTP, Java, and LabVIEW for control and communication, together with the MicroC/OS-II and OSE operating systems.

  8. BEANS - a software package for distributed Big Data analysis

    Science.gov (United States)

    Hypki, Arkadiusz

    2018-03-01

    BEANS is a new web-based tool, easy to install and maintain, to store and analyse massive amounts of data in a distributed way. It provides a clear interface for querying, filtering, aggregating, and plotting data from an arbitrary number of datasets. Its main purpose is to simplify the process of storing, examining and finding new relations in huge datasets. The software is an answer to a growing need of the astronomical community for a versatile tool to store, analyse and compare complex astrophysical numerical simulations with observations (e.g. simulations of the Galaxy or star clusters with the Gaia archive). However, this software was built in a general form and is ready to use in any other research field. It can be used as a building block for other open source software too.

  9. Power performance of the general-purpose heat source radioisotope thermoelectric generator

    International Nuclear Information System (INIS)

    Bennett, G.L.; Lombardo, J.J.; Rock, B.J.

    1986-01-01

    The General-Purpose Heat Source Radioisotope Thermoelectric Generator (GPHS-RTG) has been developed under the sponsorship of the Department of Energy (DOE) to provide electrical power for the National Aeronautics and Space Administration (NASA) Galileo mission to Jupiter and the joint NASA/European Space Agency (ESA) Ulysses mission to study the polar regions of the Sun. A total of five nuclear-heated generators and one electrically heated generator have been built and tested, proving out the design concept and meeting the specification requirements. The GPHS-RTG design is built upon the successful technology used in the RTGs flown on the two NASA Voyager spacecraft and two US Air Force communications satellites. The GPHS-RTG converts about 4400 W(t) from the nuclear heat source into at least 285 W(e) at the beginning of mission (BOM). The GPHS-RTG consists of two major components: the General-Purpose Heat Source (GPHS) and the Converter. A conceptual drawing of the GPHS-RTG is presented and its design and performance are described.

  10. General-purpose event generators for LHC physics

    CERN Document Server

    Buckley, Andy; Gieseke, Stefan; Grellscheid, David; Hoche, Stefan; Hoeth, Hendrik; Krauss, Frank; Lonnblad, Leif; Nurse, Emily; Richardson, Peter; Schumann, Steffen; Seymour, Michael H.; Sjostrand, Torbjorn; Skands, Peter; Webber, Bryan

    2011-01-01

    We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the ARIADNE, Herwig++, PYTHIA 8 and SHERPA generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators ...

  11. General purpose heat source task group. Final report

    International Nuclear Information System (INIS)

    1979-01-01

    The results of thermal analyses and impact tests on a modified design of a 238 Pu-fueled general purpose heat source (GPHS) for spacecraft power supplies are presented. This work was performed to establish the safety of a heat source with pyrolytic graphite insulator shells located either inside or outside the graphite impact shell. This safety depends on the degree of aerodynamic heating of the heat source during reentry and on the ability of the heat source capsule to withstand impact after reentry. Analysis of wind tunnel and impact test data results in a recommended GPHS design which should meet all temperature and safety requirements. Further wind tunnel tests, drop tests, and impact tests are recommended to verify the safety of this design.

  12. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel; Tekinerdogan, B.; van den Broek, P.M.; Saeki, M.; Hruby, P.; Sunye, G.

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software

  13. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Frohner, A´ kos; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software

  14. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer incidence continues to grow, the cancer registry is of great importance as the core of cancer control programs, and many different software packages have been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria for cancer registry software were determined by studying the relevant documents and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the validation results, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, once the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen for this study. The model encompasses the various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation of general criteria from specific ones while keeping the set of criteria comprehensive. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.

  15. An object-oriented software interface to VME

    International Nuclear Information System (INIS)

    Thomas, Timothy L; Gottlieb, Eric; Gold, Michael

    1996-01-01

    In the next millennium, data acquisition tasks for high energy physics will increasingly rely on distributed processing and the VME bus. To provide transparent, general-purpose access to VME hardware modules through a VME-embedded processor, we have created a simple, portable, easily configured object-oriented interface to the VME bus. This software is particularly well-suited for hardware development, providing rapid engineering level access to the VME interface of prototype modules. (author)

  16. [Violence for educational purpose: Representations of general practitioners in the Paris area, France. A qualitative study].

    Science.gov (United States)

    de Brie, Claire; Piet, Emmanuelle; Chariot, Patrick

    2018-03-01

    Violence for educational purpose refers to a modality of education that includes threats, verbal abuse, physical abuse and humiliation. Twenty European countries, not including France, have abolished corporal punishment through explicit laws and regulations. The position of general practitioners in the screening and care of violence for educational purpose in France is unknown. In this study, we aimed to assess general practitioners' representations of this form of violence. We performed semi-structured interviews of general practitioners in the Paris region of France (Île-de-France). Interviews were conducted until data saturation was achieved. Interviews were recorded, transcribed and analysed by two investigators. Interviews were conducted with 20 physicians (November 2015-January 2016). General practitioners considered that physical, verbal or psychological abuse had possible negative consequences for children. Uncertainty regarding the consequences of violence was a cause of tolerance towards violence for educational purpose, depending on the act committed and the context, as perceived by nearly all practitioners. General practitioners expressed interest in the field. They cited their own education and experience as the main obstacles to action. Most of them expressed a feeling of failure when they screened for or took care of violence for educational purpose. This study suggests that doctors can participate in supporting parents in the prevention of violence for educational purpose. Supporting parents would require specific medical training as well as societal change. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  17. Analytical exploration of the thermodynamic potentials by using symbolic computation software

    International Nuclear Information System (INIS)

    Hantsaridou, Anastasia P; Polatoglou, Hariton M

    2005-01-01

    Thermodynamics is a very general theory, based on fundamental symmetries. It generalizes classical mechanics and incorporates theoretical concepts such as field and field equations. Although all these ingredients are of the highest importance for a scientist, they are not given the attention they perhaps deserve in most undergraduate courses. Nowadays, powerful computers in conjunction with equally powerful software can ease the exploration of the crucial ideas of thermodynamics. The purpose of the present work is to show how the utilization of symbolic computation software can lead to a complementary understanding of thermodynamics. The method was applied to first and second year physics students in the Aristotle University of Thessaloniki (Greece) during the 2002-2003 academic year. The results indicate that symbolic computation software is appropriate not only for enhancing the teaching of the fundamental principles in thermodynamics and their applications, but also for increasing students' motivation for learning
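
    To make the approach concrete, the sketch below shows the kind of symbolic-computation exercise described, using SymPy (not necessarily the software used in the course). The Helmholtz free energy chosen is a toy, ideal-gas-like potential, used only so that the Maxwell relation can be verified symbolically.

        # A small symbolic exercise of the kind described, using SymPy.
        # The Helmholtz free energy below is a toy potential chosen only
        # for illustration.
        import sympy as sp

        T, V, n, R, c_v, F0 = sp.symbols("T V n R c_v F0", positive=True)

        # Toy Helmholtz free energy F(T, V)
        F = -n*R*T*sp.log(V) - n*c_v*T*(sp.log(T) - 1) + F0

        S = -sp.diff(F, T)          # entropy
        P = -sp.diff(F, V)          # pressure (recovers P = nRT/V)

        # Maxwell relation (dS/dV)_T = (dP/dT)_V should hold identically
        maxwell = sp.simplify(sp.diff(S, V) - sp.diff(P, T))
        print(P, S, maxwell)        # maxwell prints 0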

  18. Foam: A general purpose Monte Carlo cellular algorithm

    International Nuclear Information System (INIS)

    Jadach, S.

    2002-01-01

    A general-purpose, self-adapting Monte Carlo (MC) algorithm implemented in the program Foam is described. The high efficiency of the MC, that is, a small maximum weight or variance of the MC weight, is achieved by means of dividing the integration domain into small cells. The cells can be n-dimensional simplices, hyperrectangles or a Cartesian product of them. The grid of cells, called 'foam', is produced in the process of the binary split of the cells. The choice of the next cell to be divided and the position/direction of the division hyperplane is driven by the algorithm which optimizes the ratio of the maximum weight to the average weight or (optionally) the total variance. The algorithm is able to deal, in principle, with an arbitrary pattern of the singularities in the distribution. (author)
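
    The following is a drastically simplified sketch of the cellular idea (hyperrectangular cells only, splitting driven by per-cell variance). It is not the Foam algorithm or its code; all parameters and the test integrand are illustrative assumptions.

        # Simplified cellular MC integrator: split the worst cell repeatedly,
        # then distribute the final sampling effort over the resulting cells.
        import numpy as np

        rng = np.random.default_rng(0)

        def cell_stats(f, lo, hi, n=200):
            """Uniform MC estimate of a cell's integral and its variance."""
            x = rng.uniform(lo, hi, size=(n, len(lo)))
            vol = np.prod(hi - lo)
            fx = np.array([f(xi) for xi in x])
            return vol * fx.mean(), vol**2 * fx.var(ddof=1) / n

        def foam_like_integrate(f, lo, hi, n_cells=64, n_final=2000):
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            cells = [(lo, hi, *cell_stats(f, lo, hi))]
            # Exploration: repeatedly split the cell with the largest variance
            while len(cells) < n_cells:
                i = max(range(len(cells)), key=lambda k: cells[k][3])
                clo, chi, _, _ = cells.pop(i)
                d = int(np.argmax(chi - clo))          # split longest edge in half
                mid = 0.5 * (clo[d] + chi[d])
                for a, b in ((clo[d], mid), (mid, chi[d])):
                    nlo, nhi = clo.copy(), chi.copy()
                    nlo[d], nhi[d] = a, b
                    cells.append((nlo, nhi, *cell_stats(f, nlo, nhi)))
            # Final stage: spend points proportionally to each cell's estimated
            # contribution and sum the per-cell estimates.
            weights = np.array([max(c[2], 0.0) + 1e-12 for c in cells])
            weights /= weights.sum()
            total = 0.0
            for (clo, chi, _, _), w in zip(cells, weights):
                n = max(10, int(round(w * n_final)))
                total += cell_stats(f, clo, chi, n)[0]
            return total

        # Example: a peaked integrand over the unit square
        f = lambda x: 1.0 / (0.01 + (x[0] - 0.3)**2 + (x[1] - 0.7)**2)
        print(foam_like_integrate(f, [0, 0], [1, 1]))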

  19. Foam A General purpose Monte Carlo Cellular Algorithm

    CERN Document Server

    Jadach, Stanislaw

    2002-01-01

    A general-purpose, self-adapting Monte Carlo (MC) algorithm implemented in the program {\\tt Foam} is described. The high efficiency of the MC, that is small maximum weight or variance of the MC weight is achieved by means of dividing the integration domain into small cells. The cells can be $n$-dimensional simplices, hyperrectangles or a Cartesian product of them. The grid of cells, ``foam'', is produced in the process of the binary split of the cells. The next cell to be divided and the position/direction of the division hyperplane is chosen by the algorithm which optimizes the ratio of the maximum weight to the average weight or (optionally) the total variance. The algorithm is able to deal, in principle, with an arbitrary pattern of the singularities in the distribution.

  20. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety critical software in the I and C systems of nuclear power plants requires high functional integrity and reliability. To achieve these goals, the safety critical software should be verified and tested according to related codes and standards through verification and validation (V and V) activities. Safety critical software testing is performed at various stages during the development of the software, and is generally classified into three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance costs, it is important that software testing is carried out at module level. Module testing of nuclear safety critical software has rarely been performed with formal, proven testing tools because of various constraints. The LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V and V of general purpose software and safety critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency.

  1. Software Acquisition and Software Engineering Best Practices

    National Research Council Canada - National Science Library

    Eslinger, S

    1999-01-01

    The purpose of this white paper is to address the issues raised in the recently published Senate Armed Services Committee Report 106-50 concerning Software Management Improvements for the Department of Defense (DoD...

  2. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    This article explores the purpose of the use of generalised audit software (GAS) as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through interviews, questionnaires, and tests of controls on a sample basis is long overdue; in the present technological, data-driven era, such practice will soon render an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes but that its frequency of use is not yet optimal and that there is still much room for improvement for tests of controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) to conduct full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or of specific events; and (5) to obtain audit evidence about control effectiveness.
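
    The five uses listed map naturally onto ordinary data-analytics code. The sketch below is not GAS itself, only an analogous illustration in pandas; the table and column names ('account', 'amount', 'approved_by', 'posted_by') and the balance limit are hypothetical.

        # Not GAS itself: a pandas sketch of the full-population tests listed
        # above, on a hypothetical transactions table.
        import pandas as pd

        def gas_style_tests(tx: pd.DataFrame, limit: float = 100_000.0):
            results = {}
            # (1) transactions breaching a specific control criterion:
            #     the poster approved their own entry
            results["self_approved"] = tx[tx["approved_by"] == tx["posted_by"]]
            # (2) full-population analysis rather than sampling
            results["population_summary"] = tx["amount"].describe()
            # (3) account balances over a certain amount
            results["large_balances"] = (
                tx.groupby("account")["amount"].sum().loc[lambda s: s > limit]
            )
            # (4) frequency of occurrence of a risk event
            results["self_approval_rate"] = (
                (tx["approved_by"] == tx["posted_by"]).mean()
            )
            return results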

  3. A VMEbus general-purpose data acquisition system

    International Nuclear Information System (INIS)

    Ninane, A.; Nemry, M.; Martou, J.L.; Somers, F.

    1992-01-01

    We present a general-purpose, VMEbus based, multiprocessor data acquisition and monitoring system. Events, handled by a master CPU, are kept at the disposal of data storage and monitoring processes which can run on distinct processors. They access either the complete set of data or a fraction of it, minimizing the acquisition dead-time. The system is built with the VxWorks 5.0 real time kernel, to which we have added device drivers for data acquisition and monitoring. The acquisition is controlled and the data are displayed on a workstation. The user interface is written in C++ and reuses the classes of the InterViews and NIH libraries. The communication between the control workstation and the VMEbus processors is made through SUN RPCs over an Ethernet link. The system will be used for CAMAC-based data acquisition in nuclear physics experiments as well as for VXI data taking with the 4π configuration (100 neutron detectors) of the Brussels-Caen-Louvain-Strasbourg DEMON collaboration. (author)

  4. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data down to the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  5. 48 CFR 252.227-7014 - Rights in noncommercial computer software and noncommercial computer software documentation.

    Science.gov (United States)

    2010-10-01

    ...) Restricted rights in computer software, limited rights in technical data, or government purpose license... necessary to perfect a license or licenses in the deliverable software or documentation of the appropriate... the license rights obtained. (e) Identification and delivery of computer software and computer...

  6. Installation of new Generation General Purpose Computer (GPC) compact unit

    Science.gov (United States)

    1991-01-01

    In the Kennedy Space Center's (KSC's) Orbiter Processing Facility (OPF) high bay 2, Spacecraft Electronics technician Ed Carter (right), wearing clean suit, prepares for (26864) and installs (26865) the new Generation General Purpose Computer (GPC) compact IBM unit in Atlantis', Orbiter Vehicle (OV) 104's, middeck avionics bay as Orbiter Systems Quality Control technician Doug Snider looks on. Both men work for NASA contractor Lockheed Space Operations Company. All three orbiters are being outfitted with the compact IBM unit, which replaces a two-unit earlier generation computer.

  7. Detectability of T1a lung cancer on digital chest radiographs: an observer-performance comparison among 2-megapixel general-purpose, 2-megapixel medical-purpose, and 3-megapixel medical-purpose liquid-crystal display (LCD) monitors.

    Science.gov (United States)

    Yabuuchi, Hidetake; Matsuo, Yoshio; Kamitani, Takeshi; Jinnnouchi, Mikako; Yonezawa, Masato; Yamasaki, Yuzo; Nagao, Michinobu; Kawanami, Satoshi; Okamoto, Tatsuro; Sasaki, Masayuki; Honda, Hiroshi

    2015-08-01

    There has been no comparison of the detectability of small lung cancers between general-purpose and medical-grade LCD monitors, nor between solid and part-solid nodules. The aim was to compare the detectability of T1a lung cancer on chest radiographs on three LCD monitor types: 2-megapixel (MP) general purpose (General), 2-MP medical purpose (Medical), and 3-MP medical purpose. Radiographs from forty patients with T1aN0M0 primary lung cancer (27 solid nodules, 13 part-solid nodules) and 60 patients with no abnormalities on either chest X-ray or computed tomography (CT) were consecutively collected. Five readers assessed 100 cases on each monitor. The observations were analyzed using receiver operating characteristic (ROC) analysis; a jackknife method was used for statistical analysis. The average AUC for the detection of all nodules using the 2-MP-General, 2-MP-Medical, and 3-MP-Medical LCD monitors was 0.86, 0.89, and 0.89, respectively; there were no significant differences among them. The average AUC for part-solid nodule detection using the 2-MP-General, 2-MP-Medical, and 3-MP-Medical LCD monitors was 0.77, 0.86, and 0.89, respectively. There were significant differences between the 2-MP-General and 2-MP-Medical LCD monitors (P = 0.043) and between the 2-MP-General and 3-MP-Medical LCD monitors (P = 0.027). There was no significant difference between the 2-MP-Medical and 3-MP-Medical LCD monitors. The average AUC for solid nodule detection using the 2-MP-General, 2-MP-Medical, and 3-MP-Medical LCD monitors was 0.90, 0.90, and 0.88, respectively; there were no significant differences among them. The mean AUC values for all and part-solid nodules for the low-experienced readers were significantly lower than those for the high-experienced readers with the 2-MP general-purpose color LCD monitor. Detectability using the general-purpose LCD monitor was significantly lower than that using the medical-purpose LCD monitors. © The Foundation Acta Radiologica 2014.
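
    The statistics named in the abstract (nonparametric AUC with a jackknife error estimate) can be sketched as follows. This is a generic illustration, not the study's actual analysis code, and the reader scores are synthetic.

        # Nonparametric (Mann-Whitney) AUC for one reader/monitor combination,
        # with a case-level jackknife standard error. Scores are synthetic.
        import numpy as np

        def auc(pos, neg):
            """P(score_pos > score_neg) + 0.5 * P(tie)."""
            pos, neg = np.asarray(pos, float), np.asarray(neg, float)
            diff = pos[:, None] - neg[None, :]
            return (diff > 0).mean() + 0.5 * (diff == 0).mean()

        def jackknife_se(pos, neg):
            n = len(pos) + len(neg)
            thetas = [auc(np.delete(pos, i), neg) for i in range(len(pos))]
            thetas += [auc(pos, np.delete(neg, j)) for j in range(len(neg))]
            thetas = np.array(thetas)
            return np.sqrt((n - 1) / n * np.sum((thetas - thetas.mean())**2))

        rng = np.random.default_rng(1)
        cancer_scores = rng.normal(2.0, 1.0, size=40)   # 40 cancer cases
        normal_scores = rng.normal(0.0, 1.0, size=60)   # 60 normal cases
        print(auc(cancer_scores, normal_scores),
              jackknife_se(cancer_scores, normal_scores))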

  8. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden "backdoors" is crucial to a project's success. In this context, "authentication" is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects.

  9. Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring

    Directory of Open Access Journals (Sweden)

    Charles R. Farrar

    2006-01-01

    The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.

  10. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    Science.gov (United States)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  11. Apple-CORE: Microgrids of SVP cores: flexible, general-purpose, fine-grained hardware concurrency management

    NARCIS (Netherlands)

    Poss, R.; Lankamp, M.; Yang, Q.; Fu, J.; van Tol, M.W.; Jesshope, C.; Nair, S.

    2012-01-01

    To harness the potential of CMPs for scalable, energy-efficient performance in general-purpose computers, the Apple-CORE project has co-designed a general machine model and concurrency control interface with dedicated hardware support for concurrency control across multiple cores. Its SVP interface

  12. Methods to model-check parallel systems software

    International Nuclear Information System (INIS)

    Matlin, O. S.; McCune, W.; Lusk, E.

    2003-01-01

    We report on an effort to develop methodologies for formal verification of parts of the Multi-Purpose Daemon (MPD) parallel process management system. MPD is a distributed collection of communicating processes. While the individual components of the collection execute simple algorithms, their interaction leads to unexpected errors that are difficult to uncover by conventional means. Two verification approaches are discussed here: the standard model checking approach using the software model checker SPIN and the nonstandard use of a general-purpose first-order resolution-style theorem prover OTTER to conduct the traditional state space exploration. We compare modeling methodology and analyze performance and scalability of the two methods with respect to verification of MPD

  13. Experiments with general purpose visualization software on a unix workstation

    International Nuclear Information System (INIS)

    Adam, G.

    1995-10-01

    A study was performed on the opportunity of buying, for ICTP use, one of the following visualization systems: Advanced Visualization Systems (AVS), release 5.02; IRIS Explorer, release 2.2, from NAG; IBM Data Explorer (DX), release 2.1.5; Khoros, Developer's Release 2.0+p2. Criteria for an optimal choice were defined, and it was concluded that none of these visualization systems would be a good compromise today. Conservative consideration of the market opportunities shows that substantially improved releases of these systems are expected to be operational within at most a year. For the short term, the benefit-to-burden ratio still makes public-domain low-end graphics attractive. (author)

  14. Experiments with general purpose visualization software on a unix workstation

    Energy Technology Data Exchange (ETDEWEB)

    Adam, G

    1995-10-01

    A study was performed on the opportunity of buying, for ICTP use, one of the following visualization systems: Advanced Visualization Systems (AVS), release 5.02; IRIS Explorer, release 2.2, from NAG; IBM Data Explorer (DX), release 2.1.5; Khoros, Developer's Release 2.0+p2. Criteria for an optimal choice were defined, and it was concluded that none of these visualization systems would be a good compromise today. Conservative consideration of the market opportunities shows that substantially improved releases of these systems are expected to be operational within at most a year. For the short term, the benefit-to-burden ratio still makes public-domain low-end graphics attractive. (author).

  15. Ground control station software design for micro aerial vehicles

    Science.gov (United States)

    Walendziuk, Wojciech; Oldziej, Daniel; Binczyk, Dawid Przemyslaw; Slowik, Maciej

    2017-08-01

    This article describes the process of designing the equipment part and the software of a ground control station used for configuring and operating micro unmanned aerial vehicles (UAV). All work was conducted on a quadrocopter model, a commonly available commercial construction. This article contains the characteristics of the research object and the basics of operating micro aerial vehicles (MAV), and presents the components of the ground control station model. It also describes the communication standards used for building a model of the station. A further part of the work concerns the software of the product, the GIMSO application (Generally Interactive Station for Mobile Objects), which enables the user to manage the actions and the communication and control processes of the UAV. The process of creating the software and the field tests of a station model are also presented in the article.

  16. General-purpose readout electronics for white neutron source at China Spallation Neutron Source.

    Science.gov (United States)

    Wang, Q; Cao, P; Qi, X; Yu, T; Ji, X; Xie, L; An, Q

    2018-01-01

    The under-construction White Neutron Source (WNS) at the China Spallation Neutron Source is a facility for accurate measurements of neutron-induced cross sections. Seven spectrometers are planned at the WNS. As the physical objectives of each spectrometer are different, the requirements for readout electronics are not the same. In order to simplify the development of the readout electronics, this paper presents a general method for detector signal readout. This method has the advantages of expansibility and flexibility, which make it adaptable to most detectors at the WNS. In the WNS general-purpose readout electronics, signals from any kind of detector are conditioned by a dedicated signal conditioning module corresponding to that detector and are then digitized by a common waveform digitizer with high speed and high precision (1 GSPS at 12-bit) to obtain the full waveform data. The waveform digitizer uses a field-programmable gate array chip to process the data stream and trigger information in real time. A PXI Express platform is used to support the functionalities of data readout, clock distribution, and trigger information exchange between digitizers and trigger modules. Test results show that the performance of the WNS general-purpose readout electronics can meet the requirements of the WNS spectrometers.

  17. General-purpose event generators for LHC physics

    International Nuclear Information System (INIS)

    Buckley, Andy; Butterworth, Jonathan; Gieseke, Stefan; Grellscheid, David; Hoeche, Stefan; Hoeth, Hendrik; Krauss, Frank; Loennblad, Leif; Nurse, Emily; Richardson, Peter; Schumann, Steffen; Seymour, Michael H.; Sjoestrand, Torbjoern; Skands, Peter; Webber, Bryan

    2011-01-01

    We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond Standard Model processes. We describe the principal features of the ARIADNE, Herwig++, PYTHIA 8 and SHERPA generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists seeking a deeper insight into the tools available for signal and background simulation at the LHC.

  18. General-purpose event generators for LHC physics

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, Andy [PPE Group, School of Physics and Astronomy, University of Edinburgh, EH25 9PN (United Kingdom); Butterworth, Jonathan [Department of Physics and Astronomy, University College London, WC1E 6BT (United Kingdom); Gieseke, Stefan [Institute for Theoretical Physics, Karlsruhe Institute of Technology, D-76128 Karlsruhe (Germany); Grellscheid, David [Institute for Particle Physics Phenomenology, Durham University, DH1 3LE (United Kingdom); Hoeche, Stefan [SLAC National Accelerator Laboratory, Menlo Park, CA 94025 (United States); Hoeth, Hendrik; Krauss, Frank [Institute for Particle Physics Phenomenology, Durham University, DH1 3LE (United Kingdom); Loennblad, Leif [Department of Astronomy and Theoretical Physics, Lund University (Sweden); PH Department, TH Unit, CERN, CH-1211 Geneva 23 (Switzerland); Nurse, Emily [Department of Physics and Astronomy, University College London, WC1E 6BT (United Kingdom); Richardson, Peter [Institute for Particle Physics Phenomenology, Durham University, DH1 3LE (United Kingdom); Schumann, Steffen [Institute for Theoretical Physics, University of Heidelberg, 69120 Heidelberg (Germany); Seymour, Michael H. [School of Physics and Astronomy, University of Manchester, M13 9PL (United Kingdom); Sjoestrand, Torbjoern [Department of Astronomy and Theoretical Physics, Lund University (Sweden); Skands, Peter [PH Department, TH Unit, CERN, CH-1211 Geneva 23 (Switzerland); Webber, Bryan, E-mail: webber@hep.phy.cam.ac.uk [Cavendish Laboratory, J.J. Thomson Avenue, Cambridge CB3 0HE (United Kingdom)

    2011-07-15

    We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond Standard Model processes. We describe the principal features of the ARIADNE, Herwig++, PYTHIA 8 and SHERPA generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists seeking a deeper insight into the tools available for signal and background simulation at the LHC.

  19. General-purpose event generators for LHC physics

    Energy Technology Data Exchange (ETDEWEB)

    Buckley, Andy; /Edinburgh U.; Butterworth, Jonathan; /University Coll. London; Gieseke, Stefan; /Karlsruhe U., ITP; Grellscheid, David; /Durham U., IPPP; Hoche, Stefan; /SLAC; Hoeth, Hendrik; Krauss, Frank; /Durham U., IPPP; Lonnblad, Leif; /Lund U., Dept. Theor. Phys. /CERN; Nurse, Emily; /University Coll. London; Richardson, Peter; /Durham U., IPPP; Schumann, Steffen; /Heidelberg U.; Seymour, Michael H.; /Manchester U.; Sjostrand, Torbjorn; /Lund U., Dept. Theor. Phys.; Skands, Peter; /CERN; Webber, Bryan; /Cambridge U.

    2011-03-03

    We review the physics basis, main features and use of general-purpose Monte Carlo event generators for the simulation of proton-proton collisions at the Large Hadron Collider. Topics included are: the generation of hard-scattering matrix elements for processes of interest, at both leading and next-to-leading QCD perturbative order; their matching to approximate treatments of higher orders based on the showering approximation; the parton and dipole shower formulations; parton distribution functions for event generators; non-perturbative aspects such as soft QCD collisions, the underlying event and diffractive processes; the string and cluster models for hadron formation; the treatment of hadron and tau decays; the inclusion of QED radiation and beyond-Standard-Model processes. We describe the principal features of the Ariadne, Herwig++, Pythia 8 and Sherpa generators, together with the Rivet and Professor validation and tuning tools, and discuss the physics philosophy behind the proper use of these generators and tools. This review is aimed at phenomenologists wishing to understand better how parton-level predictions are translated into hadron-level events as well as experimentalists wanting a deeper insight into the tools available for signal and background simulation at the LHC.

  20. [Application of the grayscale standard display function to general purpose liquid-crystal display monitors for clinical use].

    Science.gov (United States)

    Tanaka, Nobukazu; Naka, Kentaro; Sueoka, Masaki; Higashida, Yoshiharu; Morishita, Junji

    2010-01-20

    Interpretations of medical images have been shifting to soft-copy readings on liquid-crystal display (LCD) monitors. The display function of a medical-grade LCD monitor for soft-copy readings is recommended to be calibrated to the grayscale standard display function (GSDF) in accordance with the guidelines of Japan and other countries. In this study, the luminance and display functions of eight general purpose LCD monitors (five models) were measured to gain an understanding of their characteristics. Moreover, the display function (gamma 2.2 or gamma 1.8) of the general purpose LCD monitors was converted to GSDF through the use of a look-up table, and the detectability of a simulated lung nodule in a chest x-ray image was examined. As a result, the maximum luminance, contrast ratio, and luminance uniformity of the general purpose LCD monitors, except for two LCD monitors of one model, met the management grade 1 standard in the guideline JESRA X-0093-2005. In addition, the detectability of the simulated lung nodule in the mediastinal space was clearly improved by converting the display function of a general purpose LCD monitor into GSDF.
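
    The look-up-table conversion mentioned above can be sketched as follows. The native panel response is modeled here as gamma 2.2 between example measured minimum and maximum luminances, and the target curve is a log-spaced stand-in: the real target would be the DICOM PS 3.14 GSDF curve, which is not reproduced here.

        # Sketch of a LUT-based display-function conversion. Lmin/Lmax and the
        # log-spaced "target" are illustrative stand-ins, not the DICOM GSDF.
        import numpy as np

        def build_lut(native_lum, target_lum):
            """For each input level, pick the driving level whose measured
            luminance is closest to the desired (target) luminance."""
            d = np.abs(native_lum[None, :] - target_lum[:, None])
            return d.argmin(axis=1).astype(np.uint8)

        levels = np.arange(256)
        l_min, l_max = 0.5, 300.0                                  # cd/m^2, example
        native = l_min + (l_max - l_min) * (levels / 255.0) ** 2.2  # gamma 2.2 panel
        target = np.geomspace(l_min, l_max, 256)                    # stand-in curve
        lut = build_lut(native, target)
        print(lut[128])   # driving level actually sent for a mid-gray input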

  1. Impact Analysis of Generalized Audit Software (GAS) Utilization to Auditor Performances

    Directory of Open Access Journals (Sweden)

    Aries Wicaksono

    2016-09-01

    This study aimed to understand whether the use of Generalized Audit Software (GAS) in the audit process has an impact on auditor performance, and to reach conclusions, in the form of an evaluation, on whether the GAS audit process provides a positive impact on the performance of auditors. The models used to evaluate the impact of GAS were Quantity of Work, Quality of Work, Job Knowledge, Creativeness, Cooperation, Dependability, Initiative, and Personal Qualities. The method used in this research was a qualitative, analytical-descriptive and evaluative method, analyzing the impact of the GAS implementation on the components of the user's performance. The results indicate that the use of GAS has a positive impact on the components of the user's performance.

  2. A general-purpose process modelling framework for marine energy systems

    International Nuclear Information System (INIS)

    Dimopoulos, George G.; Georgopoulou, Chariklia A.; Stefanatos, Iason C.; Zymaris, Alexandros S.; Kakalis, Nikolaos M.P.

    2014-01-01

    Highlights: • Process modelling techniques applied in marine engineering. • Systems engineering approaches to manage the complexity of modern ship machinery. • A general purpose modelling framework called COSSMOS. • Mathematical modelling of conservation equations and related chemical and transport phenomena. • A generic library of ship machinery component models. - Abstract: High fuel prices, environmental regulations and current shipping market conditions require ships to operate in a more efficient and greener way. These drivers lead to the introduction of new technologies, fuels, and operations, increasing the complexity of modern ship energy systems. As a means to manage this complexity, in this paper we present the introduction of systems engineering methodologies into marine engineering via the development of a general-purpose process modelling framework for ships, named DNV COSSMOS. Shifting the focus from components (the standard approach in shipping) to systems widens the space for optimal design and operation solutions. The associated computer implementation of COSSMOS is a platform that models, simulates and optimises integrated marine energy systems with respect to energy efficiency, emissions, safety/reliability and costs, under both steady-state and dynamic conditions. DNV COSSMOS can be used in the assessment and optimisation of design and operation problems in existing vessels, new builds, and new technologies. The main features and our modelling approach are presented, and key capabilities are illustrated via two studies: the thermo-economic design and operation optimisation of a combined cycle system for large bulk carriers, and the transient operation simulation of an electric marine propulsion system.

  3. No free lunch for software after all

    NARCIS (Netherlands)

    Rutkowski, Anne-Francoise; van Genugten, Michiel; Hatton, L.

    Software's lack of reproduction costs provides benefits to not just legitimate developers but also people who want to use software for criminal purposes. The software community must address this issue or risk disenfranchising the users on whom the software industry depends.

  4. CPAs in Mississippi: Communication Skills and Software Needed by Entry-Level Accountants

    Science.gov (United States)

    Bunn, Phyllis C.; Barfit, Laurie A.; Cooper, Jan

    2005-01-01

    The purpose of this paper was to determine what communication skills are considered most important by employers in the accounting profession as well as to determine the general office, income tax, and bookkeeping software packages used by CPA firms in Mississippi. The data was collected by means of an electronic five-point Likert-type survey…

  5. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  6. Specialized Monte Carlo codes versus general-purpose Monte Carlo codes

    International Nuclear Information System (INIS)

    Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi

    2002-01-01

    The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time consuming, while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses a concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)

  7. Signal processing and general purpose data acquisition system for on-line tomographic measurements

    Science.gov (United States)

    Murari, A.; Martin, P.; Hemming, O.; Manduchi, G.; Marrelli, L.; Taliercio, C.; Hoffmann, A.

    1997-01-01

    New analog signal conditioning electronics and data acquisition systems have been developed for the soft x-ray and bolometric tomography diagnostics in the reverse field pinch experiment (RFX). For the soft x-ray detectors, the analog signal processing includes a fully differential current-to-voltage conversion with up to a 200 kHz bandwidth. For the bolometers, a 50 kHz carrier frequency amplifier allows a maximum bandwidth of 10 kHz. In both cases the analog signals are digitized with a 1 MHz sampling rate close to the diagnostic and are transmitted via a transparent asynchronous xmitter/receiver interface (TAXI) link to purpose-built Versa Module Europa (VME) modules which perform data acquisition. A software library has been developed for data preprocessing and tomographic reconstruction. It is written in C and is self-contained, i.e., no additional mathematical library is required. The package is therefore platform-free: in particular, it can perform online analysis in a real-time application, such as continuous display and feedback, and is portable for long-duration fusion or other physics experiments. Due to the modular organization of the library, new preprocessing and analysis modules can be easily integrated into the environment. This software is implemented in RFX over three different platforms: OpenVMS, Digital Unix, and a VME 68040 CPU.

  8. Software process improvement in the NASA software engineering laboratory

    Science.gov (United States)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  9. An FPGA Scalable Software Defined Radio Platform Design for Educational and Research Purposes

    Directory of Open Access Journals (Sweden)

    Marcos Hervás

    2016-06-01

    In a digital modem design, the integration of the Analog to Digital Converters (ADC) and Digital to Analog Converters (DAC) with the core processor is usually a major issue for the designer. In this paper an FPGA scalable Software Defined Radio platform based on a Spartan-6 as a control unit is presented, developed for both educational and research purposes, which can fit different application requirements in terms of analog front-end performance, processing unit and cost. The resolution and sampling frequency of the analog front-end are its main adjustable parameters. The processing core requirements involve the FPGA and the communication ports. A multidisciplinary working group was required to design a high performance system, for both the analog front-end and the digital processing core, in terms of signal integrity and electromagnetic compatibility. The platform has 5 different peripheral ports ranging from 16 kbps to 2.5 Gbps. The communication ports allow our students to develop a wide range of applications for both on-site and online courses, applying a teaching methodology based on learning by doing with a real system, which also helps them acquire other transversal skills.

  10. Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)

    Science.gov (United States)

    Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.

    2016-05-01

    This study introduces a practical approach to developing a real-time signal processing chain for a general phased array radar on NVIDIA GPUs (Graphics Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked in computation time with various input data cube sizes, is compared across GPUs and CPUs. Through the analysis, it will be demonstrated that GPGPU (General Purpose GPU) real-time processing of the array radar data is possible with relatively low-cost commercial GPUs.
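
    The CPU-versus-GPU verification described above presupposes a CPU reference for each processing stage. The following NumPy sketch shows one typical stage, FFT-based pulse compression of a batch of pulses; a GPU port would route the same math through cuFFT and be checked against this reference. The waveform and data sizes are illustrative assumptions, not those of the study.

        # CPU (NumPy) reference for FFT-based pulse compression of a batch of
        # received pulses against a known transmit waveform.
        import numpy as np

        def pulse_compress(rx, tx):
            """Matched-filter each row of rx (pulses x range samples) with tx."""
            n = rx.shape[1] + tx.size - 1              # linear-correlation length
            nfft = 1 << (n - 1).bit_length()           # next power of two
            RX = np.fft.fft(rx, nfft, axis=1)
            H = np.conj(np.fft.fft(tx, nfft))          # matched filter
            return np.fft.ifft(RX * H, axis=1)[:, :n]

        rng = np.random.default_rng(2)
        tx = np.exp(1j * np.pi * np.linspace(-1, 1, 64) ** 2 * 50)   # toy chirp
        rx = 0.1 * (rng.standard_normal((16, 1024))
                    + 1j * rng.standard_normal((16, 1024)))
        rx[:, 300:364] += tx                                         # echo at bin 300
        out = pulse_compress(rx, tx)
        print(int(np.abs(out).argmax(axis=1)[0]))   # peak lands near bin 300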

  11. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto's Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using the software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, is also identified as a possible limitation when applying the software reliability growth models to safety-critical software
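
    The two models named above have simple mean-value functions; for the Goel-Okumoto NHPP it is m(t) = a(1 - e^(-bt)). The sketch below fits that function to synthetic cumulative failure counts; it is a generic illustration, not the paper's analysis, and with the handful of failures typical of safety-critical software such fits become badly conditioned, as the abstract notes.

        # Fit the Goel-Okumoto NHPP mean-value function to cumulative failure
        # counts. The test times and counts are synthetic.
        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            return a * (1.0 - np.exp(-b * t))

        t = np.array([10, 25, 60, 120, 250, 400, 700, 1000.0])   # test hours
        failures = np.array([1, 2, 4, 6, 8, 9, 10, 10.0])         # cumulative count

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, t, failures, p0=(10.0, 0.01))
        print(f"expected total faults a = {a_hat:.1f}, detection rate b = {b_hat:.4f}")
        print(f"predicted residual faults: "
              f"{a_hat - goel_okumoto(t[-1], a_hat, b_hat):.2f}")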

  12. General-purpose microprocessor-based control chassis

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Swenson, D.A.

    1979-12-01

    The objective of the Pion Generation for Medical Irradiations (PIGMI) program at the Los Alamos Scientific Laboratory is to develop the technology to build smaller, less expensive, and more reliable proton linear accelerators for medical applications. For this program, a powerful, simple, inexpensive, and reliable control and data acquisition system was developed. The system has a NOVA 3D computer with a real time disk-operating system (RDOS) that communicates with distributed microprocessor-based controllers which directly control data input/output chassis. At the heart of the controller is a microprocessor crate which was conceived at the Fermi National Accelerator Laboratory. This idea was applied to the design of the hardware and software of the controller

  13. General-purpose chemical analyzer for on-line analyses of radioactive solutions

    International Nuclear Information System (INIS)

    Spencer, W.A.; Kronberg, J.W.

    1983-01-01

    An automated analyzer is being developed to perform analytical measurements on radioactive solutions on-line in a hostile environment. This General Purpose Chemical Analyzer (GPCA) samples a process stream, adds reagents, measures solution absorbances or electrode potentials, and automatically calculates the results. The use of modular components, under microprocessor control, permits a single analyzer design to carry out many types of analyses. This paper discusses the more important design criteria for the GPCA, and describes the equipment being tested in a prototype unit

  14. A general-purpose trigger processor system and its application to fast vertex trigger

    International Nuclear Information System (INIS)

    Hazumi, M.; Banas, E.; Natkaniec, Z.; Ostrowicz, W.

    1997-12-01

    A general-purpose hardware trigger system has been developed. The system comprises programmable trigger processors and pattern generator/samplers. The hardware design of the system is described. An application as a prototype of the very fast vertex trigger in an asymmetric B-factory at KEK is also explained. (author)

  15. Features of commercial computer software systems for medical examiners and coroners.

    Science.gov (United States)

    Hanzlick, R L; Parrish, R G; Ing, R

    1993-12-01

    There are many ways of automating medical examiner and coroner offices, one of which is to purchase commercial software products specifically designed for death investigation. We surveyed four companies that offer such products and requested information regarding each company and its hardware, software, operating systems, peripheral devices, applications, networking options, programming language, querying capability, coding systems, prices, customer support, and number and size of offices using the product. Although the four products (CME2, ForenCIS, InQuest, and Medical Examiner's Software System) are similar in many respects and each can be installed on personal computers, there are differences among the products with regard to cost, applications, and the other features. Death investigators interested in office automation should explore these products to determine the usefulness of each in comparison with the others and in comparison with general-purpose, off-the-shelf databases and software adaptable to death investigation needs.

  16. Development of Radio Frequency Antenna Radiation Simulation Software

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Rozaimah Abd Rahim; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2014-01-01

    Antennas are widely used nationwide for radio frequency propagation, especially for communication systems. Radio frequency covers the electromagnetic spectrum from 10 kHz to 300 GHz and is non-ionizing. Exposure of human beings to this radiation nevertheless carries a radiation hazard risk. This software is being developed using LabVIEW for radio frequency exposure calculation. In the first phase of development, the software is intended to calculate the possible maximum exposure for quick base station assessment using prediction methods. The software can also be used for educational purposes. Some results of this software are compared with the commercial IXUS software and the freeware NEC software. (author)

  17. The fallacy of Software Patents

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Software patents are usually used as an argument for innovation, but do they really promote innovation? Who really benefits from software patents? This talk attempts to show the problems with software patents and how they can actually harm innovation while having little value for software users and our society in general.

  18. High-Speed General Purpose Genetic Algorithm Processor.

    Science.gov (United States)

    Hoseini Alinodehi, Seyed Pourya; Moshfe, Sajjad; Saber Zaeimian, Masoumeh; Khoei, Abdollah; Hadidi, Khairollah

    2016-07-01

    In this paper, an ultrafast steady-state genetic algorithm processor (GAP) is presented. Due to the heavy computational load of genetic algorithms (GAs), they usually take a long time to find optimum solutions. Hardware implementation is a significant approach to overcoming the problem by speeding up the GA procedure. Hence, we designed a digital CMOS implementation of the GA in a [Formula: see text] process. The proposed processor is not bound to a specific application. Indeed, it is a general-purpose processor, which is capable of performing optimization in any possible application. Utilizing speed-boosting techniques, such as a pipeline scheme, parallel coarse-grained processing, parallel fitness computation, parallel selection of parents, a dual-population scheme, and support for pipelined fitness computation, the proposed processor significantly reduces the processing time. Furthermore, by relying on a built-in discard operator, the proposed hardware may be used in constrained problems that are very common in control applications. In the proposed design, a large search space is achievable through the bit string length extension of individuals in the genetic population by connecting the 32-bit GAPs. In addition, the proposed processor supports parallel processing, in which the GA procedure can be run on several connected processors simultaneously.
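
    The abstract above lists the operators that the hardware pipelines: parental selection, crossover, mutation, a discard operator for constraint handling, and steady-state replacement. The following software sketch shows that loop on a toy one-variable problem; the encoding, fitness function, constraint and parameters are illustrative assumptions and do not reflect the processor's actual configuration.

        # Software sketch of a steady-state GA with a "discard" step for constrained
        # problems, loosely mirroring the operators the hardware processor pipelines.
        # Problem, encoding and parameters are illustrative assumptions.
        import random

        BITS = 32

        def decode(ind):                       # map a 32-bit string to [0, 10]
            return int(ind, 2) / (2**BITS - 1) * 10.0

        def fitness(ind):
            x = decode(ind)
            return -(x - 3.0) ** 2             # maximise: peak at x = 3

        def feasible(ind):
            return decode(ind) <= 8.0          # example constraint; infeasible children are discarded

        def crossover(p1, p2):
            cut = random.randint(1, BITS - 1)
            return p1[:cut] + p2[cut:]

        def mutate(ind, rate=1.0 / BITS):
            return "".join(b if random.random() > rate else str(1 - int(b)) for b in ind)

        pop = ["".join(random.choice("01") for _ in range(BITS)) for _ in range(20)]
        for _ in range(2000):
            p1 = max(random.sample(pop, 3), key=fitness)   # tournament selection of parents
            p2 = max(random.sample(pop, 3), key=fitness)
            child = mutate(crossover(p1, p2))
            if not feasible(child):            # built-in discard operator
                continue
            worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
            if fitness(child) > fitness(pop[worst]):
                pop[worst] = child             # steady-state replacement
        print("best x =", round(decode(max(pop, key=fitness)), 3))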

  19. Implementation elements for conversion of general-purpose freeway lane into high-occupancy-vehicle lane

    Science.gov (United States)

    1997-01-01

    Conversion of a general-purpose freeway into a high-occupancy-vehicle (HOV) lane is an alternative to infrastructure addition for HOV system implementation. Research indicates that lane conversion is feasible technically if sufficient HOV usage and m...

  20. General Purpose Technologies and their Implications for International Trade

    Directory of Open Access Journals (Sweden)

    Petsas Iordanis

    2015-09-01

    Full Text Available This paper develops a simple model of trade and “quality-ladders” growth without scale effects to study the implications of general purpose technologies (GPTs) for international trade. GPTs refer to a certain type of drastic innovations, such as electrification, the transistor, and the Internet, that are characterized by pervasiveness in use, innovational complementarities, and technological dynamism. The model presents a two-country (Home and Foreign) dynamic general equilibrium framework and incorporates GPT diffusion within Home, which exhibits endogenous Schumpeterian growth. The model analyzes the long-run and transitional dynamic effects of a new GPT on the pattern of trade and relative wages. The main findings of the paper are: (1) when GPT diffusion across industries is governed by S-curve dynamics, there are two steady-state equilibria: the initial steady state arises before the adoption of the new GPT and the final one is reached after the GPT diffusion process has been completed; (2) when all industries at Home have adopted the new GPT, Home enjoys comparative advantage in a greater range of industries compared to Foreign; (3) during the transitional dynamics, Foreign regains its competitiveness in some of the industries in which it had lost its comparative advantage to Home.

  1. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not carry the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  2. Light Duty Utility Arm Software Test Plan

    International Nuclear Information System (INIS)

    Kiebel, G.R.

    1995-01-01

    This plan describes how validation testing of the software will be implemented for the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). The purpose of LDUA software validation testing is to demonstrate and document that the LDUA software meets its software requirements specification

  3. Hospital Management Software Development

    OpenAIRE

    sobogunGod, olawale

    2012-01-01

    The purpose of this thesis was to implement hospital management software suitable for small private hospitals in Nigeria, especially those that use a file-based system for storing information rather than storing it in a more efficient and safer environment such as databases or spreadsheet software. The software developed within this thesis project was specifically designed for the Rainbow specialist hospital, which is based in Lagos, the commercial neurological cente...

  4. General Purpose Data-Driven Monitoring for Space Operations

    Science.gov (United States)

    Iverson, David L.; Martin, Rodney A.; Schwabacher, Mark A.; Spirkovska, Liljana; Taylor, William McCaa; Castle, Joseph P.; Mackey, Ryan M.

    2009-01-01

    As modern space propulsion and exploration systems improve in capability and efficiency, their designs are becoming increasingly sophisticated and complex. Determining the health state of these systems, using traditional parameter limit checking, model-based, or rule-based methods, is becoming more difficult as the number of sensors and component interactions grow. Data-driven monitoring techniques have been developed to address these issues by analyzing system operations data to automatically characterize normal system behavior. System health can be monitored by comparing real-time operating data with these nominal characterizations, providing detection of anomalous data signatures indicative of system faults or failures. The Inductive Monitoring System (IMS) is a data-driven system health monitoring software tool that has been successfully applied to several aerospace applications. IMS uses a data mining technique called clustering to analyze archived system data and characterize normal interactions between parameters. The scope of IMS based data-driven monitoring applications continues to expand with current development activities. Successful IMS deployment in the International Space Station (ISS) flight control room to monitor ISS attitude control systems has led to applications in other ISS flight control disciplines, such as thermal control. It has also generated interest in data-driven monitoring capability for Constellation, NASA's program to replace the Space Shuttle with new launch vehicles and spacecraft capable of returning astronauts to the moon, and then on to Mars. Several projects are currently underway to evaluate and mature the IMS technology and complementary tools for use in the Constellation program. These include an experiment on board the Air Force TacSat-3 satellite, and ground systems monitoring for NASA's Ares I-X and Ares I launch vehicles. The TacSat-3 Vehicle System Management (TVSM) project is a software experiment to integrate fault
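
    IMS, as described above, characterizes nominal behavior by clustering archived data and then flags real-time vectors that fall far outside the learned clusters. The sketch below illustrates that idea with scikit-learn's k-means used as a stand-in for IMS's own clustering algorithm; the sensor data and the anomaly threshold are synthetic.

        # Minimal sketch of data-driven monitoring in the spirit of IMS: cluster nominal
        # archived data, then flag live vectors that fall far from every cluster.
        # k-means is a stand-in for IMS's own clustering; the data are synthetic.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        nominal = rng.normal(loc=[20.0, 0.5], scale=[0.5, 0.05], size=(5000, 2))   # e.g. temperature, rate

        model = KMeans(n_clusters=8, n_init=10, random_state=0).fit(nominal)
        dist = np.min(model.transform(nominal), axis=1)        # distance to nearest cluster centre
        threshold = np.percentile(dist, 99.5)                   # learned nominal envelope

        def check(sample):
            d = np.min(model.transform(np.atleast_2d(sample)), axis=1)[0]
            return "anomalous" if d > threshold else "nominal"

        print(check([20.1, 0.51]))   # within the learned envelope
        print(check([25.0, 0.90]))   # far from all clusters -> flagged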

  5. Optimization of Antivirus Software

    OpenAIRE

    Catalin BOJA; Adrian VISOIU

    2007-01-01

    The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyze...

  6. An Application-oriented Open Software Platform for Multi-purpose Field Robotics

    DEFF Research Database (Denmark)

    Jensen, Kjeld

    that solve simple tasks such as mowing, and automatic tractor steering that navigates through a planned route under the supervision of an operator. The outdoor environment in which the robot operates is often very complex. This places great demands on the robot's ability to perceive the environment and based...... on this behave in a way that is appropriate and productive with respect to the given task while being safe for nearby people, animals and objects. Researchers are challenged by the considerable resources required to develop robot software capable of supporting experiments in such a complex perception...... and behavior. The lack of collaboration between research groups contributes to the problem, the scientific publications describe methods and results from the work, but little software for field robots are released and documented for use by others. The hypothesis of this work is that an application oriented...

  7. A real-time MPEG software decoder using a portable message-passing library

    Energy Technology Data Exchange (ETDEWEB)

    Kwong, Man Kam; Tang, P.T. Peter; Lin, Biquan

    1995-12-31

    We present a real-time MPEG software decoder that uses message-passing libraries such as MPL, p4 and MPI. The parallel MPEG decoder currently runs on the IBM SP system but can be easily ported to other parallel machines. This paper discusses our parallel MPEG decoding algorithm as well as the parallel programming environment under which it runs. Several technical issues are discussed, including balancing of decoding speed, memory limitations, I/O capacities, and optimization of MPEG decoding components. This project shows that a real-time portable software MPEG decoder is feasible on a general-purpose parallel machine.
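
    The decoder described above distributes decoding work through a message-passing library. As a rough illustration of that pattern (not the authors' algorithm), the sketch below uses mpi4py to scatter frame indices from a master rank to workers and gather the results; decode_frame() is a placeholder for the actual MPEG decoding stages.

        # Sketch of the master/worker frame-distribution pattern with MPI (mpi4py).
        # The decode_frame() body is a placeholder; real MPEG decoding is not shown.
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        def decode_frame(index):
            # placeholder for variable-length decoding, IDCT, motion compensation, ...
            return f"frame {index} decoded by rank {rank}"

        N_FRAMES = 16
        if rank == 0:
            # split frame indices round-robin across ranks to balance the decoding load
            chunks = [list(range(r, N_FRAMES, size)) for r in range(size)]
        else:
            chunks = None

        my_frames = comm.scatter(chunks, root=0)
        results = [decode_frame(i) for i in my_frames]
        all_results = comm.gather(results, root=0)

        if rank == 0:
            for worker_output in all_results:
                print("\n".join(worker_output))
        # run with, e.g.:  mpiexec -n 4 python decoder_sketch.py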

  8. A Real-Time GPP Software-Defined Radio Testbed for the Physical Layer of Wireless Standards

    NARCIS (Netherlands)

    Schiphorst, Roelof; Hoeksema, F.W.; Slump, Cornelis H.

    2005-01-01

    We present our contribution to the general-purpose-processor-(GPP)-based radio. We describe a baseband software-defined radio testbed for the physical layer of wireless LAN standards. All physical layer functions have been successfully mapped on a Pentium 4 processor that performs these functions in

  9. Software complex for developing dynamically packed program system for experiment automation

    International Nuclear Information System (INIS)

    Baluka, G.; Salamatin, I.M.

    1985-01-01

    Software complex for developing dynamically packed program system for experiment automation is considered. The complex includes general-purpose programming systems represented as the RT-11 standard operating system and specially developed problem-oriented moduli providing execution of certain jobs. The described complex is realized in the PASKAL' and MAKRO-2 languages and it is rather flexible to variations of the technique of the experiment

  10. General-purpose Monte Carlo codes for neutron and photon transport calculations. MVP version 3

    International Nuclear Information System (INIS)

    Nagaya, Yasunobu

    2017-01-01

    JAEA has developed a general-purpose neutron/photon transport Monte Carlo code MVP. This paper describes the recent development of the MVP code and reviews the basic features and capabilities. In addition, capabilities implemented in Version 3 are also described. (author)

  11. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing the software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability, achieved by using soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and applied to raise the abstraction level of program code and to mine potentially flexible components. To reconstruct software adaptable to a required environment, the SQM (Software Quark Model) was used to manage interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  12. Fault-Tolerant Software-Defined Radio on Manycore

    Science.gov (United States)

    Ricketts, Scott

    2015-01-01

    Software-defined radio (SDR) platforms generally rely on field-programmable gate arrays (FPGAs) and digital signal processors (DSPs), but such architectures require significant software development. In addition, application demands for radiation mitigation and fault tolerance exacerbate programming challenges. MaXentric Technologies, LLC, has developed a manycore-based SDR technology that provides 100 times the throughput of conventional radiation-hardened general-purpose processors. Manycore systems (30-100 cores and beyond) have the potential to provide high processing performance at error rates that are equivalent to current space-deployed uniprocessor systems. MaXentric's innovation is a highly flexible radio, providing over-the-air reconfiguration; adaptability; and uninterrupted, real-time, multimode operation. The technology is also compliant with NASA's Space Telecommunications Radio System (STRS) architecture. In addition to its many uses within NASA communications, the SDR can also serve as a highly programmable research-stage prototyping device for new waveforms and other communications technologies. It can also support noncommunication codes on its multicore processor, collocated with the communications workload, reducing the size, weight, and power of the overall system by aggregating processing jobs to a single board computer.

  13. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  14. Some thermo-electromagnetic applications to fusion technology of a general purpose CAD package

    International Nuclear Information System (INIS)

    Girdinio, P.; Molfino, P.; Molinari, G.; Raia, G.; Rosatelli, F.; Viviani, A.

    1985-01-01

    A general purpose CAD package is applied to the solution of problems related to fusion technology. The problems solved are the interacting electromagnetic and thermal fields in a resistive toroidal coil and the design of the poloidal field coils in Tokamak machines. In both cases, the procedure used is reported and the results obtained are displayed and discussed

  15. Some thermo-electromagnetic applications to fusion technology of a general purpose CAD package

    International Nuclear Information System (INIS)

    Girdinio, P.; Molfino, P.; Molinari, G.; Viviani, A.; Raia, G.; Rosatelli, F.

    1984-01-01

    A general purpose CAD package is applied to the solution of problems related to fusion technology. The problems solved are the interacting electromagnetic and thermal fields in a resistive toroidal coil and the design of the poloidal field coils in Tokamak machines. In both cases, the procedure used is reported and the results obtained are displayed and discussed. (author)

  16. Development of general-purpose particle and heavy ion transport monte carlo code

    International Nuclear Information System (INIS)

    Iwase, Hiroshi; Nakamura, Takashi; Niita, Koji

    2002-01-01

    The high-energy particle transport code NMTC/JAM, which has been developed at JAERI, was improved for the high-energy heavy ion transport calculation by incorporating the JQMD code, the SPAR code and the Shen formula. The new NMTC/JAM named PHITS (Particle and Heavy-Ion Transport code System) is the first general-purpose heavy ion transport Monte Carlo code over the incident energies from several MeV/nucleon to several GeV/nucleon. (author)

  17. Core Flight Software

    Data.gov (United States)

    National Aeronautics and Space Administration — The AES Core Flight Software (CFS) project purpose is to analyze applicability, and evolve and extend the reusability of the CFS system originally developed by...

  18. Reviews in innovative software development

    DEFF Research Database (Denmark)

    Aaen, Ivan; Boelsmand, Jeppe Vestergaard; Jensen, Rasmus

    2009-01-01

    This paper proposes a new review approach for innovative software development. Innovative software development implies that requirements are rarely available as a basis for reviewing and that the purpose of a review is as much to forward additional ideas, as to validate what has been accomplished...

  19. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  20. Essential software architecture

    CERN Document Server

    Gorton, Ian

    2011-01-01

    Job titles like "Technical Architect" and "Chief Architect" nowadays abound in the software industry, yet many people suspect that "architecture" is one of the most overused and least understood terms in professional software development. Gorton's book tries to resolve this dilemma. It concisely describes the essential elements of knowledge and key skills required to be a software architect. The explanations encompass the essentials of architecture thinking, practices, and supporting technologies. They range from a general understanding of structure and quality attributes through technical i

  1. A prepaid case study: Ready Credit’s general-purpose & transit-fare programs

    OpenAIRE

    Philip Keitel

    2012-01-01

    Today, prepaid cards are used in dozens of payment applications. To examine the most recent developments, the Payment Cards Center of the Federal Reserve Bank of Philadelphia hosted a workshop on August 22, 2011. Leading the workshop was Tim Walsh, president and chief executive officer of Ready Credit Corporation, a firm that developed network-branded prepaid cards for use in transit-fare systems and also markets general-purpose, reloadable prepaid cards to consumers. Walsh discussed the uniq...

  2. Off-line software for large experimental setups

    International Nuclear Information System (INIS)

    Bruyant, F.

    1983-07-01

    The purpose of this report is to emphasize the importance of off-line software for large experimental setups in High Energy Physics. Simple notions of program structuring, data structuring and software organization are discussed in the context of the software developed for the European Hybrid Spectrometer. (author)

  3. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  4. Software quality assurance handbook

    Energy Technology Data Exchange (ETDEWEB)

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  5. Comparison of progressive addition lenses for general purpose and for computer vision: an office field study.

    Science.gov (United States)

    Jaschinski, Wolfgang; König, Mirjam; Mekontso, Tiofil M; Ohlendorf, Arne; Welscher, Monique

    2015-05-01

    Two types of progressive addition lenses (PALs) were compared in an office field study: 1. General purpose PALs with continuous clear vision between infinity and near reading distances and 2. Computer vision PALs with a wider zone of clear vision at the monitor and in near vision but no clear distance vision. Twenty-three presbyopic participants wore each type of lens for two weeks in a double-masked four-week quasi-experimental procedure that included an adaptation phase (Weeks 1 and 2) and a test phase (Weeks 3 and 4). Questionnaires on visual and musculoskeletal conditions as well as preferences regarding the type of lenses were administered. After eight more weeks of free use of the spectacles, the preferences were assessed again. The ergonomic conditions were analysed from photographs. Head inclination when looking at the monitor was significantly lower by 2.3 degrees with the computer vision PALs than with the general purpose PALs. Vision at the monitor was judged significantly better with computer PALs, while distance vision was judged better with general purpose PALs; however, the reported advantage of computer vision PALs differed in extent between participants. Accordingly, 61 per cent of the participants preferred the computer vision PALs, when asked without information about lens design. After full information about lens characteristics and additional eight weeks of free spectacle use, 44 per cent preferred the computer vision PALs. On average, computer vision PALs were rated significantly better with respect to vision at the monitor during the experimental part of the study. In the final forced-choice ratings, approximately half of the participants preferred either the computer vision PAL or the general purpose PAL. Individual factors seem to play a role in this preference and in the rated advantage of computer vision PALs. © 2015 The Authors. Clinical and Experimental Optometry © 2015 Optometry Australia.

  6. Economic selection index development for Beefmaster cattle II: General-purpose breeding objective.

    Science.gov (United States)

    Ochsner, K P; MacNeil, M D; Lewis, R M; Spangler, M L

    2017-05-01

    An economic selection index was developed for Beefmaster cattle in a general-purpose production system in which bulls are mated to a combination of heifers and mature cows, with resulting progeny retained as replacements or sold at weaning. National average prices from 2010 to 2014 were used to establish income and expenses for the system. Genetic parameters were obtained from the literature. Economic values were estimated by simulating 100,000 animals and approximating the partial derivatives of the profit function by perturbing traits 1 at a time, by 1 unit, while holding the other traits constant at their respective means. Relative economic values for the objective traits calving difficultly direct (CDd), calving difficulty maternal (CDm), weaning weight direct (WWd), weaning weight maternal (WWm), mature cow weight (MW), and heifer pregnancy (HP) were -2.11, -1.53, 18.49, 11.28, -33.46, and 1.19, respectively. Consequently, under the scenario assumed herein, the greatest improvements in profitability could be made by decreasing maintenance energy costs associated with MW followed by improvements in weaning weight. The accuracy of the index lies between 0.218 (phenotypic-based index selection) and 0.428 (breeding values known without error). Implementation of this index would facilitate genetic improvement and increase profitability of Beefmaster cattle operations with a general-purpose breeding objective when replacement females are retained and with weaned calves as the sale end point.
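
    An economic selection index of this kind scores a candidate by weighting each trait's expected progeny difference (EPD) by its economic value, I = sum over traits of v_i * EPD_i. The sketch below applies the relative economic values reported above to two hypothetical bulls; the EPD figures are invented purely to illustrate the calculation.

        # Worked example: scoring candidate bulls with the relative economic values
        # reported in the abstract. The EPD values below are invented for illustration.
        economic_values = {            # index units per unit of trait EPD
            "CDd": -2.11, "CDm": -1.53, "WWd": 18.49,
            "WWm": 11.28, "MW": -33.46, "HP": 1.19,
        }

        bulls = {
            "Bull A": {"CDd": 0.2, "CDm": 0.1, "WWd": 25.0, "WWm": 12.0, "MW": 20.0, "HP": 2.0},
            "Bull B": {"CDd": -0.1, "CDm": 0.0, "WWd": 20.0, "WWm": 15.0, "MW": -10.0, "HP": 1.0},
        }

        def index_value(epds):
            return sum(economic_values[trait] * epds[trait] for trait in economic_values)

        for name, epds in bulls.items():
            print(f"{name}: I = {index_value(epds):+.1f}")
        # A heavier mature weight (MW) quickly erodes the index, consistent with the
        # abstract's finding that MW carries the largest (negative) economic weight.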

  7. Operation of general purpose stepping motor controllers at the National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1987-01-01

    A prototype and four copies of a general purpose subsystem for mechanical positioning of detectors, samples, and beam line optical elements which constitute experiments at the National Synchrotron Light Source facility of Brookhaven National Laboratory have been constructed and placed into operation. Construction of a sixth subsystem is nearing completion. The subsystems effect mechanical positioning by controlling a set of stepping motors and their associated position encoders. The units are general purpose in the sense that they receive commands over a standard 9600 baud asynchronous serial line compatible with the RS-232-C electrical signal standard, generate TTL-compatible streams of stepping pulses which can be used with a wide variety of stepping motors, and read back position values from a number of different types and models of position encoder. The basic structure of the motor controller subsystem is briefly reviewed. Short descriptions of the positioning apparatus actuated at each of the test and experiment stations employing a motor control unit are given. Additions and enhancements to the sub-system made in response to problems indicated by actual operation of the four installed units are described in more detail
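
    The controllers described above accept commands over a 9600 baud RS-232-C serial line. The sketch below shows what host-side control of such a unit could look like using the pyserial library; the port name and the command and response strings (MOVE, STATUS, READ ENC) are hypothetical, since the record does not specify the controller's actual command protocol.

        # Sketch of host-side control over the 9600-baud RS-232 command link.
        # The port name and the command/response strings are hypothetical; the record
        # does not give the controller's actual command syntax.
        import serial

        with serial.Serial("/dev/ttyS0", baudrate=9600, timeout=2.0) as link:
            def send(cmd):
                link.write((cmd + "\r\n").encode("ascii"))
                return link.readline().decode("ascii").strip()

            send("MOVE 3 +1200")                        # hypothetical: step motor 3 by +1200 steps
            while send("STATUS 3") == "BUSY":           # poll until the move completes
                pass
            print("encoder 3 =", send("READ ENC 3"))    # hypothetical position read-back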

  8. Operation of general purpose stepping motor controllers at the National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1986-10-01

    A prototype and four copies of a general purpose subsystem for mechanical positioning of detectors, samples, and beam line optical elements which constitute experiments at the National Synchrotron Light Source facility of Brookhaven National Laboratory have been constructed and placed into operation. Construction of a sixth subsystem is nearing completion. The subsystems effect mechanical positioning by controlling a set of stepping motors and their associated position encoders. The units are general purpose in the sense that they receive commands over a standard 9600 baud asynchronous serial line compatible with the RS-232-C electrical signal standard, generate TTL-compatible streams of stepping pulses which can be used with a wide variety of stepping motors, and read back position values from a number of different types and models of position encoder. The basic structure of the motor controller subsystem will be briefly reviewed. Short descriptions of the positioning apparatus actuated at each of the test and experiment stations employing a motor control unit are given. Additions and enhancements to the subsystem made in response to problems indicated by actual operation of the four installed units are described in more detail

  9. Evolving software products, the design of a water-related modeling software ecosystem

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2017-01-01

    more than 50 years ago. However, radical changes of software products that evolve both the software engineering and the organizational and business aspects in a disruptive manner are rather rare. In this paper, we report on the transformation of one of the market-leading product series in water......-related calculation and modeling from a traditional business-as-usual series of products to an evolutionary software ecosystem. We do so by relying on existing concepts of software ecosystem analysis to analyze the future ecosystem. We report and elaborate on the main focus points necessary for this transition. We...... argue for the generalization of our focus points to the transition from traditional business-as-usual software products to software ecosystems....

  10. Optimization of Antivirus Software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The paper describes the main techniques used in the development of computer antivirus software applications. For this particular category of software, optimum criteria are identified and defined that help determine which solution is better and what the objectives of the optimization process are. From the general viewpoint of software optimization, methods and techniques applied at the code development level are presented. Regarding the particularities of antivirus software, the paper analyzes some of the optimization concepts applied to this category of applications

  11. A new general purpose event horizon finder for 3D numerical spacetimes

    International Nuclear Information System (INIS)

    Diener, Peter

    2003-01-01

    I present a new general purpose event horizon finder for full 3D numerical spacetimes. It works by evolving a complete null surface backwards in time. The null surface is described as the zero-level set of a scalar function, which in principle is defined everywhere. This description of the surface allows the surface, trivially, to change topology, making this event horizon finder able to handle numerical spacetimes where two (or more) black holes merge into a single final black hole

  12. Report on the operation and utilization of general purpose use computer system 2001

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, Kunihiko; Watanabe, Reiko; Tsugawa, Kazuko; Tsuda, Kenzo; Yamamoto, Takashi; Nakamura, Osamu; Kamimura, Tetsuo [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2001-09-01

    The General Purpose Use Computer System of the National Institute for Fusion Science was replaced in January 2001. The System was almost fully utilized after the first three months of operation. Reported here are the process of introducing the new system and the state of operation and utilization of the System between January and March 2001, with detailed utilization figures for March. (author)

  13. A general purpose program system for high energy physics experiment data acquisition and analysis

    International Nuclear Information System (INIS)

    Li Shuren; Xing Yuguo; Jin Bingnian

    1985-01-01

    This paper introduces the functions, structure and system generation of a general purpose program system (Fermilab MULTI) for high energy physics experiment data acquisition and analysis. Work concerning the reconstruction of the MULTI system level 0.5, which can be run on the PDP-11/23 computer, is also introduced briefly

  14. Child first language and adult second language are both tied to general-purpose learning systems.

    Science.gov (United States)

    Hamrick, Phillip; Lum, Jarrad A G; Ullman, Michael T

    2018-02-13

    Do the mechanisms underlying language in fact serve general-purpose functions that preexist this uniquely human capacity? To address this contentious and empirically challenging issue, we systematically tested the predictions of a well-studied neurocognitive theory of language motivated by evolutionary principles. Multiple metaanalyses were performed to examine predicted links between language and two general-purpose learning systems, declarative and procedural memory. The results tied lexical abilities to learning only in declarative memory, while grammar was linked to learning in both systems in both child first language and adult second language, in specific ways. In second language learners, grammar was associated with only declarative memory at lower language experience, but with only procedural memory at higher experience. The findings yielded large effect sizes and held consistently across languages, language families, linguistic structures, and tasks, underscoring their reliability and validity. The results, which met the predicted pattern, provide comprehensive evidence that language is tied to general-purpose systems both in children acquiring their native language and adults learning an additional language. Crucially, if language learning relies on these systems, then our extensive knowledge of the systems from animal and human studies may also apply to this domain, leading to predictions that might be unwarranted in the more circumscribed study of language. Thus, by demonstrating a role for these systems in language, the findings simultaneously lay a foundation for potentially important advances in the study of this critical domain.

  15. The Efficiency of Linda for General Purpose Scientific Programming

    Directory of Open Access Journals (Sweden)

    Timothy G. Mattson

    1994-01-01

    Full Text Available Linda (Linda is a registered trademark of Scientific Computing Associates, Inc.) is a programming language for coordinating the execution and interaction of processes. When combined with a language for computation (such as C or Fortran), the resulting hybrid language can be used to write portable programs for parallel and distributed multiple instruction multiple data (MIMD) computers. The Linda programming model is based on operations that read, write, and erase a virtual shared memory. It is easy to use, and lets the programmer code in a very expressive, uncoupled programming style. These benefits, however, are of little value unless Linda programs execute efficiently. The goal of this article is to demonstrate that Linda programs are efficient, making Linda an effective general purpose tool for programming MIMD parallel computers. Two arguments for Linda's efficiency are given; the first is based on Linda's implementation and the second on a range of case studies spanning a complete set of parallel algorithm classes.
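
    Linda coordination is built on a handful of operations (out, in, rd, eval) against a virtual shared tuple space. The toy sketch below mimics the out/in pair with Python threads standing in for processes; it illustrates the uncoupled programming style only and is in no way Scientific Computing Associates' implementation (matching is simplified and rd/eval are omitted).

        # Toy sketch of the Linda coordination model: worker threads communicate only
        # through out()/in_() operations on a shared tuple space.
        import threading

        class TupleSpace:
            """Tiny Linda-style tuple space: out() deposits a tuple, in_() withdraws a matching one."""
            def __init__(self):
                self._tuples, self._cv = [], threading.Condition()

            def out(self, tup):
                with self._cv:
                    self._tuples.append(tup)
                    self._cv.notify_all()

            def in_(self, *pattern):                 # None fields act as wildcards
                def matches(t):
                    return len(t) == len(pattern) and all(p is None or p == v for p, v in zip(pattern, t))
                with self._cv:
                    while True:
                        for t in self._tuples:
                            if matches(t):
                                self._tuples.remove(t)
                                return t
                        self._cv.wait()

        ts = TupleSpace()

        def worker():
            while True:
                _, x = ts.in_("task", None)          # block until a task tuple appears
                if x is None:                        # poison tuple ends the worker
                    return
                ts.out(("result", x, x * x))         # producers and consumers never name each other

        threads = [threading.Thread(target=worker) for _ in range(4)]
        for t in threads:
            t.start()
        for x in range(8):
            ts.out(("task", x))
        print(sorted(ts.in_("result", None, None)[2] for _ in range(8)))
        for _ in threads:
            ts.out(("task", None))                   # one poison tuple per worker
        for t in threads:
            t.join()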

  16. System support software for TSTA

    International Nuclear Information System (INIS)

    Claborn, G.W.; Mann, L.W.; Nielson, C.W.

    1987-01-01

    The software at the Tritium Systems Test Assembly (TSTA) is logically broken into two parts, the system support software and the subsystem software. The purpose of the system support software is to isolate the subsystem software from the physical hardware. In this sense the system support software forms the kernel of the software at TSTA. The kernel software performs several functions. It gathers data from CAMAC modules and makes that data available for subsystem processes. It services requests to send commands to CAMAC modules. It provides a system of logging functions and provides for a system-wide global program state that allows highly structured interaction between subsystem processes. The kernel's most visible function is to provide the Man-Machine Interface (MMI). The MMI allows the operators a window into the physical hardware and subsystem process state. Finally the kernel provides a data archiving and compression function that allows archival data to be accessed and plotted. Such kernel software as developed and implemented at TSTA is described

  17. Generalized Software Architecture Applied to the Continuous Lunar Water Separation Process and the Lunar Greenhouse Amplifier

    Science.gov (United States)

    Perusich, Stephen; Moos, Thomas; Muscatello, Anthony

    2011-01-01

    This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not
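
    The abstract above describes an application that autonomously polls configurable sensor channels (thermocouples, pressure transducers, relative humidity sensors), performs embedded computations, and produces tabulated output with real-time operator comments. The generic polling-and-logging loop below sketches that structure in Python; the channel names are illustrative and read_channel() is a placeholder for real instrument I/O, so this is not the program the record describes.

        # Generic sketch of an autonomous monitor/log loop: poll a configurable set of
        # channels, compute derived values, and append a tabulated row to a log file.
        import csv, time, random

        CHANNELS = ["TC_inlet_C", "TC_outlet_C", "pressure_kPa", "rel_humidity_pct"]

        def read_channel(name):
            return round(random.uniform(20, 25), 2)      # placeholder for real hardware access

        def poll_once(operator_note=""):
            row = {"time": time.strftime("%Y-%m-%d %H:%M:%S")}
            row.update({ch: read_channel(ch) for ch in CHANNELS})
            row["delta_T_C"] = round(row["TC_outlet_C"] - row["TC_inlet_C"], 2)   # embedded computation
            row["note"] = operator_note                  # operators annotate runs without code changes
            return row

        with open("monitor_log.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["time", *CHANNELS, "delta_T_C", "note"])
            writer.writeheader()
            for i in range(5):                           # a real run would loop until stopped
                writer.writerow(poll_once("nominal" if i else "run started"))
                time.sleep(1.0)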

  18. An Evaluation of Classroom Activities and Exercises in ELT Classroom for General Purposes Course

    Science.gov (United States)

    Zohrabi, Mohammad

    2011-01-01

    It is through effective implementation of activities and exercises which students can be motivated and consequently lead to language learning. However, as an insider, the experience of teaching English for General Purposes (EGP) course indicates that it has some problems which need to be modified. In order to evaluate the EGP course,…

  19. COTS-based OO-component approach for software inter-operability and reuse (software systems engineering methodology)

    Science.gov (United States)

    Yin, J.; Oyaki, A.; Hwang, C.; Hung, C.

    2000-01-01

    The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology using a Commercial-Off-The-Shelf (COTS)-based object-oriented component approach to open inter-operable software development and software reuse.

  20. 21 CFR 880.6890 - General purpose disinfectants.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) MEDICAL DEVICES GENERAL HOSPITAL AND PERSONAL USE DEVICES General Hospital and Personal Use... disinfectant is a germicide intended to process noncritical medical devices and equipment surfaces. A general... prior to terminal sterilization or high level disinfection. Noncritical medical devices make only...

  1. Design and validation of Segment - freely available software for cardiovascular image analysis

    International Nuclear Information System (INIS)

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-01

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  2. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were......For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...... experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting...

  3. Purpose of the workshop

    International Nuclear Information System (INIS)

    Brunner, H.

    1998-01-01

    The main purpose of the Workshop is to share the experience on emergency data management and to review various conceptual, technical, organisational and operational aspects and problems. The problems posed by hardware and software, the interplay of software developers and users/operators and the positive and negative experiences both from development and operation of data management systems are discussed. Emergency data management systems and their demonstrations are divided into four classes of possible applications: video games, training and simulation systems, 'history writing' = post-event analysis and documentation systems, real-time operational systems. (author)

  4. Knowledge Management Systems as an Interdisciplinary Communication and Personalized General-Purpose Technology

    Directory of Open Access Journals (Sweden)

    Ulrich Schmitt

    2015-10-01

    Full Text Available As drivers of human civilization, Knowledge Management (KM) processes have co-evolved in line with General-Purpose-Technologies (GPT), such as writing, printing, and information and communication systems. As evidenced by the recent shift from information scarcity to abundance, GPTs are capable of drastically altering societies due to their game-changing impact on our spheres of work and personal development. This paper looks at the prospect of whether a novel Personal Knowledge Management (PKM) concept supported by a prototype system has got what it takes to grow into a transformative General-Purpose-Technology. Following up on a series of papers, the KM scenario of a decentralizing revolution where individuals and self-organized groups yield more power and autonomy is examined according to a GPT's essential characteristics, including a wide scope for improvement and elaboration (in people's private, professional and societal life), applicability across a broad range of uses in a wide variety of products and processes (in multi-disciplinary educational and work contexts), and strong complementarities with existing or potential new technologies (like organizational KM Systems and a proposed World Heritage of Memes Repository). The result portrays the PKM concept as a strong candidate due to its personal, autonomous, bottom-up, collaborative, interdisciplinary, and creativity-supporting approach destined to advance the availability, quantity, and quality of the world extelligence and to allow for a wider sharing and faster diffusion of ideas across current disciplinary and opportunity divides.

  5. Application of the thermal efficiency analysis software 'EgWin' at existing power plants

    International Nuclear Information System (INIS)

    Koda, E.; Takahashi, T.; Nakao, Y.

    2008-01-01

    'EgWin' is general-purpose software for analyzing the thermal efficiency of power systems, developed at CRIEPI. The software has been used to analyze more than 30 existing power generation units, and its effectiveness has been confirmed. In thermal power plants, it was used to clarify the factors behind decreases in thermal efficiency and to quantitatively estimate the influence of each factor on the thermal efficiency of the plant. It was also used to quantitatively estimate the effect of operating condition changes and facility remodeling in thermal, nuclear, and geothermal power plants. (author)

  6. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  7. Improving Software Performance in the Compute Unified Device Architecture

    Directory of Open Access Journals (Sweden)

    Alexandru PIRJAN

    2010-01-01

    Full Text Available This paper analyzes several aspects regarding the improvement of software performance for applications written in the Compute Unified Device Architecture (CUDA). We address an issue of great importance when programming a CUDA application: the Graphics Processing Unit’s (GPU’s) memory management through transpose kernels. We also benchmark and evaluate the performance of progressively optimizing a matrix transposition application in CUDA. One particular interest was to research how well the optimization techniques, applied to a software application written in CUDA, scale to the latest generation of general-purpose graphics processing units (GPGPU), like the Fermi architecture implemented in the GTX480 and the previous architecture implemented in the GTX280. Lately, there has been a lot of interest in the literature in this type of optimization analysis, but none of the works so far (to the best of our knowledge) tried to validate whether the optimizations apply to a GPU from the latest Fermi architecture and how well the Fermi architecture scales to these software performance improving techniques.
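
    The classic memory-management optimization for a CUDA transpose is to stage a tile of the matrix in shared memory so that both the global read and the global write are coalesced, with one column of padding to avoid shared-memory bank conflicts. The sketch below shows that technique using Numba's CUDA support from Python; it is a generic illustration of the optimization discussed above, not code from the paper, and the matrix size is arbitrary.

        # Tiled, shared-memory matrix transpose on the GPU (Numba CUDA).
        import numpy as np
        from numba import cuda, float32

        TILE = 32

        @cuda.jit
        def transpose_tiled(a, out):
            # Stage a TILE x TILE block in shared memory; the extra column avoids bank conflicts.
            tile = cuda.shared.array(shape=(TILE, TILE + 1), dtype=float32)

            x = cuda.blockIdx.x * TILE + cuda.threadIdx.x    # column of a (coalesced read)
            y = cuda.blockIdx.y * TILE + cuda.threadIdx.y    # row of a
            if x < a.shape[1] and y < a.shape[0]:
                tile[cuda.threadIdx.y, cuda.threadIdx.x] = a[y, x]
            cuda.syncthreads()

            x = cuda.blockIdx.y * TILE + cuda.threadIdx.x    # column of out (coalesced write)
            y = cuda.blockIdx.x * TILE + cuda.threadIdx.y    # row of out
            if x < out.shape[1] and y < out.shape[0]:
                out[y, x] = tile[cuda.threadIdx.x, cuda.threadIdx.y]

        a = np.random.rand(2048, 1024).astype(np.float32)
        d_a = cuda.to_device(a)
        d_out = cuda.device_array((a.shape[1], a.shape[0]), dtype=np.float32)
        grid = ((a.shape[1] + TILE - 1) // TILE, (a.shape[0] + TILE - 1) // TILE)
        transpose_tiled[grid, (TILE, TILE)](d_a, d_out)
        assert np.array_equal(d_out.copy_to_host(), a.T)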

  8. Use of general purpose graphics processing units with MODFLOW

    Science.gov (United States)

    Hughes, Joseph D.; White, Jeremy T.

    2013-01-01

    To evaluate the use of general-purpose graphics processing units (GPGPUs) to improve the performance of MODFLOW, an unstructured preconditioned conjugate gradient (UPCG) solver has been developed. The UPCG solver uses a compressed sparse row storage scheme and includes Jacobi, zero fill-in incomplete, and modified-incomplete lower-upper (LU) factorization, and generalized least-squares polynomial preconditioners. The UPCG solver also includes options for sequential and parallel solution on the central processing unit (CPU) using OpenMP. For simulations utilizing the GPGPU, all basic linear algebra operations are performed on the GPGPU; memory copies between the CPU and GPGPU occur prior to the first iteration of the UPCG solver and after satisfying head and flow criteria or exceeding a maximum number of iterations. The efficiency of the UPCG solver for GPGPU and CPU solutions is benchmarked using simulations of a synthetic, heterogeneous unconfined aquifer with tens of thousands to millions of active grid cells. Testing indicates GPGPU speedups on the order of 2 to 8, relative to the standard MODFLOW preconditioned conjugate gradient (PCG) solver, can be achieved when (1) memory copies between the CPU and GPGPU are optimized, (2) the percentage of time performing memory copies between the CPU and GPGPU is small relative to the calculation time, (3) high-performance GPGPU cards are utilized, and (4) CPU-GPGPU combinations are used to execute sequential operations that are difficult to parallelize. Furthermore, UPCG solver testing indicates GPGPU speedups exceed parallel CPU speedups achieved using OpenMP on multicore CPUs for preconditioners that can be easily parallelized.
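
    The UPCG solver described above couples a compressed sparse row (CSR) matrix with preconditioned conjugate gradients. The CPU-side sketch below reproduces that structure with SciPy, using the simplest of the listed preconditioners (Jacobi, i.e. the inverse diagonal) on a 2-D Poisson matrix that stands in for a groundwater-flow system; it mirrors the solver's algorithmic outline only, not MODFLOW's implementation or the GPGPU kernels.

        # Jacobi-preconditioned conjugate gradients on a CSR matrix (CPU sketch).
        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import LinearOperator, cg

        n = 200                                              # n x n cell grid -> 40,000 unknowns
        I = sp.identity(n)
        T = sp.diags([-1.0, 4.0, -1.0], [-1, 0, 1], shape=(n, n))
        S = sp.diags([-1.0, -1.0], [-1, 1], shape=(n, n))
        A = (sp.kron(I, T) + sp.kron(S, I)).tocsr()          # 2-D Poisson matrix in CSR storage
        b = np.ones(A.shape[0])

        inv_diag = 1.0 / A.diagonal()                        # Jacobi preconditioner: M^-1 = diag(A)^-1
        M = LinearOperator(A.shape, matvec=lambda r: inv_diag * r)

        iterations = [0]
        def count(_xk):
            iterations[0] += 1

        x, info = cg(A, b, M=M, callback=count)
        residual = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
        print("info =", info, "iterations =", iterations[0], "relative residual =", residual)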

  9. Development of an engine system simulation software package - ESIM

    Energy Technology Data Exchange (ETDEWEB)

    Erlandsson, Olof

    2000-10-01

    A software package, ESIM, is developed for simulating internal combustion engine systems, including models for the engine, manifolds, turbocharger, charge-air cooler (intercooler) and inlet air heater. This study focuses on the thermodynamic treatment and methods used in the models. It also includes some examples of system simulations made with these models for validation purposes. The engine model can be classified as a zero-dimensional, single-zone model. It includes calculation of the valve flow process, models for heat release and models for in-cylinder, exhaust port and manifold heat transfer. Models are developed for handling turbocharger performance and charge air cooler characteristics. The main purpose of the project related to this work is to use the ESIM software to study the heat balance and performance of homogeneous charge compression ignition (HCCI) engine systems. A short description of the HCCI engine is therefore included, pointing out the difficulties, or challenges, regarding the HCCI engine from a system perspective. However, the relations given here, and the code itself, are quite general, making it possible to use these models to simulate spark-ignited as well as direct-injected engines.

  10. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    Science.gov (United States)

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  11. Medical device software: defining key terms.

    Science.gov (United States)

    Pashkov, Vitalii; Gutorova, Nataliya; Harkusha, Andrii

    One of the areas of significant growth in medical devices has been the role of software - as an integral component of a medical device, as a standalone device and, more recently, as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is not in itself a criterion for its qualification, or not, as a medical device. It is therefore necessary to clarify the criteria for the qualification of standalone software as a medical device. Materials and methods: Ukrainian, European Union and United States legislation, guidelines developed by the European Commission and the Food and Drug Administration, recommendations of international voluntary groups, and scientific works. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. The legal regulation of software used for medical purposes in Ukraine is limited to a single definition. In the European Union and the United States, special guidelines have been developed and applied that help developers, manufacturers and end users to distinguish types of software according to medical-purpose criteria. Software is becoming more and more incorporated into medical devices. Developers and manufacturers may not initially appreciate the potential risks to patients and users; such a situation could have dangerous consequences for patients or users. It is necessary to develop and adopt legislation that defines the criteria for the qualification of medical device software and the application of classification criteria to such software, provides illustrative examples, and gives step-by-step recommendations for qualifying software as a medical device.

  12. Software-based acoustical measurements

    CERN Document Server

    Miyara, Federico

    2017-01-01

    This textbook provides a detailed introduction to the use of software in combination with simple and economical hardware (a sound level meter with calibrated AC output and a digital recording system) to obtain sophisticated measurements usually requiring expensive equipment. It emphasizes the use of free, open source, and multiplatform software. Many commercial acoustical measurement systems use software algorithms as an integral component; however the methods are not disclosed. This book enables the reader to develop useful algorithms and provides insight into the use of digital audio editing tools to document features in the signal. Topics covered include acoustical measurement principles, in-depth critical study of uncertainty applied to acoustical measurements, digital signal processing from the basics, and metrologically-oriented spectral and statistical analysis of signals. The student will gain a deep understanding of the use of software for measurement purposes; the ability to implement software-based...
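
    As a small, illustrative computation of the kind such software-based measurements build on (not an excerpt from the book), the equivalent sound pressure level of a calibrated digital recording reduces to an RMS value and a logarithm; the calibration constant below is a hypothetical example:

        import numpy as np

        P_REF = 20e-6  # reference pressure in pascals (20 micropascals)

        def sound_pressure_level(samples, pa_per_unit):
            """Equivalent (RMS) sound pressure level of a calibrated recording.

            samples      -- digital samples (dimensionless full-scale units)
            pa_per_unit  -- calibration constant: pascals per digital unit,
                            obtained e.g. from a calibrator tone of known level
            """
            p = samples * pa_per_unit
            p_rms = np.sqrt(np.mean(p ** 2))
            return 20.0 * np.log10(p_rms / P_REF)

        # A 1 kHz tone whose calibrated amplitude is 1 Pa RMS -> about 94 dB SPL.
        t = np.arange(48000) / 48000.0
        tone = np.sqrt(2) * np.sin(2 * np.pi * 1000 * t)
        print(round(sound_pressure_level(tone, pa_per_unit=1.0), 1))  # ~94.0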

  13. General guidelines for biomedical software development [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luis Bastiao Silva

    2017-03-01

    Full Text Available Most bioinformatics tools available today were not written by professional software developers, but by people that wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic.

  14. General guidelines for biomedical software development [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luis Bastiao Silva

    2017-07-01

    Full Text Available Most bioinformatics tools available today were not written by professional software developers, but by people that wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic.

  15. Presentation of the ASTRAL software

    International Nuclear Information System (INIS)

    2011-01-01

    This report presents the ASTRAL software (ASTRAL means technical assistance in post-accidental radioprotection) which is intended to be used as a decision-aid tool in the case of a large release of radionuclides into the environment, by assessing radionuclide concentrations in different environments and food products, determining the potential exposure to irradiation of the concerned populations, foreseeing the evolution of the situation, and proposing different scenarios for the management of contaminated areas. The report describes the software's general operation, presents the calculation module (main functionalities, concentration index calculation, dose calculation or radiological impact calculation, how countermeasures are taken into account), the data bases (contextual data, data for radio-ecological calculations and for radiological calculations), and the software ergonomics (general principles, result selection and display, result printing and input data). It briefly evokes the development quality assurance, and describes the software implementation architecture

  16. High vacuum general purpose scattering chamber for nuclear reaction study

    International Nuclear Information System (INIS)

    Suresh Kumar; Ojha, S.C.

    2003-01-01

    To study the nuclear reactions induced by beams from medium energy accelerators, one of the most common facilities required is a scattering chamber. In the scattering chamber, the projectile collides with the target nucleus and the scattered reaction products are detected with various types of nuclear detectors at different angles with respect to the beam. The experiments are performed under high vacuum to minimize background reactions and the energy losses of the charged particles. To make the chamber general purpose, various requirements of the experiments are incorporated into it. Changing targets and changing the angles of the various detectors while under vacuum are the most desired features. Other features, like ascertaining the beam spot size and position on the target, minimizing background counts with a proper beam dump, and accurate positioning of the detectors as planned, are some of the important requirements

  17. The RHIC general purpose multiplexed analog to digital converter system

    International Nuclear Information System (INIS)

    Michnoff, R.

    1995-01-01

    A general purpose multiplexed analog to digital converter system is currently under development to support acquisition of analog signals for the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory. The system consists of a custom intelligent VME based controller module (V113) and a 14-bit 64 channel multiplexed A/D converter module (V114). The design features two independent scan groups, where one scan group is capable of acquiring 64 channels at 60 Hz, concurrently with the second scan group acquiring data at an aggregate rate of up to 80 k samples/second. An interface to the RHIC serially encoded event line is used to synchronize acquisition. Data is stored in a circular static RAM buffer on the controller module, then transferred to a commercial VMEbus CPU board and higher level workstations for plotting, report generation, analysis and storage

  18. Using a cognitive architecture for general purpose service robot control

    Science.gov (United States)

    Puigbo, Jordi-Ysard; Pumarola, Albert; Angulo, Cecilio; Tellez, Ricardo

    2015-04-01

    A humanoid service robot equipped with a set of simple action skills including navigating, grasping, and recognising objects or people, among others, is considered in this paper. By using those skills the robot should complete a voice command expressed in natural language encoding a complex task (defined as the concatenation of a number of those basic skills). As a main feature, no traditional planner has been used to decide which skills should be activated, or in which sequence. Instead, the SOAR cognitive architecture acts as the reasoner by selecting which action the robot should complete, addressing it towards the goal. Our proposal allows new goals to be included for the robot just by adding new skills (without the need to encode new plans). The proposed architecture has been tested on a human-sized humanoid robot, REEM, acting as a general purpose service robot.

  19. Software needs engineering - a position paper

    OpenAIRE

    GRIMSON, JANE BARCLAY

    2000-01-01

    When the general press refers to `software' in its headlines, then this is often not to relate a success story, but to expand on yet another `software-risk-turned-problem-story'. For many people, the term `software' evokes the image of an application package running either on a PC or some similar stand-alone usage. Over 70% of all software, however, is not developed in the traditional software houses as part of the creation of such packages. Much of this software comes in the fo...

  20. Interfacial Properties of EXXPRO(TM) and General Purpose Elastomers

    Science.gov (United States)

    Zhang, Y.; Rafailovich, M.; Sokolov, Jon; Qu, S.; Ge, S.; Ngyuen, D.; Li, Z.; Peiffer, D.; Song, L.; Dias, J. A.; McElrath, K. O.

    1998-03-01

    EXXPRO(Trademark) elastomers are used for tires and many other applications. This elastomer (denoted as BIMS) is a random copolymer of p-methylstyrene (MS) and polyisobutylene (I) with varying degrees of PMS content and bromination (B) on the p-methyl group. BIMS is impermeable to gases, and has good heat, ozone and flex resistance. Very often general purpose elastomers are blended with BIMS. The interfacial width between polybutadiene and BIMS is a sensitive function of the Br level and PMS content. By neutron reflectivity (NR), we studied the dynamics of interface formation as a function of time and temperature for BIMS with varying degrees of PMS and Br. We found that in addition to the bulk parameters, the total film thickness and the proximity of an interactive surface can affect the interfacial interaction rates. The interfacial properties can also be modified by inclusion of particles, such as carbon black (a filler component in tire rubbers). Results will be presented on the relation between the interfacial width as measured by NR and compatibilization studies via AFM and LFM.

  1. Foam A General Purpose Cellular Monte Carlo Event Generator

    CERN Document Server

    Jadach, Stanislaw

    2003-01-01

    A general purpose, self-adapting, Monte Carlo (MC) event generator (simulator) is described. The high efficiency of the MC, that is, a small maximum weight or variance of the MC weight, is achieved by dividing the integration domain into small cells. The cells can be $n$-dimensional simplices, hyperrectangles or Cartesian products of them. The grid of cells, called ``foam'', is produced in the process of the binary split of the cells. The choice of the next cell to be divided and the position/direction of the division hyper-plane is driven by the algorithm which optimizes the ratio of the maximum weight to the average weight or (optionally) the total variance. The algorithm is able to deal, in principle, with an arbitrary pattern of the singularities in the distribution. As with any MC generator, it can also be used for MC integration. With the typical personal computer CPU, the program is able to perform adaptive integration/simulation at a relatively small number of dimensions ($\leq 16$). With the continu...
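
    The Foam code itself is not shown here; the following is a drastically simplified, one-dimensional sketch of the cell-splitting idea it describes: cells are repeatedly bisected, always splitting the cell whose sampled maximum-to-average weight ratio (weighted by its integral estimate) is worst, and the resulting grid is then used for integration. All function names and parameters are illustrative assumptions:

        import random

        def adaptive_cells(f, n_cells=32, probes=64):
            """Build a 1D 'foam' of cells over [0, 1) by greedy bisection.

            Each cell stores an estimate of max(f) and mean(f) over the cell;
            the cell with the largest max/mean ratio times its integral
            estimate is split next, loosely mimicking the binary split driven
            by the maximum-to-average weight ratio.
            """
            def probe(a, b):
                xs = [random.uniform(a, b) for _ in range(probes)]
                vals = [f(x) for x in xs]
                mean = sum(vals) / len(vals)
                return {"a": a, "b": b, "mean": mean, "max": max(vals),
                        "integral": mean * (b - a)}

            cells = [probe(0.0, 1.0)]
            while len(cells) < n_cells:
                worst = max(cells, key=lambda c: (c["max"] / c["mean"]) * c["integral"]
                            if c["mean"] > 0 else 0.0)
                cells.remove(worst)
                mid = 0.5 * (worst["a"] + worst["b"])
                cells += [probe(worst["a"], mid), probe(mid, worst["b"])]
            return cells

        def estimate_integral(cells):
            return sum(c["integral"] for c in cells)

        random.seed(1)
        peaked = lambda x: 1.0 / (1e-3 + (x - 0.7) ** 2)   # sharply peaked integrand
        print(round(estimate_integral(adaptive_cells(peaked)), 1))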

  2. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell System

  3. The Software Point of View

    CERN Multimedia

    Bentvelsen, Stan

    Physics was meant to be the topic of the LUND workshop, but it was software that dominated throughout the week. ATLAS' new software proves a tough nut to crack. Many presented physics analyses were repetitions of the Physics TDR, using (or trying to use) the new software. The main purpose was to demonstrate that results were invariant with the new software. Hard to prove, since the 'new software' is not yet completed. Nevertheless, the software existing so far is being experimented with extensively throughout the entire ATLAS community. So what does this "new software" really mean? The answer depends strongly on the person who deals with this question. The scope varies between "migrating from Fortran77 to C++" and "learn Object Oriented approach" to "implementing services and algorithms in Athena" or "feeding detector description in FADS". Clearly the new software primarily involves migrating from the old Fortran code to C++ with its object-orientation paradigm. Occasionally there was a hint that this migration is not c...

  4. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-10-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. The basics of microprogramming and new microcircuits have already been discussed. In this course, the methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. 15 figures, 2 tables

  5. Annotated bibliography of Software Engineering Laboratory literature

    Science.gov (United States)

    Morusiewicz, Linda; Valett, Jon D.

    1991-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  6. Design of General-purpose Industrial signal acquisition system in a large scientific device

    Science.gov (United States)

    Ren, Bin; Yang, Lei

    2018-02-01

    In order to measure the industrial signals of a large scientific device experiment, a general-purpose industrial data acquisition system has been designed. It can collect 4~20 mA current signals and 0~10 V voltage signals. Practical experiments show that the system is flexible, reliable, convenient and economical, and that it has high resolution and strong anti-interference ability. Thus, the system fully meets the design requirements.
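
    As a worked example of the signal ranges mentioned (the numbers below are illustrative, not from the paper), converting a 4~20 mA current-loop reading into an engineering value is a linear mapping, with readings well below 4 mA treated as a broken loop:

        def current_to_value(i_ma, lo, hi):
            """Map a 4-20 mA loop current to an engineering value in [lo, hi]."""
            if i_ma < 3.8:  # a reading well under 4 mA usually means a broken loop
                raise ValueError("loop current %.2f mA: sensor or wiring fault" % i_ma)
            return lo + (hi - lo) * (i_ma - 4.0) / 16.0

        # Example: a hypothetical 0-10 bar pressure transmitter reading 12 mA.
        print(current_to_value(12.0, lo=0.0, hi=10.0))  # 5.0 bar (mid-scale)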

  7. Software methodologies for the SSC

    International Nuclear Information System (INIS)

    Loken, S.C.

    1990-01-01

    This report describes some of the considerations that will determine how software will be developed for the SSC. The author begins with a review of the general computing problem for SSC experiments and recent experiences in software engineering for the present generation of experiments. This leads to a discussion of the software technologies that will be critical for the SSC experiments. He describes the emerging software standards and commercial products that may be useful in addressing the SSC needs. He concludes with some comments on how collaborations and the SSC Lab should approach the software development issue

  8. IDES: Interactive Data Entry System: a generalized data acquisition software package

    International Nuclear Information System (INIS)

    Gasser, S.B.

    1980-04-01

    The Interactive Data Entry System (IDES) is a software package which greatly assists in designing and storing forms to be used for the directed acquisition of data. Objective of this package is to provide a viable man/machine interface to any comprehensive data base. This report provides a technical description of the software and can be used as a user's manual

  9. Exploiting Software Tool Towards Easier Use And Higher Efficiency

    Science.gov (United States)

    Lin, G. H.; Su, J. T.; Deng, Y. Y.

    2006-08-01

    In developing countries, making maximum use of data from instruments built in-house is very important. It is related not only to maximizing the science return on the earlier investment (deep accumulations in every aspect) but also to science output. Based on this idea, we are developing a software tool (called THDP: Tool of Huairou Data Processing). It is used for handling a series of issues that arise in processing the data. This paper discusses its design purpose, functions, methods and special features. The primary vehicle for general data interpretation is various data visualization and interactive techniques. In the software we employed an object-oriented approach, which is appropriate to this vehicle; it is imperative that the approach provide not only the functionality but do so in as convenient a fashion as possible. As a result of this development, it is not only easier for beginners to learn data processing and more convenient for senior users to make further improvements, but efficiency is also greatly increased in every phase, including analysis, parameter adjustment and result display. Within the framework of the virtual observatory, developing countries should study more of the newer related technologies that can advance the ability and efficiency of scientific research, like the software we are developing

  10. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of the digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For the reliability estimation of safety-critical software (the software that is used in safety-critical digital systems), Bayesian Belief Networks (BBNs) seem to be the most widely used approach. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can use a process of directly estimating the reliability of the software using various software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally known that software reliability growth models cannot be applied to safety-critical software due to the small number of failure data expected from the testing of safety-critical software, we try to find the possibilities and corresponding limitations of applying software reliability growth models to safety-critical software
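
    As a minimal sketch of the kind of direct estimation mentioned above, the Goel-Okumoto NHPP mean value function m(t) = a(1 - exp(-bt)) can be fitted to cumulative failure counts with least squares; the data and the use of SciPy's curve_fit below are illustrative assumptions, not the paper's procedure:

        import numpy as np
        from scipy.optimize import curve_fit

        def goel_okumoto(t, a, b):
            """Expected cumulative number of failures by time t."""
            return a * (1.0 - np.exp(-b * t))

        # Hypothetical test data: cumulative failures observed at weekly intervals.
        t = np.arange(1, 11, dtype=float)
        failures = np.array([5, 9, 13, 15, 18, 19, 21, 22, 22, 23], dtype=float)

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, t, failures, p0=[30.0, 0.1])
        print("expected total faults a =", round(a_hat, 1))
        print("remaining faults ~", round(a_hat - failures[-1], 1))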

  11. APL/JHU free flight tests of the General Purpose Heat Source module. Testing: 5-7 March 1984

    International Nuclear Information System (INIS)

    Baker, W.M. II.

    1984-01-01

    Purpose of the test was to obtain statistical information on the dynamics of the General Purpose Heat Source (GPHS) module at terminal speeds. Models were designed to aerodynamically and dynamically represent the GPHS module. Normal and high speed photographic coverage documented the motion of the models. This report documents test parameters and techniques for the free-spin tests. It does not include data analysis

  12. Development of a large-scale general purpose two-phase flow analysis code

    International Nuclear Information System (INIS)

    Terasaka, Haruo; Shimizu, Sensuke

    2001-01-01

    A general purpose three-dimensional two-phase flow analysis code has been developed for solving large-scale problems in industrial fields. The code uses a two-fluid model to describe the conservation equations for two-phase flow in order to be applicable to various phenomena. Complicated geometrical conditions are modeled by FAVOR method in structured grid systems, and the discretization equations are solved by a modified SIMPLEST scheme. To reduce computing time a matrix solver for the pressure correction equation is parallelized with OpenMP. Results of numerical examples show that the accurate solutions can be obtained efficiently and stably. (author)

  13. Genten: Software for Generalized Tensor Decompositions v. 1.0.0

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-22

    Tensors, or multidimensional arrays, are a powerful mathematical means of describing multiway data. This software provides computational means for decomposing or approximating a given tensor in terms of smaller tensors of lower dimension, focusing on decomposition of large, sparse tensors. These techniques have applications in many scientific areas, including signal processing, linear algebra, computer vision, numerical analysis, data mining, graph analysis, neuroscience and more. The software is designed to take advantage of the parallelism present in emerging computer architectures such as multi-core CPUs, many-core accelerators such as the Intel Xeon Phi, and computation-oriented GPUs to enable efficient processing of large tensors.

  14. INGEN: a general-purpose mesh generator for finite element codes

    International Nuclear Information System (INIS)

    Cook, W.A.

    1979-05-01

    INGEN is a general-purpose mesh generator for two- and three-dimensional finite element codes. The basic parts of the code are surface and three-dimensional region generators that use linear-blending interpolation formulas. These generators are based on an i, j, k index scheme that is used to number nodal points, construct elements, and develop displacement and traction boundary conditions. This code can generate truss elements (2 nodal points); plane stress, plane strain, and axisymmetric two-dimensional continuum elements (4 to 8 nodal points); plate elements (4 to 8 nodal points); and three-dimensional continuum elements (8 to 21 nodal points). The traction loads generated are consistent with the elements generated. The expansion--contraction option is of special interest. This option makes it possible to change an existing mesh such that some regions are refined and others are made coarser than the original mesh. 9 figures
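
    The linear-blending interpolation formulas mentioned above are essentially transfinite (Coons-type) interpolation; the following is a minimal 2D sketch of that formula, not INGEN's implementation, filling a region of points from its four boundary curves:

        import numpy as np

        def linear_blend(bottom, top, left, right, ni, nj):
            """Fill an ni x nj grid of points from four boundary curves.

            bottom(u), top(u), left(v), right(v) return (x, y) for u, v in [0, 1];
            the corners of the curves are assumed to match.
            """
            grid = np.zeros((ni, nj, 2))
            p00, p10 = np.array(bottom(0.0)), np.array(bottom(1.0))
            p01, p11 = np.array(top(0.0)), np.array(top(1.0))
            for i, u in enumerate(np.linspace(0.0, 1.0, ni)):
                for j, v in enumerate(np.linspace(0.0, 1.0, nj)):
                    edge = ((1 - v) * np.array(bottom(u)) + v * np.array(top(u))
                            + (1 - u) * np.array(left(v)) + u * np.array(right(v)))
                    corner = ((1 - u) * (1 - v) * p00 + u * (1 - v) * p10
                              + (1 - u) * v * p01 + u * v * p11)
                    grid[i, j] = edge - corner
            return grid

        # Unit square with a curved top edge.
        bottom = lambda u: (u, 0.0)
        top = lambda u: (u, 1.0 + 0.2 * np.sin(np.pi * u))
        left = lambda v: (0.0, v)
        right = lambda v: (1.0, v)
        mesh = linear_blend(bottom, top, left, right, ni=9, nj=9)
        print(mesh[4, 8])  # midpoint of the curved top edge: ~(0.5, 1.2)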

  15. Software as a service approach to sensor simulation software deployment

    Science.gov (United States)

    Webster, Steven; Miller, Gordon; Mayott, Gregory

    2012-05-01

    Traditionally, military simulation has been problem domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute for the purpose at hand. This approach leads to rigid system integrations which require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS) predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, enabled and managed system of simulations yields a durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on demand availability to connected users, decrease integration costs and timelines, and benefit the domain community from immediate deployment of lessons learned.

  16. General Purpose Probabilistic Programming Platform with Effective Stochastic Inference

    Science.gov (United States)

    2018-04-01

    [The record text for this report contains only front-matter residue rather than an abstract. Recoverable fragments: Figure 1, "The problem of inferring curves from data while simultaneously choosing the..."; a caption describing inference (bottom path) as the inverse problem to computer graphics (top path); Figure 18, "An illustration of generative probabilistic graphics for 3D..."; and the statement that building these systems involves simultaneously developing mathematical models, inference algorithms and optimized software implementations.]

  17. Archival standards, in archival open access software And offer appropriate software for internal archival centers

    Directory of Open Access Journals (Sweden)

    Abdolreza Izadi

    2016-12-01

    Full Text Available The purpose of this study is to examine descriptive metadata standards in archival open source software, to determine the most appropriate descriptive metadata standard(s), and to identify the software that supports these standards. The approach of the present study is a combination of library research, the Delphi method and a descriptive survey. Data gathering used fiches (note cards) in the library study, a questionnaire in the Delphi method and a checklist in the descriptive survey. The statistical population contains 5 archival open source software packages. The findings suggest that 5 metadata standards, consisting of EAD, ISAD, EAC-CPF, ISAAR and ISDF, were judged by the Delphi panel members to be the most appropriate descriptive metadata standards to use for archival software. Moreover, ICA-ATOM and Archivist toolkit, in terms of their support for the suitable standards, were judged the most appropriate archival software.

  18. Firing Room Remote Application Software Development

    Science.gov (United States)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and checkout LACC software for Launch Equipment Test Facility (LETF) testing.

  19. A low cost general purpose portable programmable master/slave manipulative appliance

    International Nuclear Information System (INIS)

    Cameron, W.

    1984-01-01

    The TRIUMF 100 μA 500 MeV cyclotron, located at the University of British Columbia, required a low cost, portable master/slave manipulative capability for experimental beam line servicing. A programmable capability was also required for the hot cell manipulators. A general purpose unit was developed that might also have applications in light manufacturing and medical rehabilitation. The project now in prototype testing represents a modular portable robot costing less than $5000 that is lead-through-teach programmable by either a master controller or hands-on lead-through. Task programs are stored and retrieved on any 32 k personal computer. An on-board proportional integral derivative controller (Motorola 6809 based) gives discrete positioning of the six degrees of freedom 2 kg capacity end effector
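
    The appliance's on-board proportional integral derivative controller is not documented in the record; as a generic illustration of what such a controller computes, a textbook discrete PID update looks like the sketch below, with purely hypothetical gains, loop period and plant:

        class PID:
            """Textbook discrete PID controller (not the appliance's firmware)."""

            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral = 0.0
                self.prev_error = 0.0

            def update(self, setpoint, measurement):
                error = setpoint - measurement
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return (self.kp * error + self.ki * self.integral
                        + self.kd * derivative)

        # One joint tracking a 90-degree setpoint with a very crude plant model
        # in which the joint rate simply follows the commanded output.
        pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
        angle = 0.0
        for _ in range(5000):
            angle += pid.update(90.0, angle) * 0.01
        print(round(angle, 1))  # settles at ~90.0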

  20. Health software: a new CEI Guide for software management in medical environment.

    Science.gov (United States)

    Giacomozzi, Claudia; Martelli, Francesco

    2016-01-01

    The increasing spread of software components in the healthcare context renders explanatory guides relevant and mandatory to interpret laws and standards, and to support safe management of software products in healthcare. In 2012 a working group was established for the above purposes at the Italian Electrotechnical Committee (CEI), composed of experts from the Italian National Institute of Health (ISS), representatives of industry, and representatives of healthcare organizations. As a first outcome of the group activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, either medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with interpretative support and many detailed examples to facilitate the proper contextualization and management of health software, in compliance with related European and international regulations and standards.

  1. BALTORO a general purpose code for coupling discrete ordinates and Monte-Carlo radiation transport calculations

    International Nuclear Information System (INIS)

    Zazula, J.M.

    1983-01-01

    The general purpose code BALTORO was written for coupling the three-dimensional Monte-Carlo /MC/ with the one-dimensional Discrete Ordinates /DO/ radiation transport calculations. The quantity of a radiation-induced /neutrons or gamma-rays/ nuclear effect or the score from a radiation-yielding nuclear effect can be analysed in this way. (author)

  2. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  3. 48 CFR 212.7003 - Technical data and computer software.

    Science.gov (United States)

    2010-10-01

    ... computer software. 212.7003 Section 212.7003 Federal Acquisition Regulations System DEFENSE ACQUISITION... data and computer software. For purposes of establishing delivery requirements and license rights for technical data under 227.7102 and for computer software under 227.7202, there shall be a rebuttable...

  4. Solid Waste Information and Tracking System (SWITS) Software Requirements Specification

    International Nuclear Information System (INIS)

    MAY, D.L.

    2000-01-01

    This document is the primary document establishing requirements for the Solid Waste Information and Tracking System (SWITS) as it is converted to a client-server architecture. The purpose is to provide the customer and the performing organizations with the requirements for the SWITS in the new environment. This Software Requirement Specification (SRS) describes the system requirements for the SWITS Project, and follows the PHMC Engineering Requirements, HNF-PRO-1819, and Computer Software Quality Assurance Requirements, HNF-PRO-309, policies. This SRS includes sections on general description, specific requirements, references, appendices, and index. The SWITS system defined in this document stores information about the solid waste inventory on the Hanford site. Waste is tracked as it is generated, analyzed, shipped, stored, and treated. In addition to inventory reports a number of reports for regulatory agencies are produced

  5. Solid Waste Information and Tracking System (SWITS) Software Requirements Specification

    Energy Technology Data Exchange (ETDEWEB)

    MAY, D.L.

    2000-03-22

    This document is the primary document establishing requirements for the Solid Waste Information and Tracking System (SWITS) as it is converted to a client-server architecture. The purpose is to provide the customer and the performing organizations with the requirements for the SWITS in the new environment. This Software Requirement Specification (SRS) describes the system requirements for the SWITS Project, and follows the PHMC Engineering Requirements, HNF-PRO-1819, and Computer Software Quality Assurance Requirements, HNF-PRO-309, policies. This SRS includes sections on general description, specific requirements, references, appendices, and index. The SWITS system defined in this document stores information about the solid waste inventory on the Hanford site. Waste is tracked as it is generated, analyzed, shipped, stored, and treated. In addition to inventory reports a number of reports for regulatory agencies are produced.

  6. Collected Software Engineering Papers, Volume 10

    Science.gov (United States)

    1992-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.

  7. Modular Software-Defined Radio

    Directory of Open Access Journals (Sweden)

    Rhiemeier Arnd-Ragnar

    2005-01-01

    Full Text Available In view of the technical and commercial boundary conditions for software-defined radio (SDR), it is suggestive to reconsider the concept anew from an unconventional point of view. The organizational principles of signal processing (rather than the signal processing algorithms themselves) are the main focus of this work on modular software-defined radio. Modularity and flexibility are just two key characteristics of the SDR environment which extend smoothly into the modeling of hardware and software. In particular, the proposed model of signal processing software includes irregular, connected, directed, acyclic graphs with random node weights and random edges. Several approaches for mapping such software to a given hardware are discussed. Taking into account previous findings as well as new results from system simulations presented here, the paper finally concludes with the utility of pipelining as a general design guideline for modular software-defined radio.

  8. Speech to Text Software Evaluation Report

    CERN Document Server

    Martins Santo, Ana Luisa

    2017-01-01

    This document compares the out-of-box performance of three commercially available speech recognition software packages: Vocapia VoxSigma TM, Google Cloud Speech, and Limecraft Transcriber. A set of evaluation criteria and test methods for speech recognition software is defined. The evaluation of these packages in noisy environments is also included for testing purposes. Recognition accuracy was compared across noisy environments and languages. Testing in the "ideal" non-noisy environment of a quiet room has also been performed for comparison.
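
    The report compares recognition accuracy but does not state the metric; a common choice for such comparisons is the word error rate, computed from a word-level edit distance between reference and hypothesis transcripts, so the following is only an illustrative sketch:

        def word_error_rate(reference, hypothesis):
            """WER = (substitutions + deletions + insertions) / reference length."""
            ref, hyp = reference.split(), hypothesis.split()
            # Levenshtein distance over words via dynamic programming.
            d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
            for i in range(len(ref) + 1):
                d[i][0] = i
            for j in range(len(hyp) + 1):
                d[0][j] = j
            for i in range(1, len(ref) + 1):
                for j in range(1, len(hyp) + 1):
                    cost = 0 if ref[i - 1] == hyp[j - 1] else 1
                    d[i][j] = min(d[i - 1][j] + 1,         # deletion
                                  d[i][j - 1] + 1,         # insertion
                                  d[i - 1][j - 1] + cost)  # substitution / match
            return d[len(ref)][len(hyp)] / len(ref)

        ref = "switch the magnet current to standby"
        hyp = "switch the magnet current standby"
        print(round(word_error_rate(ref, hyp), 3))  # one deletion out of six words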

  9. Software for microcircuit systems

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1978-01-01

    Modern Large Scale Integration (LSI) microcircuits are meant to be programmed in order to control the function that they perform. In the previous paper the author has already discussed the basics of microprogramming and have studied in some detail two types of new microcircuits. In this paper, methods of developing software for these microcircuits are explored. This generally requires a package of support software in order to assemble the microprogram, and also some amount of support software to test the microprograms and to test the microprogrammed circuit itself. (Auth.)

  10. Experience of application of the general-purpose pressure and pressure drop transformers on nitrogen tetroxide

    International Nuclear Information System (INIS)

    Grishchuk, M.Kh.

    1979-01-01

    Experience with the application of general-purpose pressure and pressure drop transformers at the Nuclear Power Engineering Institute of the BSSR Academy of Sciences for measurements on nitrogen tetroxide is described. Concrete recommendations are given on the types of transformers and the volume of preparatory work required before putting them into operation

  11. A NEW CONTROL CIRCUIT AND COMPUTER SOFTWARE FOR CONTROLING PHOTOVOLTAIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Mustafa Berkant SELEK

    2008-02-01

    Full Text Available In this study, a new microcontroller circuit was designed and new computer software was implemented to control the power flow currents of the renewable energy system established at the Solar Energy Institute, Ege University, Bornova, Izmir, Turkey. A PIC18F452 microcontroller-based electronic circuit was designed to control another electronic circuit that includes power electronic switching components. Readily available standard control circuits are designed for switching single-level inverters. In contrast, the implemented circuit allows multilevel inverters to be switched. In addition, because the efficiency of solar panels is considerably low, they should be operated at the maximum power point (MPP). Therefore, an MPP algorithm is included in the designed control circuit. The control circuit also includes a serial communication interface based on the RS232 standard. Using this interface enables the user to select all functions available in the control circuit and obtain status reports via the computer software. Finally, a general purpose command set was designed to establish communication between the computer software and the microcontroller-based control circuit. This study is intended to supply a basis for researchers who want to develop their own control circuits or more visual software.
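
    The abstract says an MPP algorithm is included in the control circuit but does not say which one; perturb-and-observe is a common choice, so the sketch below illustrates that family of algorithms generically and is not the PIC18F452 firmware. The panel model and all parameters are made up:

        def perturb_and_observe(read_voltage, read_current, set_voltage,
                                v_start=17.0, step=0.2, iterations=200):
            """Generic perturb-and-observe maximum power point tracker.

            Repeatedly nudges the operating voltage and keeps moving in the
            direction that increased the measured panel power.
            """
            v = v_start
            set_voltage(v)
            last_power = read_voltage() * read_current()
            direction = 1.0
            for _ in range(iterations):
                v += direction * step
                set_voltage(v)
                power = read_voltage() * read_current()
                if power < last_power:   # overshot the peak: reverse direction
                    direction = -direction
                last_power = power
            return v

        # Toy panel model whose current falls off away from about 17.5 V.
        state = {"v": 17.0}
        panel_current = lambda v: max(0.0, 5.0 - 0.8 * (v - 17.5) ** 2)
        v_mpp = perturb_and_observe(lambda: state["v"],
                                    lambda: panel_current(state["v"]),
                                    lambda v: state.update(v=v))
        print(round(v_mpp, 1))  # ends up near the toy panel's maximum-power voltage (~17.7 V)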

  12. Common software for controlling and monitoring the upgraded CMS Level-1 trigger

    CERN Document Server

    Codispoti, Giuseppe

    2017-01-01

    The Large Hadron Collider restarted in 2015 with a higher centre-of-mass energy of 13 TeV. The instantaneous luminosity is expected to increase significantly in the coming years. An upgraded Level-1 trigger system was deployed in the CMS experiment in order to maintain the same efficiencies for searches and precision measurements as those achieved in 2012. This system must be controlled and monitored coherently through software, with high operational efficiency. The legacy system was composed of a large number of custom data processor boards; correspondingly, only a small fraction of the software was common between the different subsystems. The upgraded system is composed of a set of general purpose boards, that follow the MicroTCA specification, and transmit data over optical links, resulting in a more homogeneous system. The associated software is based on generic components corresponding to the firmware blocks that are shared across different cards, regardless of the role that the card plays in the system. ...

  13. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution.

    Science.gov (United States)

    Correia, J R C C C; Martins, C J A P

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.
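
    The GPU port itself is not shown in the record; as a schematic of the kind of field update the Press-Ryden-Spergel algorithm performs (a damped scalar field with a double-well potential whose zeros trace the walls), the following 2D NumPy sketch uses simplified constants and a constant damping term rather than the cosmological one:

        import numpy as np

        def evolve_walls(phi, phidot, steps=200, dt=0.1, dx=1.0, damping=0.2, v0=0.1):
            """Schematic leapfrog-style evolution of a 2D scalar field with a
            double-well potential V = v0*(phi**2 - 1)**2, whose zeros trace
            domain walls.  Periodic boundaries via np.roll."""
            for _ in range(steps):
                lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                       + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi) / dx**2
                dVdphi = 4.0 * v0 * phi * (phi**2 - 1.0)
                phiddot = lap - dVdphi - damping * phidot
                phidot += dt * phiddot
                phi += dt * phidot
            return phi

        rng = np.random.default_rng(0)
        phi = rng.uniform(-1.0, 1.0, (128, 128))     # random initial conditions
        phi = evolve_walls(phi, np.zeros_like(phi))
        wall_fraction = np.mean(np.abs(phi) < 0.5)   # crude proxy for wall area
        print(round(wall_fraction, 3))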

  14. General purpose graphics-processing-unit implementation of cosmological domain wall network evolution

    Science.gov (United States)

    Correia, J. R. C. C. C.; Martins, C. J. A. P.

    2017-10-01

    Topological defects unavoidably form at symmetry breaking phase transitions in the early universe. To probe the parameter space of theoretical models and set tighter experimental constraints (exploiting the recent advances in astrophysical observations), one requires more and more demanding simulations, and therefore more hardware resources and computation time. Improving the speed and efficiency of existing codes is essential. Here we present a general purpose graphics-processing-unit implementation of the canonical Press-Ryden-Spergel algorithm for the evolution of cosmological domain wall networks. This is ported to the Open Computing Language standard, and as a consequence significant speedups are achieved both in two-dimensional (2D) and 3D simulations.

  15. Hooke: an open software platform for force spectroscopy.

    Science.gov (United States)

    Sandal, Massimo; Benedetti, Fabrizio; Brucale, Marco; Gomez-Casado, Alberto; Samorì, Bruno

    2009-06-01

    Hooke is an open source, extensible software intended for analysis of atomic force microscope (AFM)-based single molecule force spectroscopy (SMFS) data. We propose it as a platform on which published and new algorithms for SMFS analysis can be integrated in a standard, open fashion, as a general solution to the current lack of a standard software for SMFS data analysis. Specific features and support for file formats are coded as independent plugins. Any user can code new plugins, extending the software capabilities. Basic automated dataset filtering and semi-automatic analysis facilities are included. Software and documentation are available at (http://code.google.com/p/hooke). Hooke is a free software under the GNU Lesser General Public License.

  16. Criteria for the selection of ERP software

    Directory of Open Access Journals (Sweden)

    2007-01-01

    Full Text Available The implementation of an ERP software package is an important investment for an organization, and one that is also characterized by a high degree of risk. Selecting the most appropriate software is a necessary condition for a successful implementation. This paper describes the major aspects of software selection in general and the relevant criteria in the case of ERP software.

  17. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not need the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  18. A refinement-based approach to developing software controllers for reactive systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, V.L.; Kapur, D.; Berg, R.S.

    1999-12-09

    The purpose of this paper is to demonstrate how transformation can be used to derive a high integrity implementation of a train controller from an algorithmic specification. The paper begins with a general discussion of high consequence systems (e.g., software systems) and describes how rewrite-based transformation systems can be used in the development of such systems. The authors then discuss how such transformations can be used to derive a high assurance controller for the Bay Area Rapid Transit (BART) system from an algorithmic specification.

  19. Testing Object-Oriented Software

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard; Madsen, Ole Lehrmann; Skov, Stefan H.

    The report is a result of an activity within the project Centre for Object Technology (COT), case 2. In case 2 a number of pilot projects have been carried out to test the feasibility of using object technology within embedded software. Some of the pilot projects have resulted in prototypes that are currently being developed into production versions. To assure a high quality in the product it was decided to carry out an activity regarding issues in testing OO software. The purpose of this report is to discuss the issues of testing object-oriented software. It is often claimed that testing of OO software is radically different from testing traditional software developed using imperative/procedural programming. Other authors claim that there is no difference. In this report we will attempt to give an answer to these questions (or at least initiate a discussion).

  20. Software Geometry in Simulations

    Science.gov (United States)

    Alion, Tyler; Viren, Brett; Junk, Tom

    2015-04-01

    The Long Baseline Neutrino Experiment (LBNE) involves many detectors. The experiment's near detector (ND) facility may ultimately involve several detectors. The far detector (FD) will be significantly larger than any other Liquid Argon (LAr) detector yet constructed; many prototype detectors are being constructed and studied to motivate a plethora of proposed FD designs. Whether it be a constructed prototype or a proposed ND/FD design, every design must be simulated and analyzed. This presents a considerable challenge to LBNE software experts; each detector geometry must be described to the simulation software in an efficient way which allows for multiple authors to easily collaborate. Furthermore, different geometry versions must be tracked throughout their use. We present a framework called General Geometry Description (GGD), written and developed by LBNE software collaborators for managing software to generate geometries. Though GGD is flexible enough to be used by any experiment working with detectors, we present its first use in generating Geometry Description Markup Language (GDML) files to interface with LArSoft, a framework of detector simulations, event reconstruction, and data analyses written for all LAr technology users at Fermilab. Brett Viren is the author of the framework discussed here, the General Geometry Description (GGD).

  1. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  2. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  3. Literature Review: Weldability of Iridium DOP-26 Alloy for General Purpose Heat Source

    Energy Technology Data Exchange (ETDEWEB)

    Burgardt, Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pierce, Stanley W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-10-19

    The basic purpose of this paper is to provide a literature review relative to fabrication of the General Purpose Heat Source (GPHS) that is used to provide electrical power for deep space missions of NASA. The particular fabrication operation to be addressed here is arc welding of the GPHS encapsulation. A considerable effort was made to optimize the fabrication of the fuel pellets and of other elements of the encapsulation; that work will not be directly addressed in this paper. This report consists of three basic sections: 1) a brief description of the GPHS will be provided as background information for the reader; 2) mechanical properties and the optimization thereof as relevant to welding will be discussed; 3) a review of the arc welding process development and optimization will be presented. Since the welding equipment must be upgraded for future production, some discussion of the historical establishment of relevant welding variables and possible changes thereto will also be discussed.

  4. Design of tallying function for general purpose Monte Carlo particle transport code JMCT

    International Nuclear Information System (INIS)

    Shangguan Danhua; Li Gang; Deng Li; Zhang Baoyin

    2013-01-01

    A new postponed accumulation algorithm was proposed. Based on the JCOGIN (J combinatorial geometry Monte Carlo transport infrastructure) framework and the postponed accumulation algorithm, the tallying function of the general purpose Monte Carlo neutron-photon transport code JMCT was improved markedly. JMCT achieves a tallying efficiency 28% higher than MCNP 4C for a simple geometry model, and JMCT is faster than MCNP 4C by two orders of magnitude for a complicated repeated-structure model. The tallying capability now available in JMCT lays a firm foundation for reactor analysis and multi-step burnup calculation. (authors)
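
    The record does not spell out the postponed accumulation algorithm; a plausible reading, sketched below purely as an illustration, is history-batched tallying, where scores are accumulated locally during a particle history and folded into the global sums only when the history ends:

        import math, random

        class Tally:
            """History-batched tally: contributions within one particle history
            are accumulated locally and only folded into the running sums when
            the history ends, which keeps the variance estimate correct and
            avoids touching the global tally at every collision."""

            def __init__(self):
                self.n = 0
                self.total = 0.0
                self.total_sq = 0.0
                self._history_score = 0.0

            def score(self, value):
                self._history_score += value          # postponed: local only

            def end_history(self):
                s = self._history_score
                self.n += 1
                self.total += s
                self.total_sq += s * s
                self._history_score = 0.0

            def result(self):
                mean = self.total / self.n
                var = self.total_sq / self.n - mean * mean
                return mean, math.sqrt(max(var, 0.0) / self.n)

        random.seed(0)
        tally = Tally()
        for _ in range(10000):                        # toy histories
            for _ in range(random.randint(1, 5)):     # a few collisions per history
                tally.score(random.expovariate(1.0))
            tally.end_history()
        print(tuple(round(x, 3) for x in tally.result()))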

  5. A Software Configuration Management Course

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    Software Configuration Management has been a big success in research and creation of tools. There are also many vendors in the market of selling courses to companies. However, in the education sector Software Configuration Management has still not quite made it - at least not into the university...... curriculum. It is either not taught at all or is just a minor part of a general course in software engineering. In this paper, we report on our experience with giving a full course entirely dedicated to Software Configuration Management topics and start a discussion of what ideally should be the goal...

  6. General-purpose heat source project and space nuclear safety and fuels program. Progress report

    International Nuclear Information System (INIS)

    Maraman, W.J.

    1979-12-01

    This formal monthly report covers the studies related to the use of ²³⁸PuO₂ in radioisotopic power systems carried out for the Advanced Nuclear Systems and Projects Division of the Los Alamos Scientific Laboratory. The two programs involved are general-purpose heat source development and space nuclear safety and fuels. Most of the studies discussed here are of a continuing nature. Results and conclusions described may change as the work continues.

  7. What Librarians Still Don't Know about Free Software

    Science.gov (United States)

    Chudnov, Daniel

    2009-01-01

    Free software isn't about cost, it isn't about hype, and it isn't about taking business away from vendors. It's about four kinds of freedom--the freedom to use the software for any purpose, the freedom to study how the software works, the freedom to modify the software to adapt it to one's needs, and the freedom to copy and share copies of the…

  8. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta version of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  9. Operation of a general purpose stepping motor-encoder positioning subsystem at the National Synchrotron Light Source

    International Nuclear Information System (INIS)

    Stubblefield, F.W.

    1985-11-01

    Four copies of a general purpose subsystem for mechanical positioning of detectors, samples, and beam line optical elements which constitute experiments at the National Synchrotron Light Source facility of Brookhaven National Laboratory have been constructed and placed into operation. Construction of a fifth subsystem unit is nearing completion. The subsystems effect mechanical positioning by controlling a set of stepping motor-encoder pairs. The units are general purpose in the sense that they receive commands over a 9600 baud asynchronous serial line compatible with the RS-232-C electrical signal standard, generate TTL-compatible streams of stepping pulses which can be used with a wide variety of stepping motors, and read back position values from a number of different types and models of position encoder. The basic structure of the motor controller subsystem is briefly reviewed. Additions to the subsystem made in response to problems indicated by actual operation of the four installed units are described in more detail.
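
    The record does not give the controller's actual command set; the Python sketch below only illustrates the style of host-side use implied by the description (ASCII commands over a 9600 baud RS-232 line, with encoder read-back). The port name and the "MOVE"/"POS?" commands are invented, and the pyserial package is assumed to be available.

      # Hypothetical host-side session with such a positioning subsystem over a
      # 9600 baud serial line (pyserial assumed: pip install pyserial). The
      # command strings are invented for illustration only.
      import serial

      with serial.Serial("/dev/ttyS0", baudrate=9600, timeout=2) as port:
          port.write(b"MOVE 1 +2500\r\n")   # request 2500 stepping pulses on axis 1
          port.write(b"POS? 1\r\n")         # ask for the encoder read-back value
          reply = port.readline().decode("ascii").strip()
          print("axis 1 encoder position:", reply)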

  10. Quench Simulation of Superconducting Magnets with Commercial Multiphysics Software

    CERN Document Server

    AUTHOR|(SzGeCERN)751171; Auchmann, Bernhard; Jarkko, Niiranen; Maciejewski, Michal

    The simulation of quenches in superconducting magnets is a multiphysics problem of highest complexity. Operated at 1.9 K above absolute zero, the material properties of superconductors and superfluid helium vary by several orders of magnitude over a range of only 10 K. The heat transfer from metal to helium goes through different transfer and boiling regimes as a function of temperature, heat flux, and transferred energy. Electrical, magnetic, thermal, and fluid dynamic effects are intimately coupled, yet live on vastly different time and spatial scales. While the physical models may be the same in all cases, it is an open debate whether the user should opt for commercial multiphysics software like ANSYS or COMSOL, write customized models based on general purpose network solvers like SPICE, or implement the physics models and numerical solvers entirely in custom software like the QP3, THEA, and ROXIE codes currently in use at the European Organisation for Nuclear Research (CERN). Each approach has its strengt...

  11. The Ethics of Software Engineering should be an Ethics for the Client

    OpenAIRE

    McBride, Neil

    2012-01-01

    The developing nature of software engineering requires not a revision of an ailing code but a revolution in ethical thinking that acknowledges the purpose and practice of software engineering. Computer systems are designed and implemented to support human purposeful activity. Whether the software is concerned with student enrollment, customer relationship management, or hospital administration, its success lies in the extent to which it enables others to en...

  12. Annotated bibliography of software engineering laboratory literature

    Science.gov (United States)

    Kistler, David; Bristow, John; Smith, Don

    1994-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  13. Ethics and Practice of Free Software

    CERN Document Server

    CERN. Geneva

    2007-01-01

    About the speaker Richard Matthew Stallman is a software freedom activist, hacker, and software developer. In September 1983, he launched the GNU Project to create a free Unix-like operating system, and has been the project's lead architect and organizer. With the launch of the GNU project he started the free software movement, and in October 1985 set up the Free Software Foundation. He co-founded the League for Programming Freedom. Stallman pioneered the concept of copyleft and is the main author of several copyleft licenses including the GNU General Public License, the most widely used free software license. (from Wikipedia)

  14. Upper Secondary and Vocational Level Teachers at Social Software

    Science.gov (United States)

    Valtonen, Teemu; Kontkanen, Sini; Dillon, Patrick; Kukkonen, Jari; Väisänen, Pertti

    2014-01-01

    This study focuses on upper secondary and vocational level teachers as users of social software, i.e. what software they use during their leisure and work and for what purposes they use software in teaching. The study is theorised within a technological pedagogical content knowledge framework; the emphasis is especially on technological knowledge…

  15. Design of a general-purpose European compound screening library for EU-OPENSCREEN.

    Science.gov (United States)

    Horvath, Dragos; Lisurek, Michael; Rupp, Bernd; Kühne, Ronald; Specker, Edgar; von Kries, Jens; Rognan, Didier; Andersson, C David; Almqvist, Fredrik; Elofsson, Mikael; Enqvist, Per-Anders; Gustavsson, Anna-Lena; Remez, Nikita; Mestres, Jordi; Marcou, Gilles; Varnek, Alexander; Hibert, Marcel; Quintana, Jordi; Frank, Ronald

    2014-10-01

    This work describes a collaborative effort to define and apply a protocol for the rational selection of a general-purpose screening library, to be used by the screening platforms affiliated with the EU-OPENSCREEN initiative. It is designed as a standard source of compounds for primary screening against novel biological targets, at the request of research partners. Given the general nature of the potential applications of this compound collection, the focus of the selection strategy lies on ensuring chemical stability, absence of reactive compounds, screening-compliant physicochemical properties, loose compliance to drug-likeness criteria (as drug design is a major, but not exclusive application), and maximal diversity/coverage of chemical space, aimed at providing hits for a wide spectrum of druggable targets. Finally, practical availability/cost issues cannot be avoided. The main goal of this publication is to inform potential future users of this library about its conception, sources, and characteristics. The outline of the selection procedure, notably of the filtering rules designed by a large committee of European medicinal chemists and chemoinformaticians, may be of general methodological interest for the screening/medicinal chemistry community. The selection task of 200K molecules out of a pre-filtered set of 1.4M candidates was shared by five independent European research groups, each picking a subset of 40K compounds according to their own in-house methodology and expertise. An in-depth analysis of chemical space coverage of the library serves not only to characterize the collection, but also to compare the various chemoinformatics-driven selection procedures of maximal diversity sets. Compound selections contributed by various participating groups were mapped onto general-purpose self-organizing maps (SOMs) built on the basis of marketed drugs and bioactive reference molecules. In this way, the occupancy of chemical space by the EU-OPENSCREEN library could
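
    The committee's actual filtering rules are not reproduced in the record; the short Python sketch below is only an illustration of what "screening-compliant physicochemical properties and loose drug-likeness" filtering looks like in practice. The property names, thresholds, and example compounds are invented.

      # Illustrative pre-filter over candidate compounds with precomputed
      # properties; thresholds are generic place-holders, not EU-OPENSCREEN rules.
      candidates = [
          {"id": "CMP-001", "mw": 342.4, "logp": 2.1, "rot_bonds": 5, "reactive": False},
          {"id": "CMP-002", "mw": 612.8, "logp": 6.3, "rot_bonds": 12, "reactive": False},
          {"id": "CMP-003", "mw": 295.3, "logp": 3.8, "rot_bonds": 4, "reactive": True},
      ]

      def passes_filters(c):
          return (150 <= c["mw"] <= 500       # molecular-weight window
                  and -1 <= c["logp"] <= 5    # lipophilicity window
                  and c["rot_bonds"] <= 10    # flexibility limit
                  and not c["reactive"])      # exclude reactive/unstable compounds

      library = [c["id"] for c in candidates if passes_filters(c)]
      print(library)   # -> ['CMP-001']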

  16. Validation of software for calculating the likelihood ratio for parentage and kinship.

    Science.gov (United States)

    Drábek, J

    2009-03-01

    Although the likelihood ratio is a well-known statistical technique, commercial off-the-shelf (COTS) software products for its calculation are not sufficiently validated to suit general requirements for the competence of testing and calibration laboratories (EN/ISO/IEC 17025:2005 norm) per se. The software in question can be considered critical as it directly weighs the forensic evidence, allowing judges to decide on guilt or innocence or to identify persons or kin (e.g. in mass fatalities). For these reasons, accredited laboratories shall validate likelihood ratio software in accordance with the above norm. To validate software for calculating the likelihood ratio in parentage/kinship scenarios, I assessed available vendors, chose two programs (Paternity Index and familias) for testing, and finally validated them using tests derived from elaboration of the available guidelines for the fields of forensics, biomedicine, and software engineering. MS Excel calculations using known likelihood ratio formulas, or peer-reviewed results of difficult paternity cases, were used as a reference. Using seven testing cases, it was found that both programs satisfied the requirements for basic paternity cases. However, only a combination of the two software programs fulfills the criteria needed for our purpose in the whole spectrum of functions under validation, with the exception of providing algebraic formulas in cases of mutation and/or silent allele.
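
    To make the nature of the reference calculations concrete, the Python sketch below reproduces the textbook single-locus paternity index for a standard mother-child-alleged-father trio; this is the kind of "known likelihood ratio formula" that can be checked in MS Excel, not the internal algorithm of the validated programs. The allele frequencies and genotypes are invented.

      # Single-locus paternity index PI = X / Y for a trio with an unambiguous
      # obligate paternal allele: X is the chance the alleged father transmits
      # that allele (1.0 if homozygous, 0.5 if heterozygous); Y is the allele's
      # population frequency (a random man).
      def paternity_index(paternal_allele, alleged_father_genotype, allele_freq):
          copies = alleged_father_genotype.count(paternal_allele)
          x = copies / 2.0
          y = allele_freq[paternal_allele]
          return x / y

      freqs = {"12": 0.18, "14": 0.07}
      # Mother 12/12, child 12/14 -> obligate paternal allele "14"; alleged father 14/17.
      print(round(paternity_index("14", ("14", "17"), freqs), 2))   # 0.5 / 0.07 = 7.14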

  17. Collected software engineering papers, volume 8

    Science.gov (United States)

    1990-01-01

    A collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period November 1989 through October 1990 is presented. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography. The seven presented papers are grouped into four major categories: (1) experimental research and evaluation of software measurement; (2) studies on models for software reuse; (3) a software tool evaluation; and (4) Ada technology and studies in the areas of reuse and specification.

  18. A Longitudinal Study of the e-Market for Software Components

    NARCIS (Netherlands)

    van Hillegersberg, Jos; Traas, Vincent; Dragt, Roland

    2001-01-01

    Component Based Software Development (CBD) holds high promises, but develops its full potential only when software components are traded in a component market. The Internet seems ideal for this purpose and various sources have predicted a bright future for the Internet Software Component Market

  19. Generalized data management systems and scientific information

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    This report aims to stimulate scientists of all disciplines to consider the advantages of using a generalized data management system (GDMS) for storage, manipulation and retrieval of the data they collect and often need to share. It should also be of interest to managers and programmers who need to make decisions on the management of scientific (numeric or non-numeric) data. Another goal of this report is to expose the features that a GDMS should have which are specifically necessary to support scientific data, such as data types and special manipulation functions. A GDMS is a system that provides generalized tools for the purpose of defining a database structure, for loading the data, for modification of the data, and for organizing the database for efficient retrieval and formatted output. A data management system is 'generalized' when it provides a user-oriented language for the different functions, so that it is possible to define any new database, its internal organization, and to retrieve and modify the data without the need to develop special purpose software (program) for each new database
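
    As a concrete, hedged illustration of what "generalized" means here, the Python sketch below uses the built-in sqlite3 module: the same generic statements define a new database structure, load data, and retrieve it with formatted output, without writing a special-purpose program for the database. The example schema is invented.

      # Generic define/load/retrieve cycle through a user-oriented query
      # language (SQL) rather than custom code written per database.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("""CREATE TABLE measurement (
                         sample_id TEXT, quantity TEXT, value REAL, unit TEXT)""")
      con.executemany("INSERT INTO measurement VALUES (?, ?, ?, ?)", [
          ("S-01", "activity", 4.2e3, "Bq"),
          ("S-01", "mass",     1.95,  "g"),
          ("S-02", "activity", 9.7e2, "Bq"),
      ])
      for sample, value in con.execute(
              "SELECT sample_id, value FROM measurement WHERE quantity = 'activity'"):
          print(f"{sample}: {value:.1f} Bq")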

  20. General purpose dynamic Monte Carlo with continuous energy for transient analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sjenitzer, B. L.; Hoogenboom, J. E. [Delft Univ. of Technology, Dept. of Radiation, Radionuclide and Reactors, Mekelweg 15, 2629JB Delft (Netherlands)

    2012-07-01

    For safety assessments transient analysis is an important tool. It can predict maximum temperatures during regular reactor operation or during an accident scenario. Despite the fact that this kind of analysis is very important, the state of the art still uses rather crude methods, like diffusion theory and point-kinetics. For reference calculations it is preferable to use the Monte Carlo method. In this paper the dynamic Monte Carlo method is implemented in the general purpose Monte Carlo code Tripoli4. Also, the method is extended for use with continuous energy. The first results of Dynamic Tripoli demonstrate that this kind of calculation is indeed accurate and the results are achieved in a reasonable amount of time. With the method implemented in Tripoli it is now possible to do an exact transient calculation in arbitrary geometry. (authors)

  1. THE ADAPTIVE NATURE OF MANAGING SOFTWARE INNOVATION

    OpenAIRE

    Mihai Liviu Despa

    2013-01-01

    The focus of this article is pointed at adaptive management in the context of innovative software projects. Software development is presented through the filter of innovation. The aspects that differentiate software innovation from any other kind of innovation are highlighted. Adaptive management is addressed from a general point of view. The circumstances that require adaptive management are emphasized. Methods of implementing adaptive management in innovation oriented software projects are ...

  2. Presenting an evaluation model of the trauma registry software.

    Science.gov (United States)

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma is a major cause of death, accounting for about 10% of deaths worldwide, and is considered a global concern. This problem has led healthcare policy makers and managers to adopt a basic strategy in this context. A trauma registry has an important and basic role in decreasing the mortality and disability due to injuries resulting from trauma. Today, different software packages are designed for trauma registries. Evaluation of this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. In this study, general and specific criteria of trauma registry software were identified by reviewing literature including books, articles, scientific documents, valid websites and related software in this domain. According to the general and specific criteria and related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented to 12 experts and specialists using the Delphi technique. To analyze the results, an agreement coefficient of 75% was required in order to apply changes. Finally, when the model was approved by the experts and professionals, the final version of the evaluation model for the trauma registry software was presented. For evaluating trauma registry software, two groups of criteria were defined: 1- general criteria, 2- specific criteria. General criteria of trauma registry software were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, 4- decision and research support. The presented model in this research has introduced important general and specific criteria of trauma registry software.

  3. Free software and open source databases

    Directory of Open Access Journals (Sweden)

    Napoleon Alexandru SIRITEANU

    2006-01-01

    Full Text Available The emergence of free/open source software -FS/OSS- enterprises seeks to push software development out of the academic stream into the commercial mainstream, and as a result, end-user applications such as open source database management systems (PostgreSQL, MySQL, Firebird are becoming more popular. Companies like Sybase, Oracle, Sun, IBM are increasingly implementing open source strategies and porting programs/applications into the Linux environment. Open source software is redefining the software industry in general and database development in particular.

  4. Collected software engineering papers, volume 9

    Science.gov (United States)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  5. Software Design Level Security Vulnerabilities

    OpenAIRE

    S. Rehman; K. Mustafa

    2011-01-01

    Several thousand software design vulnerabilities have been reported through established databases. But they need to be structured and classified to be optimally usable in the pursuit of minimal and effective mitigation mechanisms. To this end, we developed a criterion set for a communicative description of such vulnerabilities, to serve as a taxonomic description of security vulnerabilities arising in the design phase of the software development lifecycle. This description is a part of an effort to id...

  6. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required and supervisors can review the scripts without difficulty. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of a helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.

  7. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment, acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required and supervisors can review the scripts without difficulty. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, the control is performed by an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development with new features being implemented, it is already used in our laboratory for automation of a helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
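
    The records above do not describe the scripting interface itself; the Python sketch below is only a generic illustration of the approach they describe, in which a short user script drives instruments through a modular interface layer. The instrument classes, their methods, and the addresses are invented.

      # A toy "measurement script": step a temperature controller, wait, and
      # record a voltmeter reading. Real hardware access would live behind the
      # interface classes; here they are stubs for illustration.
      import time

      class TemperatureController:
          def __init__(self, address):
              self.address, self._setpoint = address, 300.0
          def set_setpoint(self, kelvin):
              self._setpoint = kelvin            # a real driver would talk to hardware
          def read_temperature(self):
              return self._setpoint

      class Voltmeter:
          def __init__(self, address):
              self.address = address
          def read(self):
              return 1.23e-3                     # stub measurement

      def sweep(controller, meter, setpoints, settle_s=0.1):
          results = []
          for t in setpoints:
              controller.set_setpoint(t)
              time.sleep(settle_s)               # settling time between points
              results.append((controller.read_temperature(), meter.read()))
          return results

      print(sweep(TemperatureController("GPIB::12"), Voltmeter("GPIB::7"), [4.2, 3.0, 0.3]))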

  8. Collected software engineering papers, volume 7

    Science.gov (United States)

    1989-01-01

    A collection is presented of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period Dec. 1988 to Oct. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the seven papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  9. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is achieved when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results.

  10. TOGAF usage in outsourcing of software development

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2013-12-01

    Full Text Available TOGAF is an Enterprise Architecture framework that provides a method for developing Enterprise Architecture called the architecture development method (ADM). The purpose of this paper is to examine whether TOGAF ADM can be used for developing software application architecture. Because software application architecture is one of the disciplines in the application development life cycle, it is important to find out how the enterprise architecture development method can support application architecture development. Having an open standard that can be used in application architecture development could help in the outsourcing of software development. If ADM could be used for software application architecture development, then we could consider its usability in the outsourcing of software development.

  11. Teaching Empirical Software Engineering Using Expert Teams

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2017-01-01

    Empirical software engineering aims at making software engineering claims measurable, i.e., to analyze and understand phenomena in software engineering and to evaluate software engineering approaches and solutions. Due to the involvement of humans and the multitude of fields for which software...... is crucial, software engineering is considered hard to teach. Yet, empirical software engineering increases this difficulty by adding the scientific method as extra dimension. In this paper, we present a Master-level course on empirical software engineering in which different empirical instruments...... an extra specific expertise that they offer as service to other teams, thus, fostering cross-team collaboration. The paper outlines the general course setup, topics addressed, and it provides initial lessons learned....

  12. High-Level Synthesis: Productivity, Performance, and Software Constraints

    Directory of Open Access Journals (Sweden)

    Yun Liang

    2012-01-01

    Full Text Available FPGAs are an attractive platform for applications with high computation demand and low energy consumption requirements. However, design effort for FPGA implementations remains high—often an order of magnitude larger than design effort using high-level languages. Instead of this time-consuming process, high-level synthesis (HLS) tools generate hardware implementations from algorithm descriptions in languages such as C/C++ and SystemC. Such tools reduce design effort: high-level descriptions are more compact and less error prone. HLS tools promise hardware development abstracted from software designer knowledge of the implementation platform. In this paper, we present an unbiased study of the performance, usability and productivity of HLS using AutoPilot (a state-of-the-art HLS tool). In particular, we first evaluate AutoPilot using the popular embedded benchmark kernels. Then, to evaluate the suitability of HLS on real-world applications, we perform a case study of stereo matching, an active area of computer vision research that uses techniques also common for image denoising, image retrieval, feature matching, and face recognition. Based on our study, we provide insights on current limitations of mapping general-purpose software to hardware using HLS and some future directions for HLS tool development. We also offer several guidelines for hardware-friendly software design. For popular embedded benchmark kernels, the designs produced by HLS achieve 4X to 126X speedup over the software version. The stereo matching algorithms achieve between 3.5X and 67.9X speedup over software (but still less than manual RTL design) with a fivefold reduction in design effort versus manual RTL design.

  13. A system for automatic evaluation of simulation software

    Science.gov (United States)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From a software requirements methodology standpoint, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.
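
    The dynamic analyzer itself is not described in detail; the Python sketch below only illustrates the underlying idea of variable range analysis: run the digital simulation first, record the extreme value of each state variable, and derive scale factors so that every variable fits the analog computer's reference range (assumed here to be +/-10 machine units). The toy model is invented.

      # Range analysis of a digitally simulated toy system (damped oscillator,
      # explicit Euler) to obtain analog scale factors.
      def simulate(steps=10000, dt=1e-3):
          x, v = 1.0, 0.0
          history = {"x": [], "v": []}
          for _ in range(steps):
              a = -25.0 * x - 0.5 * v            # toy dynamics
              x, v = x + v * dt, v + a * dt
              history["x"].append(x)
              history["v"].append(v)
          return history

      REFERENCE = 10.0                            # assumed analog reference level
      peaks = {name: max(abs(min(vals)), abs(max(vals)))
               for name, vals in simulate().items()}
      scale_factors = {name: REFERENCE / peak for name, peak in peaks.items()}
      print({k: round(v, 3) for k, v in scale_factors.items()})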

  14. Prometheus Reactor I&C Software Development Methodology, for Action

    Energy Technology Data Exchange (ETDEWEB)

    T. Hamilton

    2005-07-30

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I&C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I&C Software Development Process Manual and Reactor Module Software Development Plan to NR for information.

  15. Software for nuclear spectrometry

    International Nuclear Information System (INIS)

    1998-10-01

    The Advisory Group Meeting (AGM) on Software for Nuclear Spectrometry was dedicated to reviewing the present status of software for nuclear spectrometry and to advising on future activities in this field. Because similar AGMs and consultants' meetings had been held in the past, and in an attempt to be more focused, this AGM was devoted to the specific field of software for gamma-ray spectrometry. Nevertheless, many of the issues discussed and the recommendations made are of general concern for any software on nuclear spectrometry. The report is organized in sections. The 'Summary' gives the conclusions and recommendations adopted at the AGM. These conclusions and recommendations resulted from the discussions held during and after presentations of the scientific and technical papers. The papers are reported here in full in the following sections.

  16. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...

  17. Study of evaluation techniques of software configuration management and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Han, H. C.; Choi, C. R. [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-03-15

    The study of activities to ensure software safety and quality must be carried out on the basis of an established software development process for digitalized nuclear plants. In particular, software testing and verification and validation (V&V) must be studied. For this purpose, methodologies and tools which can improve software quality are evaluated, and software testing, V&V and configuration management which can be applied to the software life cycle are investigated. This study establishes a guideline that can be used to assure software safety and reliability requirements in digitalized nuclear plant systems.

  18. Adding Hierarchical Objects to Relational Database General-Purpose XML-Based Information Managements

    Science.gov (United States)

    Lin, Shu-Chun; Knight, Chris; La, Tracy; Maluf, David; Bell, David; Tran, Khai Peter; Gawdiak, Yuri

    2006-01-01

    NETMARK is a flexible, high-throughput software system for managing, storing, and rapid searching of unstructured and semi-structured documents. NETMARK transforms such documents from their original highly complex, constantly changing, heterogeneous data formats into well-structured, common data formats using Hypertext Markup Language (HTML) and/or Extensible Markup Language (XML). The software implements an object-relational database system that combines the best practices of the relational model utilizing Structured Query Language (SQL) with those of the object-oriented, semantic database model for creating complex data. In particular, NETMARK takes advantage of the Oracle 8i object-relational database model using physical-address data types for very efficient keyword searches of records across both context and content. NETMARK also supports multiple international standards such as WEBDAV for drag-and-drop file management and SOAP for integrated information management using Web services. The document-organization and -searching capabilities afforded by NETMARK are likely to make this software attractive for use in disciplines as diverse as science, auditing, and law enforcement.
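
    NETMARK's actual Oracle 8i schema and physical-address data types are not reproduced here; the Python/SQLite sketch below only illustrates the general object-relational idea described above: semi-structured documents are decomposed into context/content rows so that a single SQL query can search keywords across both the context (element path) and the content. The table layout and documents are invented.

      # Toy decomposition of semi-structured documents into (doc, context,
      # content) rows, searched across both columns with one query.
      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE node (doc TEXT, context TEXT, content TEXT)")
      con.executemany("INSERT INTO node VALUES (?, ?, ?)", [
          ("report-1", "/report/title",    "Wind tunnel audit"),
          ("report-1", "/report/abstract", "Audit of sensor calibration records"),
          ("report-2", "/report/title",    "Engine test summary"),
      ])
      keyword = "%audit%"
      hits = con.execute(
          "SELECT doc, context FROM node "
          "WHERE lower(context) LIKE ? OR lower(content) LIKE ?",
          (keyword, keyword)).fetchall()
      print(hits)   # both nodes of report-1 match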

  19. A LOW-COST AND LIGHTWEIGHT 3D INTERACTIVE REAL ESTATE-PURPOSED INDOOR VIRTUAL REALITY APPLICATION

    Directory of Open Access Journals (Sweden)

    K. Ozacar

    2017-11-01

    Full Text Available Interactive 3D architectural indoor design has become more popular since it benefited from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real life scale and enables users to observe immersive indoor environments so that they can directly modify them. This opportunity enables buyers to purchase a property off-the-plan more cheaply through virtual models. Instead of showing the property through a 2D plan or renders, the visualized interior architecture of an on-sale unbuilt property is demonstrated beforehand so that the investors have an impression as if they were in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require specialists to create such environments. In this study, we have created a real estate-purposed, low-cost, high-quality, fully interactive VR application that provides a realistic interior architecture of the property by using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real estate-purposed VR application, and it satisfied the expectations of the property buyers.

  20. a Low-Cost and Lightweight 3d Interactive Real Estate-Purposed Indoor Virtual Reality Application

    Science.gov (United States)

    Ozacar, K.; Ortakci, Y.; Kahraman, I.; Durgut, R.; Karas, I. R.

    2017-11-01

    Interactive 3D architectural indoor design has become more popular since it benefited from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real life scale and enables users to observe immersive indoor environments so that they can directly modify them. This opportunity enables buyers to purchase a property off-the-plan more cheaply through virtual models. Instead of showing the property through a 2D plan or renders, the visualized interior architecture of an on-sale unbuilt property is demonstrated beforehand so that the investors have an impression as if they were in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require specialists to create such environments. In this study, we have created a real estate-purposed, low-cost, high-quality, fully interactive VR application that provides a realistic interior architecture of the property by using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real estate-purposed VR application, and it satisfied the expectations of the property buyers.

  1. System Engineering Strategy for Distributed Multi-Purpose Simulation Architectures

    Science.gov (United States)

    Bhula, Dlilpkumar; Kurt, Cindy Marie; Luty, Roger

    2007-01-01

    This paper describes the system engineering approach used to develop distributed multi-purpose simulations. The multi-purpose simulation architecture focuses on user needs, operations, flexibility, cost and maintenance. This approach was used to develop an International Space Station (ISS) simulator, which is called the International Space Station Integrated Simulation (ISIS). The ISIS runs unmodified ISS flight software, system models, and the astronaut command and control interface in an open system design that allows for rapid integration of multiple ISS models. The initial intent of ISIS was to provide a distributed system that allows access to ISS flight software and models for the creation, test, and validation of crew and ground controller procedures. This capability reduces the cost and scheduling issues associated with utilizing standalone simulators in fixed locations, and facilitates discovering unknowns and errors earlier in the development lifecycle. Since its inception, the flexible architecture of the ISIS has allowed its purpose to evolve to include ground operator system and display training, flight software modification testing, and as a realistic test bed for Exploration automation technology research and development.

  2. Virtual Exercise Training Software System

    Science.gov (United States)

    Vu, L.; Kim, H.; Benson, E.; Amonette, W. E.; Barrera, J.; Perera, J.; Rajulu, S.; Hanson, A.

    2018-01-01

    The purpose of this study was to develop and evaluate a virtual exercise training software system (VETSS) capable of providing real-time instruction and exercise feedback during exploration missions. A resistive exercise instructional system was developed using a Microsoft Kinect depth-camera device, which provides markerless 3-D whole-body motion capture at a small form factor and minimal setup effort. It was hypothesized that subjects using the newly developed instructional software tool would perform the deadlift exercise with more optimal kinematics and consistent technique than those without the instructional software. Following a comprehensive evaluation in the laboratory, the system was deployed for testing and refinement in the NASA Extreme Environment Mission Operations (NEEMO) analog.
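
    The record does not describe the VETSS algorithms; the Python sketch below only illustrates one way depth-camera joint positions could be turned into a simple, real-time form cue for the deadlift, by computing the angle at a joint from three 3-D points. The coordinates and the feedback threshold are invented.

      # Joint angle at b formed by points a-b-c in 3-D, used as a toy form cue.
      import math

      def joint_angle(a, b, c):
          ba = [a[i] - b[i] for i in range(3)]
          bc = [c[i] - b[i] for i in range(3)]
          dot = sum(x * y for x, y in zip(ba, bc))
          return math.degrees(math.acos(dot / (math.dist(a, b) * math.dist(c, b))))

      hip, knee, ankle = (0.0, 0.9, 0.1), (0.05, 0.5, 0.15), (0.05, 0.1, 0.1)
      angle = joint_angle(hip, knee, ankle)
      print(f"knee angle: {angle:.0f} deg")
      if angle < 70:                              # illustrative threshold only
          print("cue: excessive knee flexion for a deadlift setup")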

  3. Evaluation of general-purpose collimators against high-resolution collimators with resolution recovery with a view to reducing radiation dose in myocardial perfusion SPECT: A preliminary phantom study.

    Science.gov (United States)

    Armstrong, Ian S; Saint, Kimberley J; Tonge, Christine M; Arumugam, Parthiban

    2017-04-01

    There is a growing focus on reducing radiation dose to patients undergoing myocardial perfusion imaging. This preliminary phantom study aims to evaluate the use of general-purpose collimators with resolution recovery (RR) to allow a reduction in patient radiation dose. Images of a cardiac torso phantom with inferior and anterior wall defects were acquired on a GE Infinia and Siemens Symbia T6 using both high-resolution and general-purpose collimators. Imaging time, a surrogate for administered activity, was reduced between 35% and 40% with general-purpose collimators to match the counts acquired with high-resolution collimators. Images were reconstructed with RR with and without attenuation correction. Two pixel sizes were also investigated. Defect contrast was measured. Defect contrast on general-purpose images was superior or comparable to the high-resolution collimators on both systems despite the reduced imaging time. Infinia general-purpose images required a smaller pixel size to be used to maintain defect contrast, while Symbia T6 general-purpose images did not require a change in pixel size to that used for standard myocardial perfusion SPECT. This study suggests that general-purpose collimators with RR offer a potential for substantial dose reductions while providing similar or better image quality to images acquired using high-resolution collimators.

  4. Implementation of the dynamic Monte Carlo method for transient analysis in the general purpose code Tripoli

    Energy Technology Data Exchange (ETDEWEB)

    Sjenitzer, Bart L.; Hoogenboom, J. Eduard, E-mail: B.L.Sjenitzer@TUDelft.nl, E-mail: J.E.Hoogenboom@TUDelft.nl [Delft University of Technology (Netherlands)

    2011-07-01

    A new Dynamic Monte Carlo method is implemented in the general purpose Monte Carlo code Tripoli 4.6.1. With this new method incorporated, a general purpose code can be used for safety transient analysis, such as the movement of a control rod or an accident scenario. To make the Tripoli code ready for calculations on dynamic systems, the Tripoli scheme had to be altered to incorporate time steps, to include the simulation of delayed neutron precursors, and to simulate prompt neutron chains. The modified Tripoli code is tested on two sample cases, a steady-state system and a subcritical system, and the resulting neutron fluxes behave just as expected. The steady-state calculation has a constant neutron flux over time, and this result shows the stability of the calculation. The neutron flux stays constant with acceptable variance. This also shows that the starting conditions are determined correctly. The subcritical case shows that the code can also handle dynamic systems with a varying neutron flux. (author)

  5. Implementation of the dynamic Monte Carlo method for transient analysis in the general purpose code Tripoli

    International Nuclear Information System (INIS)

    Sjenitzer, Bart L.; Hoogenboom, J. Eduard

    2011-01-01

    A new Dynamic Monte Carlo method is implemented in the general purpose Monte Carlo code Tripoli 4.6.1. With this new method incorporated, a general purpose code can be used for safety transient analysis, such as the movement of a control rod or an accident scenario. To make the Tripoli code ready for calculations on dynamic systems, the Tripoli scheme had to be altered to incorporate time steps, to include the simulation of delayed neutron precursors, and to simulate prompt neutron chains. The modified Tripoli code is tested on two sample cases, a steady-state system and a subcritical system, and the resulting neutron fluxes behave just as expected. The steady-state calculation has a constant neutron flux over time, and this result shows the stability of the calculation. The neutron flux stays constant with acceptable variance. This also shows that the starting conditions are determined correctly. The subcritical case shows that the code can also handle dynamic systems with a varying neutron flux. (author)
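
    Neither abstract spells out the algorithm; the Python sketch below is only a heavily simplified, point-model illustration of the ingredients they name: the simulation advances in fixed time steps, prompt fission neutrons feed the next step's population, and delayed-neutron precursors are banked and may decay into source neutrons in later steps. (The real method additionally follows prompt neutron chains within a step and tracks space and energy; the constants here are invented.)

      # Toy time-stepped analog game with one delayed-precursor group.
      import random

      rng = random.Random(1)
      P_FISSION, NU, BETA, LAMBDA = 0.42, 2.4, 0.007, 0.08   # invented constants
      neutrons, precursors = 2000, 0

      for step in range(20):                      # fixed time steps
          new_neutrons = 0
          for _ in range(neutrons):
              if rng.random() < P_FISSION:        # fission; otherwise capture/leakage
                  emitted = int(NU) + (1 if rng.random() < NU % 1 else 0)
                  for _ in range(emitted):
                      if rng.random() < BETA:
                          precursors += 1         # banked as a delayed precursor
                      else:
                          new_neutrons += 1       # prompt neutron for the next step
          decayed = sum(1 for _ in range(precursors) if rng.random() < LAMBDA)
          precursors -= decayed
          neutrons = new_neutrons + decayed       # delayed neutrons re-enter here
          print(step, neutrons, precursors)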

  6. Edge corrections to electromagnetic Casimir energies from general-purpose Mathieu-function routines

    Science.gov (United States)

    Blose, Elizabeth Noelle; Ghimire, Biswash; Graham, Noah; Stratton-Smith, Jeremy

    2015-01-01

    Scattering theory methods make it possible to calculate the Casimir energy of a perfectly conducting elliptic cylinder opposite a perfectly conducting plane in terms of Mathieu functions. In the limit of zero radius, the elliptic cylinder becomes a finite-width strip, which allows for the study of edge effects. However, existing packages for computing Mathieu functions are insufficient for this calculation because none can compute Mathieu functions of both the first and second kind for complex arguments. To address this shortcoming, we have written a general-purpose Mathieu-function package, based on algorithms developed by Alhargan. We use these routines to find edge corrections to the proximity force approximation for the Casimir energy of a perfectly conducting strip opposite a perfectly conducting plane.
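
    The abstract quotes no formulas; for orientation, the proximity force approximation (PFA) that these edge corrections refine is assembled from the standard result for parallel perfectly conducting plates, integrated over the local separation d(x, y) between the two surfaces (written below in LaTeX):

      % Parallel-plate Casimir energy per unit area and its PFA integral
      \[
        \frac{E_{\mathrm{pp}}(a)}{A} \;=\; -\,\frac{\pi^{2}\hbar c}{720\,a^{3}},
        \qquad
        E_{\mathrm{PFA}} \;=\; \int_{S}\mathrm{d}x\,\mathrm{d}y\,
          \left(-\,\frac{\pi^{2}\hbar c}{720\,d(x,y)^{3}}\right),
      \]
      % the scattering-theory result expressed in Mathieu functions quantifies
      % how the exact strip-plane energy deviates from this estimate at the edges.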

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6; 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  8. Open source in Finnish software companies

    OpenAIRE

    Seppä, Arto

    2006-01-01

    This paper explores survey data focusing on open source software supply, collected from 170 Finnish software firms, using descriptive statistical analysis. The first half of the report contains general data about software companies and the differences between proprietary and open source firms. The second half focuses on open source firms. The subjects of analysis are copyrights, products and services supply, the firms' relationships with the open source community, and their views on opportunities ...

  9. Collected software engineering papers, volume 6

    Science.gov (United States)

    1988-01-01

    A collection is presented of technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period 1 Jun. 1987 to 1 Jan. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the twelve papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  10. An overview of 3D software visualization.

    Science.gov (United States)

    Teyseyre, Alfredo R; Campo, Marcelo R

    2009-01-01

    Software visualization studies techniques and methods for graphically representing different aspects of software. Its main goal is to enhance, simplify and clarify the mental representation a software engineer has of a computer system. During many years, visualization in 2D space has been actively studied, but in the last decade, researchers have begun to explore new 3D representations for visualizing software. In this article, we present an overview of current research in the area, describing several major aspects like: visual representations, interaction issues, evaluation methods and development tools. We also perform a survey of some representative tools to support different tasks, i.e., software maintenance and comprehension, requirements validation and algorithm animation for educational purposes, among others. Finally, we conclude identifying future research directions.

  11. High-Level Design for Ultra-Fast Software Defined Radio Prototyping on Multi-Processors Heterogeneous Platforms

    OpenAIRE

    Moy , Christophe; Raulet , Mickaël

    2010-01-01

    International audience; The design of Software Defined Radio (SDR) equipment (terminals, base stations, etc.) is still very challenging. We propose here a design methodology for ultra-fast prototyping on heterogeneous platforms made of GPPs (General Purpose Processors), DSPs (Digital Signal Processors) and FPGAs (Field Programmable Gate Arrays). Relying on a component-based approach, the methodology mainly aims at automating as much as possible the design from an algorithmic validation to a mul...

  12. Reusable Rack Interface Controller Common Software for Various Science Research Racks on the International Space Station

    Science.gov (United States)

    Lu, George C.

    2003-01-01

    The purpose of the EXPRESS (Expedite the PRocessing of Experiments to Space Station) rack project is to provide a set of predefined interfaces for scientific payloads which allow rapid integration into a payload rack on International Space Station (ISS). VxWorks was selected as the operating system for the rack and payload resource controller, primarily based on the proliferation of VME (Versa Module Eurocard) products. These products provide needed flexibility for future hardware upgrades to meet ever-changing science research rack configuration requirements. On the International Space Station, there are multiple science research rack configurations, including: 1) Human Research Facility (HRF); 2) EXPRESS ARIS (Active Rack Isolation System); 3) WORF (Window Observational Research Facility); and 4) HHR (Habitat Holding Rack). The RIC (Rack Interface Controller) connects payloads to the ISS bus architecture for data transfer between the payload and ground control. The RIC is a general purpose embedded computer which supports multiple communication protocols, including fiber optic communication buses, Ethernet buses, EIA-422, Mil-Std-1553 buses, SMPTE (Society of Motion Picture and Television Engineers)-170M video, and audio interfaces to payloads and the ISS. As a cost saving and software reliability strategy, the Boeing Payload Software Organization developed reusable common software where appropriate. These reusable modules included a set of low-level driver software interfaces to 1553B, RS232, RS422, Ethernet buses, HRDL (High Rate Data Link), video switch functionality, telemetry processing, and executive software hosted on the FUC computer. These drivers formed the basis for software development of the HRF, EXPRESS, EXPRESS ARIS, WORF, and HHR RIC executable modules. The reusable RIC common software has provided extensive benefits, including: 1) Significant reduction in development flow time; 2) Minimal rework and maintenance; 3) Improved reliability; and 4) Overall

  13. Assessing Software Quality Through Visualised Cohesion Metrics

    Directory of Open Access Journals (Sweden)

    Timothy Shih

    2001-05-01

    Full Text Available Cohesion is one of the most important factors for software quality as well as maintainability, reliability and reusability. Module cohesion is defined as a quality attribute that seeks to measure the singleness of purpose of a module. A module of poor quality can be a serious obstacle to system quality. In order to design software of good quality, software managers and engineers need to introduce cohesion metrics to measure and produce desirable software. Highly cohesive software is thought to be a desirable construction. In this paper, we propose a function-oriented cohesion metric based on the analysis of live variables, live span and the visualization of the processing element dependency graph. We give six typical cohesion examples to be measured as our experiments and justification. Therefore, a well-defined, well-normalized, well-visualized and well-experimented cohesion metric is proposed to indicate and thus enhance software cohesion strength. Furthermore, this cohesion metric can easily be incorporated into a software CASE tool to help software engineers improve software quality.
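
    The paper's exact metric is not reproduced in the record; the Python sketch below shows just one simple possibility in the same spirit of live-variable analysis: given, for each processing element (statement) of a function, the set of variables live there, cohesion is taken as the average fraction of the function's variables that are live per statement (1.0 means every variable participates everywhere, i.e. a single purpose).

      # Toy live-variable cohesion measure; input is one set of live variables
      # per processing element (statement).
      def cohesion(live_per_statement):
          all_vars = set().union(*live_per_statement)
          if not all_vars:
              return 1.0
          return sum(len(live) / len(all_vars) for live in live_per_statement) \
                 / len(live_per_statement)

      # A focused function: both variables live in every statement -> 1.0
      print(cohesion([{"x", "y"}, {"x", "y"}, {"x", "y"}]))
      # A function doing two unrelated things: little overlap -> 0.5
      print(cohesion([{"a"}, {"a"}, {"b"}, {"b"}]))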

  14. Software Independent Verification and Validation (SIV&V) Simplified

    Science.gov (United States)

    2006-12-01

    GOTS: Government-Off-The-Shelf; GSAM: General Service Administration Acquisition Manual; GTE: General Telephone and Electronics; GUI: Graphical User Interfaces; HSI: Hardware and Software Integration; HSI: Human Systems Integration; HTP: Hardware Test Plan; HWCI: Hardware... Requirements are extracted and traced to the developer's Software and Hardware Test Plans (STP and HTP). This ensures adequacy of the test plans and also...

  15. How did the General Purpose Technology Electricity contribute to the Second Industrial Revolution (II): The Communication Engines

    NARCIS (Netherlands)

    van der Kooij, B.J.G.

    2017-01-01

    The concept of the General Purpose Technology (GPT) of the late 1990s is a culmination of many evolutionary views in innovation-thinking. By definition the GPT considers the technical, social, and economic effects of meta-technologies like steam technology and electric technology. This paper uses

  16. Object oriented software for simulation and reconstruction of big alignment systems

    CERN Document Server

    Arce, P

    2003-01-01

    Modern high-energy physics experiments require tracking detectors to provide high precision under difficult working conditions (high magnetic field, gravity loads and temperature gradients). This is the reason why several of them are deciding to implement optical alignment systems to monitor the displacement of tracking elements in operation. To simulate and reconstruct optical alignment systems, a general purpose software package, named COCOA, has been developed, using the object-oriented paradigm and software engineering techniques. Thanks to the great flexibility of its design, COCOA is able to reconstruct any optical system made of a combination of the following objects: laser, x-hair laser, incoherent source - pinhole, lens, mirror, plate splitter, cube splitter, optical square, rhomboid prism, 2D sensor, 1D sensor, distance-meter, tilt-meter, user-defined. COCOA was designed to satisfy the requirements of the CMS alignment system, which has several thousands of components. Sparse matrix techniques had been investi...

  17. Software Engineering Infrastructure in a Large Virtual Campus

    Science.gov (United States)

    Cristobal, Jesus; Merino, Jorge; Navarro, Antonio; Peralta, Miguel; Roldan, Yolanda; Silveira, Rosa Maria

    2011-01-01

    Purpose: The design, construction and deployment of a large virtual campus are a complex issue. Present virtual campuses are made of several software applications that complement e-learning platforms. In order to develop and maintain such virtual campuses, a complex software engineering infrastructure is needed. This paper aims to analyse the…

  18. Closing gaps between open software and public data in a hackathon setting: User-centered software prototyping.

    Science.gov (United States)

    Busby, Ben; Lesko, Matthew; Federer, Lisa

    2016-01-01

    In genomics, bioinformatics and other areas of data science, gaps exist between extant public datasets and the open-source software tools built by the community to analyze similar data types.  The purpose of biological data science hackathons is to assemble groups of genomics or bioinformatics professionals and software developers to rapidly prototype software to address these gaps.  The only two rules for the NCBI-assisted hackathons run so far are that 1) data either must be housed in public data repositories or be deposited to such repositories shortly after the hackathon's conclusion, and 2) all software comprising the final pipeline must be open-source or open-use.  Proposed topics, as well as suggested tools and approaches, are distributed to participants at the beginning of each hackathon and refined during the event.  Software, scripts, and pipelines are developed and published on GitHub, a web service providing publicly available, free-usage tiers for collaborative software development. The code resulting from each hackathon is published at https://github.com/NCBI-Hackathons/ with separate directories or repositories for each team.

  19. Workshop on Developing Safe Software

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1994-11-01

    The Workshop on Developing Safe Software was held July 22--23, 1992, at the Hotel del Coronado, San Diego, California. The purpose of the workshop was to have four world experts discuss among themselves software safety issues which are of interest to the US Nuclear Regulatory Commission. These issues concern the development of software systems for use in nuclear power plant protection systems. The workshop comprised four sessions. Wednesday morning, July 22, consisted of presentations from each of the four panel members. On Wednesday afternoon, the panel members went through a list of possible software development techniques and commented on them. The Thursday morning, July 23, session consisted of an extended discussion among the panel members and the observers from the NRC. A final session on Thursday afternoon consisted of a discussion among the NRC observers as to what was learned from the workshop

  20. Workshop on developing safe software

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1992-01-01

    The Workshop on Developing Safe Software was held July 22--23 at the Hotel del Coronado, San Diego, California. The purpose of the workshop was to have four world experts discuss among themselves software safety issues which are of interest to the U. S. Nuclear Regulatory Commission (NRC). These issues concern the development of software systems for use in nuclear power plant protection systems. The workshop comprised four sessions. Wednesday morning, July 22, consisted of presentations from each of the four panel members. On Wednesday afternoon, the panel members went through a list of possible software development techniques and commented on them. The Thursday morning, July 23, session consisted of an extended discussion among the panel members and the observers from the NRC. A final session on Thursday afternoon consisted of a discussion among the NRC observers as to what was learned from the workshop

  1. Characterizing the Development and Usage of Diagrams in Embedded Software Systems

    NARCIS (Netherlands)

    Akdur, Deniz; Demirörs, Onur; Garousi, V.

    2017-01-01

    To cope with the growing complexity of embedded software, modeling has become popular. The usage of models in the embedded software industry and the relevant practices usually vary since the purposes of diagram development and usage differ. Since a large variety of software modeling practices used in

  2. Software for people fundamentals, trends and best practices

    CERN Document Server

    Maedche, Alexander; Neer, Ludwig

    2012-01-01

    The highly competitive and globalized software market is creating pressure on software companies. Given the current boundary conditions, it is critical to continuously shorten time-to-market and reduce development costs. In parallel, driven by private-life experiences with mobile computing devices, the World Wide Web and software-based services, people's general expectations with regard to software are growing. They expect software that is simple and joyful to use. In the light of the changes that have taken place in recent years, software companies need to fundamentally reconsider the way th

  3. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, drawing on actual installation of and experimentation with some specific tools.

  4. Interfacing a General Purpose Fluid Network Flow Program with the SINDA/G Thermal Analysis Program

    Science.gov (United States)

    Schallhorn, Paul; Popok, Daniel

    1999-01-01

    A general purpose, one dimensional fluid flow code is currently being interfaced with the thermal analysis program Systems Improved Numerical Differencing Analyzer/Gaski (SINDA/G). The flow code, Generalized Fluid System Simulation Program (GFSSP), is capable of analyzing steady state and transient flow in a complex network. The flow code is capable of modeling several physical phenomena including compressibility effects, phase changes, body forces (such as gravity and centrifugal) and mixture thermodynamics for multiple species. The addition of GFSSP to SINDA/G provides a significant improvement in convective heat transfer modeling for SINDA/G. The interface development is conducted in multiple phases. This paper describes the first phase of the interface which allows for steady and quasi-steady (unsteady solid, steady fluid) conjugate heat transfer modeling.
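
    As a hedged illustration of the quasi-steady coupling described above (steady fluid, unsteady solid), the following Python sketch alternates between a lumped solid-temperature update and a steady convective heat-transfer evaluation. The geometry, property values, and the fixed heat-transfer coefficient are invented for the example and are not taken from GFSSP or SINDA/G.

    ```python
    # Minimal quasi-steady conjugate heat transfer loop (illustrative values only).
    dt = 0.1            # time step for the solid [s]
    t_end = 50.0        # simulated duration [s]
    m_cp = 500.0        # solid thermal mass m*cp [J/K]
    h_A = 20.0          # convective conductance h*A [W/K] (assumed constant)
    T_solid = 400.0     # initial solid temperature [K]
    T_fluid_in = 300.0  # fluid inlet temperature [K]
    mdot_cp = 50.0      # fluid capacity rate m_dot*cp [W/K]

    t = 0.0
    while t < t_end:
        # "Steady fluid" step: fluid outlet temperature from an energy balance.
        q = h_A * (T_solid - T_fluid_in)        # heat picked up by the fluid [W]
        T_fluid_out = T_fluid_in + q / mdot_cp

        # "Unsteady solid" step: explicit update of the lumped solid temperature.
        T_solid -= q / m_cp * dt
        t += dt

    print(f"solid {T_solid:.1f} K, fluid outlet {T_fluid_out:.1f} K after {t_end:.0f} s")
    ```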

  5. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    Science.gov (United States)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
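
    For orientation, pulse compression is essentially matched filtering, which the study implements with CUDA FFT/BLAS libraries. The sketch below shows the same frequency-domain matched filter in NumPy (on a GPU the arrays and FFTs would typically be swapped for CuPy/cuFFT counterparts); the chirp parameters are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    fs = 1e6          # sample rate [Hz] (assumed)
    T = 1e-3          # pulse length [s]
    B = 100e3         # chirp bandwidth [Hz]
    t = np.arange(0, T, 1 / fs)

    # Linear FM (chirp) reference pulse and a delayed, noisy echo.
    ref = np.exp(1j * np.pi * (B / T) * t**2)
    echo = np.zeros(4096, dtype=complex)
    echo[500:500 + ref.size] = 0.1 * ref
    echo += 0.01 * (np.random.randn(echo.size) + 1j * np.random.randn(echo.size))

    # Frequency-domain matched filter: multiply by the conjugate reference spectrum.
    n = echo.size
    H = np.conj(np.fft.fft(ref, n))
    compressed = np.fft.ifft(np.fft.fft(echo) * H)

    print("peak at sample", int(np.argmax(np.abs(compressed))))  # ~500
    ```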

  6. Containment and surveillance for software

    International Nuclear Information System (INIS)

    Andress, J.C.; Adams, G.N.; Cotton, J.H.

    1993-07-01

    Some operators and state authorities are offering their computer systems, both hardware and software, to be used for safeguards purposes by the International Atomic Energy Agency. Therefore a need exists to develop a method of authenticating the data produced by a computer program before it can be used by the Agency. As part of a complete Computer Systems Authentication (COMSAT) package, a method of software containment and surveillance has been developed to complement existing software authentication techniques. The package is applicable to both operator- and Agency-provided systems. A program to demonstrate the principles has been written. With this facility, the Agency will be able to leave unattended software in the field, either to be used by the operator to generate data for inspection on their own computer, or to save an inspector having to re-install inspection-specific software on an Agency computer, in the knowledge that the operation of the protected computer is being continuously monitored. If adopted, either of these uses will enable the Agency to reduce their costs. (Author)

  7. Applying Agile Methods to Weapon/Weapon-Related Software

    Energy Technology Data Exchange (ETDEWEB)

    Adams, D; Armendariz, M; Blackledge, M; Campbell, F; Cloninger, M; Cox, L; Davis, J; Elliott, M; Granger, K; Hans, S; Kuhn, C; Lackner, M; Loo, P; Matthews, S; Morrell, K; Owens, C; Peercy, D; Pope, G; Quirk, R; Schilling, D; Stewart, A; Tran, A; Ward, R; Williamson, M

    2007-05-02

    This white paper provides information and guidance to the Department of Energy (DOE) sites on Agile software development methods and the impact of their application on weapon/weapon-related software development. The purpose of this white paper is to provide an overview of Agile methods, examine the accepted interpretations/uses/practices of these methodologies, and discuss the applicability of Agile methods with respect to Nuclear Weapons Complex (NWC) Technical Business Practices (TBPs). It also provides recommendations on the application of Agile methods to the development of weapon/weapon-related software.

  8. Application of a general-purpose scintigraphic scanner to transverse-section (tomographic) gamma-ray imaging

    International Nuclear Information System (INIS)

    Bradstock, P.A.; Milward, R.C.

    1976-01-01

    The paper describes the recent application of a general-purpose commercial scintigraphic scanner to transverse-section radioisotope tomography. The principle of the method is to obtain the distribution of radioactive material in a thin transverse slice of the body or brain, from a mathematical reconstruction using the measured transverse projections of the activity within that slice. The usefulness of the radioisotope section-scanning technique for clinical diagnosis, as evidenced from one year's use of the machine at the Midland Centre for Neurology and Neurosurgery, Birmingham, U.K., is briefly discussed. (orig.)

  9. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr. (,; .); Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  10. Review of the Educational Software Evaluation Forms and Scales

    Directory of Open Access Journals (Sweden)

    Ahmet ARSLAN

    2016-12-01

    Full Text Available The main purpose of this study is to review existing evaluation forms and scales that have been prepared for educational software evaluation. In addition to this purpose, the study aims to provide insight and guidance for future studies in this context. In total, forty-two studies including evaluation forms and scales have been taken into consideration. “Educational software evaluation”, “Software evaluation” and “Educational software evaluation forms/scales” were searched as keywords in the “Education Resources Information Centre (ERIC)”, “Marmara University e-Library”, “National Thesis Center” and “Science Direct” databases. Twenty-nine of them met the review selection criteria and were evaluated. There is an increase in the number of evaluation tools between 2006 and 2010. However, it was noticed that there is not a sufficient number of evaluation tools targeting “educational games”. It was concluded that reliability and validity studies are a very important part of developing educational software evaluation tools, and this is a matter that should be considered in future studies.

  11. Environmental assessment of general-purpose heat source safety verification testing

    International Nuclear Information System (INIS)

    1995-02-01

    This Environmental Assessment (EA) was prepared to identify and evaluate potential environmental, safety, and health impacts associated with the Proposed Action to test General-Purpose Heat Source (GPHS) Radioisotope Thermoelectric Generator (RTG) assemblies at the Sandia National Laboratories (SNL) 10,000-Foot Sled Track Facility, Albuquerque, New Mexico. RTGs are used to provide a reliable source of electrical power on board some spacecraft when solar power is inadequate during long duration space missions. These units are designed to convert heat from the natural decay of radioisotope fuel into electrical power. Impact test data are required to support DOE's mission to provide radioisotope power systems to NASA and other user agencies. The proposed tests will expand the available safety database regarding RTG performance under postulated accident conditions. Direct observations and measurements of GPHS/RTG performance upon impact with hard, unyielding surfaces are required to verify model predictions and to ensure the continual evolution of the RTG designs that perform safely under varied accident environments. The Proposed Action is to conduct impact testing of RTG sections containing GPHS modules with simulated fuel. End-On and Side-On impact test series are planned

  12. Computing OpenSURF on OpenCL and General Purpose GPU

    Directory of Open Access Journals (Sweden)

    Wanglong Yan

    2013-10-01

    Full Text Available The Speeded-Up Robust Feature (SURF) algorithm is widely used for image feature detection and matching in computer vision. Open Computing Language (OpenCL) is a framework for writing programs that execute across heterogeneous platforms consisting of CPUs, GPUs, and other processors. This paper introduces how to implement an open-source SURF program, namely OpenSURF, on a general purpose GPU using OpenCL, and discusses the optimizations in terms of thread architectures and memory models in detail. Our final OpenCL implementation of OpenSURF is on average 37% and 64% faster than the OpenCV SURF v2.4.5 CUDA implementation on NVidia's GTX660 and GTX460SE GPUs, respectively. Our OpenCL program achieved real-time performance (>25 Frames Per Second) for almost all the input images with different sizes from 320*240 to 1024*768 on NVidia's GTX660 GPU, NVidia's GTX460SE GPU and AMD's Radeon HD 6850 GPU. Our OpenCL approach on NVidia's GTX660 GPU is more than 22.8 times faster than its original CPU version on Intel's Dual-Core E5400 2.7G on average.
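
    As a small, hedged illustration of the OpenCL host-plus-kernel structure the paper relies on (not the OpenSURF port itself), the following Python/pyopencl snippet squares an array on whatever OpenCL device is available; the kernel and buffer names are invented for the example.

    ```python
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()        # pick any available OpenCL device
    queue = cl.CommandQueue(ctx)

    kernel_src = """
    __kernel void square(__global const float *in, __global float *out) {
        int i = get_global_id(0);
        out[i] = in[i] * in[i];
    }
    """
    prg = cl.Program(ctx, kernel_src).build()

    data = np.arange(16, dtype=np.float32)
    mf = cl.mem_flags
    in_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=data)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, data.nbytes)

    prg.square(queue, data.shape, None, in_buf, out_buf)  # 16 work-items
    result = np.empty_like(data)
    cl.enqueue_copy(queue, result, out_buf)
    print(result)
    ```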

  13. 49 CFR 236.18 - Software management control plan.

    Science.gov (United States)

    2010-10-01

    49 CFR 236.18, Software management control plan (Instructions: All Systems, General). (a) Within 6 months of June 6, 2005, each railroad shall develop and adopt a software management control plan for its signal and train...

  14. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods with an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the system and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has aroused the urge to apply the FMEA methodology also to software-based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  15. Multistage switching hardware and software implementations for student experiment purpose

    Science.gov (United States)

    Sani, A.; Suherman

    2018-02-01

    Current communication and internet networks are underpinned by the switching technologies that interconnect one network to another. Students' understanding of networks relies on how well they grasp the underlying theory; theory without hands-on experience can leave gaps in their overall knowledge. This paper reports the progress of a multistage switching design and implementation for student laboratory activities. The hardware and software designs are based on a three-stage Clos switching architecture with modular 2x2 switches, controlled by an Arduino microcontroller. The designed modules can also be extended to Batcher and banyan switches, and can work in both circuit- and packet-switching systems. The circuit analysis and simulation show that the blocking probability for each switch combination can be obtained by generating random or patterned traffic. The mathematical model and simulation analysis show a 16.4% difference in blocking probability when the generated traffic is uniform. The circuit design components and interfacing solutions have been identified to allow the next implementation step.
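
    For context, the blocking probability of a three-stage Clos network is commonly estimated with Lee's graph approximation. The short sketch below is an illustrative aid, not the authors' model, and the switch dimensions and utilisation are invented for the example.

    ```python
    def clos_blocking(n: int, m: int, p: float) -> float:
        """Lee's approximation for a symmetric three-stage Clos network.

        n : inlets per first-stage switch
        m : number of middle-stage switches (alternative paths per input/output pair)
        p : occupancy (utilisation) of each inlet, 0..1
        """
        p_link = p * n / m                 # load carried by each internal link
        p_path_free = (1.0 - p_link) ** 2  # both hops of a middle path are idle
        return (1.0 - p_path_free) ** m    # all m alternative paths are blocked

    # Example: 2x2 first-stage modules (n=2) with 2 or 3 middle switches.
    for m in (2, 3):
        print(m, round(clos_blocking(n=2, m=m, p=0.5), 4))
    ```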

  16. Prometheus Reactor I and C Software Development Methodology, for Action

    International Nuclear Information System (INIS)

    T. Hamilton

    2005-01-01

    The purpose of this letter is to submit the Reactor Instrumentation and Control (I and C) software life cycle, development methodology, and programming language selections and rationale for project Prometheus to NR for approval. This letter also provides the draft Reactor I and C Software Development Process Manual and Reactor Module Software Development Plan to NR for information

  17. Software Innovation in a Mission Critical Environment

    Science.gov (United States)

    Fredrickson, Steven

    2015-01-01

    Operating in mission-critical environments requires trusted solutions, and the preference for "tried and true" approaches presents a potential barrier to infusing innovation into mission-critical systems. This presentation explores opportunities to overcome this barrier in the software domain. It outlines specific areas of innovation in software development achieved by the Johnson Space Center (JSC) Engineering Directorate in support of NASA's major human spaceflight programs, including International Space Station, Multi-Purpose Crew Vehicle (Orion), and Commercial Crew Programs. Software engineering teams at JSC work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements for genuinely mission critical applications. The innovations described, including the use of NASA Core Flight Software and its associated software tool chain, can lead to software that is more affordable, more reliable, better modelled, more flexible, more easily maintained, better tested, and enabling of automation.

  18. Software Engineering Research/Developer Collaborations in 2005

    Science.gov (United States)

    Pressburger, Tom

    2006-01-01

    In CY 2005, three collaborations between software engineering technology providers and NASA software development personnel deployed three software engineering technologies on NASA development projects (a different technology on each project). The main purposes were to benefit the projects, infuse the technologies if beneficial into NASA, and give feedback to the technology providers to improve the technologies. Each collaboration project produced a final report. Section 2 of this report summarizes each project, drawing from the final reports and communications with the software developers and technology providers. Section 3 indicates paths to further infusion of the technologies into NASA practice. Section 4 summarizes some technology transfer lessons learned. Also included is an acronym list.

  19. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree that (1) ensures it can be run without user intervention and (2) ensures the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  20. SOFTM: a software maintenance expert system in Prolog

    DEFF Research Database (Denmark)

    Pau, L.; Negret, J. M.

    1988-01-01

    A description is given of a knowledge-based system called SOFTM, serving the following purposes: (1) assisting a software programmer or analyst in his application code maintenance tasks, (2) generating and updating automatically software correction documentation, (3) helping the end user register......, and on interfacing capabilities of Prolog II to a variety of other languages...

  1. Increasing the reliability of ecological models using modern software engineering techniques

    Science.gov (United States)

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  2. 7 CFR 249.1 - General purpose and scope.

    Science.gov (United States)

    2010-01-01

    Food and Nutrition Service, Department of Agriculture; Child Nutrition Programs; Senior Farmers' Market Nutrition Program (SFMNP); General; § 249.1 General purpose and scope. ... 2011, et seq.), and to any other Federal or State food or nutrition assistance program under which...

  3. Technical Evaluation Report 37: Assistive Software for Disabled Learners

    Directory of Open Access Journals (Sweden)

    Jon Baggaley

    2004-11-01

    Full Text Available Previous reports in this series (#32 and #36) have discussed online software features of value to disabled learners in distance education. The current report evaluates four specific assistive software products with useful features for visually and hearing impaired learners: ATutor, ACollab, Natural Voice, and Just Vanilla. The evaluative criteria discussed include the purpose, uses, costs, and features of each software product, all considered primarily from the accessibility perspective.

  4. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    Md Saion Salikin; Muhammad Farid Abdul Khalid

    2002-01-01

    A Quality Assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme the performance of an x-ray machine used for diagnostic purposes is tested using an approved procedure, commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder issued under the Atomic Energy Licensing Act 1984. There are a few computer applications (software) available in the market which can be used for this purpose. A computer application (software) using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis and storage as well as report writing for the quality control tests. In this paper important features of the software for quality control tests are explained in brief. A simple database has been established for this purpose and is linked to the software. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)
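
    The record describes a Visual Basic 6/Microsoft Access tool; as a purely illustrative, hedged analogue of the "simple database linked to the software" idea, the Python sketch below stores x-ray QC test results in SQLite. Table and field names are invented, not taken from the MoH software.

    ```python
    import sqlite3

    conn = sqlite3.connect("qc_radiology.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS qc_test (
            machine_id   TEXT,
            test_date    TEXT,
            test_name    TEXT,     -- e.g. kVp accuracy, timer accuracy, HVL
            measured     REAL,
            tolerance    REAL,
            passed       INTEGER
        )
    """)

    def record_test(machine_id, test_date, test_name, measured, nominal, tolerance):
        """Store one QC measurement and whether it is within tolerance."""
        passed = int(abs(measured - nominal) <= tolerance)
        conn.execute(
            "INSERT INTO qc_test VALUES (?, ?, ?, ?, ?, ?)",
            (machine_id, test_date, test_name, measured, tolerance, passed),
        )
        conn.commit()
        return passed

    # Example: kVp accuracy check on one general-purpose x-ray unit.
    print(record_test("XR-01", "2002-06-15", "kVp accuracy", measured=81.4,
                      nominal=80.0, tolerance=4.0))   # -> 1 (within tolerance)
    ```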

  5. Aiming toward perfection with POBSYS, a new software system

    International Nuclear Information System (INIS)

    Osudar, J.; Parks, J.E.; Levitz, N.M.

    1985-01-01

    An integrated general-purpose software system, POBSYS, has been developed that provides the foundation and tools for building a highly interactive system for carrying out detailed operating procedures and performing conventional process control, data acquisition, and data management functions. Features of the present system, which may be of particular interest to the problem of the man-machine interface include: (a) a multi-level safety system for fail-safe operation; (b) hierarchical operational control; (c) documented responsibility; (d) equipment status tracking; and (e) quality assurance checks on operations. The system runs on commercially available microprocessors and is presently in use in the destructive analysis of irradiated fuel rods from the Light Water Breeder Reactor

  6. Competing Values in Software Process Improvement

    DEFF Research Database (Denmark)

    Mûller, Sune Dueholm; Nielsen, Peter Axel

    2013-01-01

    Purpose: The purpose of the article is to investigate the impact of organizational culture on software process improvement (SPI). Is cultural congruence between an organization and an adopted process model required? How can the level of congruence between an organizational culture and the values and assumptions underlying an adopted process model be assessed? How can cultural incongruence be managed to facilitate success of software process improvement? Design/methodology/approach: The competing values framework and its associated assessment instrument are used in a case study to establish......-step process, SPI managers establish and compare culture profiles and decide how to address identified problems. To that end the text analysis technique is offered as a web service that allows for analysis of all text-based process models and standards, and of internal process documentation. Originality

  7. Maintenance simulation: Software issues

    Energy Technology Data Exchange (ETDEWEB)

    Luk, C.H.; Jette, M.A.

    1995-07-01

    The maintenance of a distributed software system in a production environment involves: (1) maintaining software integrity, (2) maintaining database integrity, (3) adding new features, and (4) adding new systems. These issues will be discussed in general: what they are and how they are handled. This paper will present our experience with a distributed resource management system that accounts, in real time, for resources consumed on a network of heterogeneous computers. The simulated environments used to maintain this system will be presented as they relate to the four maintenance areas.

  8. Green in software engineering

    CERN Document Server

    Calero Munoz, Coral

    2015-01-01

    This is the first book that presents a comprehensive overview of sustainability aspects in software engineering. Its format follows the structure of the SWEBOK and covers the key areas involved in the incorporation of green aspects in software engineering, encompassing topics from requirement elicitation to quality assurance and maintenance, while also considering professional practices and economic aspects. The book consists of thirteen chapters, which are structured in five parts. First the "Introduction" gives an overview of the primary general concepts related to Green IT, discussing wha

  9. The development of digital oscilloscope control software in nuclear measurement

    International Nuclear Information System (INIS)

    Pu Minghui; Tian Geng; Li Xianyou

    2004-01-01

    This paper presents the development of general-purpose digital oscilloscope control software for the Windows 95/98 operating system. The background and method are discussed in detail, together with the functions and characteristics of the software. With this software, a single PC can control several digital oscilloscopes. Solutions to the main problems encountered during development are also discussed. (authors)
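
    The original software predates today's common instrument-control stacks; as a hedged modern analogue of one PC controlling several oscilloscopes, the sketch below uses the PyVISA library. The resource addresses and SCPI commands are generic examples, not commands from the paper's software.

    ```python
    import pyvisa

    rm = pyvisa.ResourceManager()

    # Addresses of the oscilloscopes to control (example GPIB addresses).
    scope_addresses = ["GPIB0::7::INSTR", "GPIB0::8::INSTR"]

    for address in scope_addresses:
        scope = rm.open_resource(address)
        scope.timeout = 5000                                  # ms
        print(address, "->", scope.query("*IDN?").strip())    # identify the instrument
        scope.write(":SINGLE")        # example command: arm a single acquisition
        scope.close()
    ```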

  10. Storage system software solutions for high-end user needs

    Science.gov (United States)

    Hogan, Carole B.

    1992-01-01

    Today's high-end storage user is one that requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, provides the potential for solving not only the storage needs of today but those of the foreseeable future as well.

  11. Open core control software for surgical robots.

    Science.gov (United States)

    Arata, Jumpei; Kozuka, Hiroaki; Kim, Hyung Wook; Takesue, Naoyuki; Vladimirov, B; Sakaguchi, Masamichi; Tokuda, Junichi; Hata, Nobuhiko; Chinzei, Kiyoyuki; Fujimoto, Hideo

    2010-05-01

    techniques for this purpose were introduced. The virtual fixture is a well-known technique used as a "force guide" to support operators in performing precise manipulation with a master-slave robot. A virtual fixture for precise and safe surgery was implemented on the system to demonstrate the idea of high-level collaboration between a surgical robot and a navigation system. The extension of the virtual fixture is not a part of the Open Core Control system; however, functions such as the virtual fixture cannot be realized without tight collaboration between cutting-edge medical devices. By using the virtual fixture, operators can pre-define an accessible area on the navigation system, and the area information can be transferred to the robot. In this manner, the surgical console generates a reflection force when the operator tries to move out of the pre-defined accessible area during surgery. The Open Core Control software was implemented on a surgical master-slave robot and stable operation was observed in a motion test. The tip of the surgical robot was displayed on a navigation system by connecting the surgical robot with a 3D position sensor through OpenIGTLink. The accessible area was pre-defined before the operation, and the virtual fixture was displayed as a "force guide" on the surgical console. In addition, the system showed stable performance in a duration test with network disturbance. In this paper, a design of the Open Core Control software for surgical robots and the implementation of the virtual fixture were described. The Open Core Control software was implemented on a surgical robot system and showed stable performance in high-level collaboration tasks. The Open Core Control software is being developed to become a widely used platform for surgical robots. Safety issues are essential for the control software of these complex medical devices. It is important to follow global specifications such as the FDA requirement "General Principles of Software Validation" or IEC 62304. For
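
    As a hedged, minimal illustration of the "force guide" idea (not the paper's implementation), the sketch below keeps a tool tip inside a pre-defined spherical accessible area: when the commanded position leaves the sphere, a spring-like reflection force pushes back toward the boundary. The geometry and stiffness values are invented for the example.

    ```python
    import numpy as np

    CENTER = np.array([0.0, 0.0, 0.0])   # centre of the accessible area [m] (assumed)
    RADIUS = 0.02                        # radius of the accessible area [m]
    STIFFNESS = 500.0                    # virtual-wall stiffness [N/m]

    def virtual_fixture_force(tip_position: np.ndarray) -> np.ndarray:
        """Return the reflection force felt at the master console.

        Inside the sphere the force is zero; outside, a spring force proportional
        to the penetration depth pushes the tool back toward the boundary.
        """
        offset = tip_position - CENTER
        distance = np.linalg.norm(offset)
        if distance <= RADIUS or distance == 0.0:
            return np.zeros(3)
        penetration = distance - RADIUS
        direction_back = -offset / distance
        return STIFFNESS * penetration * direction_back

    print(virtual_fixture_force(np.array([0.03, 0.0, 0.0])))  # -> [-5.  0.  0.]
    ```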

  12. A General Purpose High Performance Linux Installation Infrastructure

    International Nuclear Information System (INIS)

    Wachsmann, Alf

    2002-01-01

    With more and more and larger and larger Linux clusters, the question arises how to install them. This paper addresses this question by proposing a solution using only standard software components. This installation infrastructure scales well for a large number of nodes. It is also usable for installing desktop machines or diskless Linux clients, thus, is not designed for cluster installations in particular but is, nevertheless, highly performant. The infrastructure proposed uses PXE as the network boot component on the nodes. It uses DHCP and TFTP servers to get IP addresses and a bootloader to all nodes. It then uses kickstart to install Red Hat Linux over NFS. We have implemented this installation infrastructure at SLAC with our given server hardware and installed a 256 node cluster in 30 minutes. This paper presents the measurements from this installation and discusses the bottlenecks in our installation
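
    To make the PXE stage concrete, the hedged Python sketch below generates one pxelinux configuration file per node, pointing each machine at a kernel, initrd, and kickstart file served over the network; all paths, hostnames, and server names are illustrative assumptions rather than the SLAC setup.

    ```python
    from pathlib import Path

    # Illustrative node list: (hostname, MAC address) pairs.
    NODES = [
        ("node001", "00:11:22:33:44:01"),
        ("node002", "00:11:22:33:44:02"),
    ]

    TEMPLATE = """default install
    label install
        kernel vmlinuz
        append initrd=initrd.img ks=nfs:installserver:/ks/{host}.cfg ksdevice=eth0
    """

    def write_pxe_configs(tftp_root: str = "tftpboot/pxelinux.cfg") -> None:
        """Write one pxelinux config per node, named after its MAC address."""
        out_dir = Path(tftp_root)
        out_dir.mkdir(parents=True, exist_ok=True)
        for host, mac in NODES:
            # pxelinux looks for a file named 01-<mac with dashes>.
            name = "01-" + mac.lower().replace(":", "-")
            (out_dir / name).write_text(TEMPLATE.format(host=host))

    write_pxe_configs()
    print(sorted(p.name for p in Path("tftpboot/pxelinux.cfg").iterdir()))
    ```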

  13. A Component-Oriented Programming for Embedded Mobile Robot Software

    Directory of Open Access Journals (Sweden)

    Safaai Deris

    2008-11-01

    Full Text Available Applying software reuse to many Embedded Real-Time (ERT) systems poses significant challenges to industrial software processes due to the resource-constrained and real-time requirements of the systems. The Autonomous Mobile Robot (AMR) system is a class of ERT systems and hence inherits the challenge of applying software reuse in general ERT systems. Furthermore, software reuse in AMR systems is challenged by diversity in terms of robot physical size and shape, environmental interaction and implementation platform. Thus, it is envisioned that component-based software engineering will be the suitable way to promote software reuse in AMR systems, with consideration of the general requirements to be self-contained, platform-independent and real-time predictable. A framework for component-oriented programming for AMR software development using the PECOS component model is proposed in this paper. The main features of this framework are: (1) use of a graphical representation for component definition and composition; (2) targeting the C language for optimal code generation on resource-constrained micro-controllers; and (3) minimal requirements for run-time support. Real-time implementation indicates that the PECOS component model, together with the proposed framework, is suitable for software development of resource-constrained embedded AMR systems.

  14. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering among scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
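
    As a hedged illustration of that "gold standard" style of test (not one of Fluidity's actual tests), the snippet below is a pytest-style regression check comparing a trivial numerical solve of an exponential-decay equation against its analytical solution within a tolerance.

    ```python
    import numpy as np

    def solve_decay(y0: float, k: float, t_end: float, steps: int) -> float:
        """Explicit Euler solution of dy/dt = -k*y, a stand-in for a real solver."""
        dt = t_end / steps
        y = y0
        for _ in range(steps):
            y -= k * y * dt
        return y

    def test_decay_matches_analytical_solution():
        """Code verification: numerical result must agree with y0*exp(-k*t)."""
        y0, k, t_end = 1.0, 2.0, 1.0
        numerical = solve_decay(y0, k, t_end, steps=20000)
        analytical = y0 * np.exp(-k * t_end)
        assert abs(numerical - analytical) / analytical < 1e-3

    test_decay_matches_analytical_solution()
    print("analytical comparison test passed")
    ```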

  15. Computer software summaries. Numbers 1 through 423

    International Nuclear Information System (INIS)

    1979-09-01

    The National Energy Software Center (NESC) serves as the software exchange and information center for the US Department of Energy and the Nuclear Regulatory Commission. A major activity of the Center is the preparation and publication of two reports issued periodically - the Center's compilation of program abstracts, ANL-7411, and this software summaries report, ANL-8040. The abstracts describe the software packages available in the software exchange library maintained and distributed by the Center. The summaries describe agency-sponsored software that is at the specification stage, under development, being checked out, in use, or available at agency offices, laboratories, and contractor installations. Summaries describe software that is not included in the NESC library due to its preliminary status or because it is believed to be of limited interest. The purpose of the summaries report is to keep agency and contractor personnel informed as to the existence, status, and availability of computer programs within the agency, and thereby minimize duplication costs and maximize the value of agency software development efforts

  16. Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.

    Science.gov (United States)

    Lovell, Ashley C., Comp.

    Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…

  17. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (Hak Atas Kekayaan Intelektual, HAKI) has given rise to a new alternative: the use of open source software. The use of open source software is spreading in step with current global issues in Information and Communication Technology (ICT). Some organizations and companies have begun to take open source software into consideration. There are many notions about open source software, ranging from software that is free of charge to software that is unlicensed. Not all of the claims about open source software are true, so it is necessary to introduce the concept of open source software, starting from its history, its licenses and how to choose a license, as well as the considerations involved in choosing from the open source software that is available. Keywords: License, Open Source, HAKI

  18. Design evolution and verification of the general-purpose heat source

    International Nuclear Information System (INIS)

    Schock, A.

    The General-Purpose Heat Source (GPHS) is a radioisotope heat source for use in space power systems. It employs a modular design, to make it adaptable to a wide range of energy conversion systems and power levels. Each 250 W module is completely autonomous, with its own passive safety provisions to prevent fuel release under all abort modes, including atmospheric reentry and earth impact. Prior development tests had demonstrated good impact survival as long as the iridium fuel capsules retained their ductility. This requires high impact temperatures, typically above 900 °C, and reasonably fine grain size, which in turn requires avoidance of excessive operating temperatures and reentry temperatures. These three requirements - on operating, reentry, and impact temperatures - are in mutual conflict, since thermal design changes to improve any one of these temperatures tend to worsen one or both of the others. This conflict creates a difficult design problem, which for a time threatened the success of the program. The present paper describes how this problem was overcome by successive design revisions, supplemented by thermal analyses and confirmatory vibration and impact tests; and how this may be achieved while raising the specific power of the GPHS to 83 W/lb, a 50% improvement over previously flown radioisotope heat sources

  19. Proceedings of the Fifteenth Annual Software Engineering Workshop

    Science.gov (United States)

    1990-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by GSFC and created for the purpose of investigating the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. Fifteen papers were presented at the Fifteenth Annual Software Engineering Workshop in five sessions: (1) SEL at age fifteen; (2) process improvement; (3) measurement; (4) reuse; and (5) process assessment. The sessions were followed by two panel discussions: (1) experiences in implementing an effective measurement program; and (2) software engineering in the 1980's. A summary of the presentations and panel discussions is given.

  20. Low-cost general purpose spectral display unit using an IBM PC

    International Nuclear Information System (INIS)

    Robinson, S.L.

    1985-10-01

    Many physics experiments require acquisition and analysis of spectral data. Commercial minicomputer-based multichannel analyzers collect detected counts at various energies, create a histogram of the counts in memory, and display the resultant spectra. They acquire data and provide the user-to-display interface. The system discussed separates these functions into the three modular components of data acquisition, storage, and display. This decoupling of functions allows the experimenter to use any number of detectors for data collection before forwarding up to 64 spectra to the display unit, thereby increasing data throughput over that available with commercial systems. An IBM PC was chosen for the low-cost, general purpose display unit. Up to four spectra may be displayed simultaneously in different colors. The histogram saves 1024 channels per detector, 640 of which may be distinctly displayed per spectrum. The IEEE-488 standard provides the data path between the IBM PC and the data collection unit. Data is sent to the PC under interrupt control, using direct memory access. Display manipulations available via keyboard are also discussed
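
    As a hedged, modern-day sketch of the display idea (several spectra of 1024 channels drawn simultaneously in different colours), the snippet below uses NumPy and Matplotlib; the synthetic spectra are invented placeholders for data that would arrive over the IEEE-488 bus.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    CHANNELS = 1024
    channels = np.arange(CHANNELS)

    # Four synthetic spectra: Gaussian peaks on a noisy background (placeholder data).
    rng = np.random.default_rng(0)
    peaks = (200, 400, 600, 800)
    spectra = [
        rng.poisson(5, CHANNELS) + 200 * np.exp(-0.5 * ((channels - peak) / 15.0) ** 2)
        for peak in peaks
    ]

    colors = ["red", "green", "blue", "orange"]
    for spectrum, color, peak in zip(spectra, colors, peaks):
        plt.plot(channels, spectrum, color=color, label=f"detector @ ch {peak}")

    plt.xlabel("channel")
    plt.ylabel("counts")
    plt.legend()
    plt.title("Simultaneous display of four spectra")
    plt.show()
    ```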

  1. Geometric correction of radiographic images using general purpose image processing program

    International Nuclear Information System (INIS)

    Kim, Eun Kyung; Cheong, Ji Seong; Lee, Sang Hoon

    1994-01-01

    The present study was undertaken to compare images geometrically corrected with general-purpose image processing programs for the Apple Macintosh II computer (NIH Image, Adobe Photoshop) with images standardized by an individualized, custom-fabricated alignment instrument. Two non-standardized periapical films with the XCP film holder only were taken of the lower molar region of 19 volunteers. Two standardized periapical films with a customized XCP film holder, with impression material on the bite-block, were taken for each person. Geometric correction was performed with the Adobe Photoshop and NIH Image programs. Specifically, the arbitrary image rotation function of 'Adobe Photoshop' and the subtraction-with-transparency function of 'NIH Image' were utilized. The standard deviations of the grey values of the subtracted images were used to measure image similarity. The average standard deviation of the grey values of the subtracted images of the standardized group was slightly lower than that of the corrected group. However, the difference was found to be statistically insignificant (p>0.05). It is considered that the 'NIH Image' and 'Adobe Photoshop' programs can be used for the correction of non-standardized films taken with the XCP film holder in the lower molar region.
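
    The similarity measure used here, the standard deviation of grey values in the subtracted image, is easy to reproduce. The hedged NumPy/SciPy sketch below applies an arbitrary rotation as a stand-in for the Photoshop rotation tool and reports that statistic on purely synthetic images (none of the values come from the study).

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)

    # Synthetic "baseline" radiograph and a follow-up taken with a small rotation error.
    baseline = rng.normal(128, 20, size=(256, 256))
    followup = ndimage.rotate(baseline, angle=3.0, reshape=False, mode="nearest")

    def subtraction_sd(img_a: np.ndarray, img_b: np.ndarray) -> float:
        """Standard deviation of grey values in the subtracted image (lower = more similar)."""
        return float(np.std(img_a - img_b))

    # Geometric correction: rotate the follow-up back before subtracting.
    corrected = ndimage.rotate(followup, angle=-3.0, reshape=False, mode="nearest")

    print("uncorrected SD:", round(subtraction_sd(baseline, followup), 2))
    print("corrected SD:  ", round(subtraction_sd(baseline, corrected), 2))
    ```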

  2. How did the General Purpose Technology ’Electricity’ contribute to the Second Industrial Revolution (I): The Power Engines.

    NARCIS (Netherlands)

    van der Kooij, B.J.G.

    2016-01-01

    The concept of the General Purpose Technology (GPT) of the late 1990s is a culmination of many evolutionary views in innovation-thinking. By definition the GPT considers the technical, social, and economic effects of meta-technologies like steam-technology and electric technology. This paper uses

  3. RUMD: A general purpose molecular dynamics package optimized to utilize GPU hardware down to a few thousand particles

    DEFF Research Database (Denmark)

    Bailey, Nicholas; Ingebrigtsen, Trond; Hansen, Jesper Schmidt

    2017-01-01

    RUMD is a general purpose, high-performance molecular dynamics (MD) simulation package running on graphical processing units (GPUs). RUMD addresses the challenge of utilizing the many-core nature of modern GPU hardware when simulating small to medium system sizes (roughly from a few thousand up

  4. 45 CFR 84.1 - Purpose.

    Science.gov (United States)

    2010-10-01

    Department of Health and Human Services, General Administration; Nondiscrimination on the Basis of Handicap in Programs or Activities Receiving Federal Financial Assistance; General Provisions; § 84.1 Purpose. The...

  5. Space-Based Reconfigurable Software Defined Radio Test Bed Aboard International Space Station

    Science.gov (United States)

    Reinhart, Richard C.; Lux, James P.

    2014-01-01

    The National Aeronautics and Space Administration (NASA) recently launched a new software defined radio research test bed to the International Space Station. The test bed, sponsored by the Space Communications and Navigation (SCaN) Office within NASA, is referred to as the SCaN Testbed. The SCaN Testbed is a highly capable communications system, composed of three software defined radios, integrated into a flight system, and mounted to the truss of the International Space Station. Software defined radios offer the future promise of in-flight reconfigurability, autonomy, and eventually cognitive operation. The adoption of software defined radios offers space missions a new way to develop and operate space transceivers for communications and navigation. Reconfigurable or software defined radios with communications and navigation functions implemented in software or VHDL (VHSIC Hardware Description Language) provide the capability to change the functionality of the radio during development or after launch. The ability to change the operating characteristics of a radio through software once deployed to space offers the flexibility to adapt to new science opportunities, recover from anomalies within the science payload or communication system, and potentially reduce development cost and risk by adapting generic space platforms to meet specific mission requirements. The software defined radios on the SCaN Testbed are each compliant to NASA's Space Telecommunications Radio System (STRS) Architecture. The STRS Architecture is an open, non-proprietary architecture that defines interfaces for the connections between radio components. It provides an operating environment to abstract the communication waveform application from the underlying platform-specific hardware such as digital-to-analog converters, analog-to-digital converters, oscillators, RF attenuators, automatic gain control circuits, FPGAs, general-purpose processors, etc., and the interconnections among

  6. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    Full Text Available The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today, (2) high-integrity systems through the C-by-C (Correctness-by-Construction) methodology, and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature through the Internet, in publications and in presentations at events. Among the research results it was found that: (1) there is increasing dependence of nations, companies and people on software systems, (2) there is growing demand on software engineering to increase social trust in software systems, (3) methodologies exist, such as C-by-C, that can provide that level of trust, (4) formal methods constitute a principle of computer science that software engineering can apply to perform reliable processes in software development, (5) software users have the responsibility to demand reliable software products, and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods, (2) formal methods provide an unprecedented ability to increase trust in the exactitude of software products, and (3) through the development of new methodologies and tools, cost is no longer a disadvantage for the application of formal methods.

  7. The social disutility of software ownership.

    Science.gov (United States)

    Douglas, David M

    2011-09-01

    Software ownership allows the owner to restrict the distribution of software and to prevent others from reading the software's source code and building upon it. However, free software is released to users under software licenses that give them the right to read the source code, modify it, reuse it, and distribute the software to others. Proponents of free software such as Richard M. Stallman and Eben Moglen argue that the social disutility of software ownership is a sufficient justification for prohibiting it. This social disutility includes the social instability of disregarding laws and agreements covering software use and distribution, inequality of software access, and the inability to help others by sharing software with them. Here I consider these and other social disutility claims against withholding specific software rights from users, in particular, the rights to read the source code, duplicate, distribute, modify, imitate, and reuse portions of the software within new programs. I find that generally while withholding these rights from software users does cause some degree of social disutility, only the rights to duplicate, modify and imitate cannot legitimately be denied to users on this basis. The social disutility of withholding the rights to distribute the software, read its source code and reuse portions of it in new programs is insufficient to prohibit software owners from denying them to users. A compromise between the software owner and user can minimise the social disutility of withholding these particular rights from users. However, the social disutility caused by software patents is sufficient for rejecting such patents as they restrict the methods of reducing social disutility possible with other forms of software ownership.

  8. Software architecture as a freedom for 3D content providers and users along with independency on purposes and used devices

    Science.gov (United States)

    Sultana, Razia; Christ, Andreas; Meyrueis, Patrick

    2014-05-01

Improvements in the hardware and software of communication devices have made it possible to run Virtual Reality (VR) and Augmented Reality (AR) applications on them. Nowadays it is possible to overlay synthetic information on real images, or even to play 3D online games on smart phones and other mobile devices, so the use of 3D data for business and especially for education purposes is ubiquitous. Because mobile phones are always at hand and always ready to use, they are considered the most promising communication devices, and the total number of mobile phone users is increasing all over the world every day, which makes mobile phones the most suitable device for reaching a huge number of end clients for either education or business purposes. Different standards, protocols and specifications exist to establish communication among different devices, but no initiative has so far ensured that the data sent through this communication process can be understood and used by the destination device. Since not every device can handle every kind of 3D data format, and it is not realistic to maintain a different version of the same data for each destination device, a generally applicable solution is necessary. The architecture proposed in this paper provides device- and purpose-independent visibility of 3D data, any time and anywhere, to the right person and in a suitable format. No solution is without limitations. The architecture is implemented in a prototype for an experimental validation, which also shows the difference between theory and practice.

  9. Generic functional requirements for a NASA general-purpose data base management system

    Science.gov (United States)

    Lohman, G. M.

    1981-01-01

Generic functional requirements for a general-purpose, multi-mission data base management system (DBMS) for application to remotely sensed scientific data bases are detailed. The motivation for utilizing DBMS technology in this environment is explained. The major requirements include: (1) a DBMS for scientific observational data; (2) a multi-mission capability; (3) user friendliness; (4) extensive and integrated information about data; (5) robust languages for defining data structures and formats; (6) scientific data types and structures; (7) flexible physical access mechanisms; (8) ways of representing spatial relationships; (9) a high-level, nonprocedural, interactive query and data manipulation language; (10) data base maintenance utilities; (11) high-rate input/output and large data volume storage; and (12) adaptability to a distributed data base and/or data base machine configuration. Detailed functions are specified in a top-down hierarchic fashion. Implementation, performance, and support requirements are also given.

  10. SQuAVisiT : A Software Quality Assessment and Visualisation Toolset

    NARCIS (Netherlands)

    Roubtsov, Serguei; Telea, Alexandru; Holten, Danny

    2007-01-01

    Software quality assessment of large COBOL industrial legacy systems, both for maintenance or migration purposes, mounts a serious challenge. We present the Software Quality Assessment and Visualisation Toolset (SQuAVisiT), which assists users in performing the above task. First, it allows a fully

  11. SQuAVisiT: a software quality assessment and visualisation toolset

    NARCIS (Netherlands)

    Roubtsov, S.; Telea, A.C.; Holten, D.H.R.

    2007-01-01

    Software quality assessment of large COBOL industrial legacy systems, both for maintenance or migration purposes, mounts a serious challenge. We present the software quality assessment and visualisation toolset (SQuAVisiT), which assists users in performing the above task. First, it allows a fully

  12. Design of the SLAC RCE Platform: A General Purpose ATCA Based Data Acquisition System

    International Nuclear Information System (INIS)

    Herbst, R.; Claus, R.; Freytag, M.; Haller, G.; Huffer, M.; Maldonado, S.; Nishimura, K.; O'Grady, C.; Panetta, J.; Perazzo, A.; Reese, B.; Ruckman, L.; Thayer, J.G.; Weaver, M.

    2015-01-01

    The SLAC RCE platform is a general purpose clustered data acquisition system implemented on a custom ATCA compliant blade, called the Cluster On Board (COB). The core of the system is the Reconfigurable Cluster Element (RCE), which is a system-on-chip design based upon the Xilinx Zynq family of FPGAs, mounted on custom COB daughter-boards. The Zynq architecture couples a dual core ARM Cortex A9 based processor with a high performance 28nm FPGA. The RCE has 12 external general purpose bi-directional high speed links, each supporting serial rates of up to 12Gbps. 8 RCE nodes are included on a COB, each with a 10Gbps connection to an on-board 24-port Ethernet switch integrated circuit. The COB is designed to be used with a standard full-mesh ATCA backplane allowing multiple RCE nodes to be tightly interconnected with minimal interconnect latency. Multiple shelves can be clustered using the front panel 10-gbps connections. The COB also supports local and inter-blade timing and trigger distribution. An experiment specific Rear Transition Module adapts the 96 high speed serial links to specific experiments and allows an experiment-specific timing and busy feedback connection. This coupling of processors with a high performance FPGA fabric in a low latency, multiple node cluster allows high speed data processing that can be easily adapted to any physics experiment. RTEMS and Linux are both ported to the module. The RCE has been used or is the baseline for several current and proposed experiments (LCLS, HPS, LSST, ATLAS-CSC, LBNE, DarkSide, ILC-SiD, etc).

  13. Bilingual language control and general purpose cognitive control among individuals with bilingual aphasia: evidence based on negative priming and flanker tasks.

    Science.gov (United States)

    Dash, Tanya; Kar, Bhoomika R

    2014-01-01

Bilingualism results in an added advantage with respect to cognitive control. The interaction between bilingual language control and general purpose cognitive control systems can also be understood by studying executive control among individuals with bilingual aphasia. Objectives: The current study examined the subcomponents of cognitive control in bilingual aphasia. A case study approach was used to investigate whether cognitive control and language control are two separate systems and how factors related to bilingualism interact with control processes. Four individuals with bilingual aphasia performed a language background questionnaire, picture description task, and two experimental tasks (nonlinguistic negative priming task and linguistic and nonlinguistic versions of flanker task). A descriptive approach was used to analyse the data using reaction time and accuracy measures. The cumulative distribution function plots were used to visualize the variations in performance across conditions. The results highlight the distinction between general purpose cognitive control and bilingual language control mechanisms. All participants showed predominant use of the reactive control mechanism to compensate for the limited resources system. Independent yet interactive systems for bilingual language control and general purpose cognitive control were postulated based on the experimental data derived from individuals with bilingual aphasia.

  14. Why Free Software Matters for Literacy Educators.

    Science.gov (United States)

    Brunelle, Michael D.; Bruce, Bertram C.

    2002-01-01

    Notes that understanding what "free software" means and its implications for access and use of new technologies is an important component of the new literacies. Concludes that if free speech and free press are essential to the development of a general literacy, then free software can promote the development of computer literacy. (SG)

  15. Can Universities Profit from General Purpose Inventions?

    DEFF Research Database (Denmark)

    Barirani, Ahmad; Beaudry, Catherine; Agard, Bruno

    2017-01-01

    The lack of control over downstream assets can hinder universities’ ability to extract rents from their inventive activities. We explore this possibility by assessing the relationship between invention generality and renewal decisions for a sample of Canadian nanotechnology patents. Our results s...

  16. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    Energy Technology Data Exchange (ETDEWEB)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  17. RoboCon: A general purpose telerobotic control center

    Energy Technology Data Exchange (ETDEWEB)

    Draper, J.V.; Noakes, M.W. [Oak Ridge National Lab., TN (United States). Robotics and Process Systems Div.; Schempf, H. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Blair, L.M. [Human Machine Interfaces, Inc., Knoxville, TN (United States)

    1997-02-01

    This report describes human factors issues involved in the design of RoboCon, a multi-purpose control center for use in US Department of Energy remote handling applications. RoboCon is intended to be a flexible, modular control center capable of supporting a wide variety of robotic devices.

  18. RoboCon: A general purpose telerobotic control center

    International Nuclear Information System (INIS)

    Draper, J.V.; Noakes, M.W.; Blair, L.M.

    1997-01-01

    This report describes human factors issues involved in the design of RoboCon, a multi-purpose control center for use in US Department of Energy remote handling applications. RoboCon is intended to be a flexible, modular control center capable of supporting a wide variety of robotic devices

  19. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section briefly outlines the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  20. Crisis management for software development and knowledge transfer

    CERN Document Server

    Zykov, Sergey V

    2016-01-01

This well-structured book discusses lifecycle optimization of software projects for crisis management by means of software engineering methods and tools. Its outcomes are based on lessons learned from the software engineering crisis which started in the 1960s. The book presents a systematic approach to overcoming the crisis in software engineering, which depends not only on technology-related but also on human-related factors. It proposes an adaptive methodology for software product development, which optimizes the software product lifecycle in order to avoid “local” crises of software production. The general lifecycle pattern and its stages are discussed, and their impact on the time and budget of software product development is analyzed. The book identifies key advantages and disadvantages of the various models selected and concludes that there is no “silver bullet”, or universal model, which suits all software products equally well. It approaches software architecture in terms of process, dat...

  1. Software for natural gas pipeline design and simulation (gaspisim ...

    African Journals Online (AJOL)

Software for natural gas pipeline design and simulation (gaspisim) ... This paper focuses on the development of software for optimum design and simulation of natural gas pipeline. General ...

  2. Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.

    Science.gov (United States)

    Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh

    2018-01-01

Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with their needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire and analyzed using SPSS version 16.0. In the view of staff, among the advantages of coding software, reducing coding time had the highest average (mean = 3.82) while cost reduction had the lowest (mean = 3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of the use of coding software. In general, the results of this study showed that coding software has deficiencies in some cases. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, help in selecting proper codes through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.

  3. Advanced Transport Operating System (ATOPS) control display unit software description

    Science.gov (United States)

    Slominski, Christopher J.; Parks, Mark A.; Debure, Kelly R.; Heaphy, William J.

    1992-01-01

The software created for the Control Display Units (CDUs), used for the Advanced Transport Operating Systems (ATOPS) project on the Transport Systems Research Vehicle (TSRV), is described. Module descriptions are presented in a standardized format which contains module purpose, calling sequence, a detailed description, and global references. The global reference section includes subroutines, functions, and common variables referenced by a particular module. The CDUs, one for the pilot and one for the copilot, are used for flight management purposes. Operations performed with the CDU affect the aircraft's guidance, navigation, and display software.

  4. Teaching English for Specific Purposes

    Directory of Open Access Journals (Sweden)

    Nijolė Netikšienė

    2011-04-01

Full Text Available Teaching English for Specific Purposes and General English is analysed in the article. The scientific approach of M. Rosenberg is presented. The experience of teaching English for Specific Purposes at VGTU is also presented. The ideas and teaching methods from the classes of General English can be transferred to the classes of English for Specific Purposes.

  5. The ICVSIE: A General Purpose Integral Equation Method for Bio-Electromagnetic Analysis.

    Science.gov (United States)

    Gomez, Luis J; Yucel, Abdulkadir C; Michielssen, Eric

    2018-03-01

    An internally combined volume surface integral equation (ICVSIE) for analyzing electromagnetic (EM) interactions with biological tissue and wide ranging diagnostic, therapeutic, and research applications, is proposed. The ICVSIE is a system of integral equations in terms of volume and surface equivalent currents in biological tissue subject to fields produced by externally or internally positioned devices. The system is created by using equivalence principles and solved numerically; the resulting current values are used to evaluate scattered and total electric fields, specific absorption rates, and related quantities. The validity, applicability, and efficiency of the ICVSIE are demonstrated by EM analysis of transcranial magnetic stimulation, magnetic resonance imaging, and neuromuscular electrical stimulation. Unlike previous integral equations, the ICVSIE is stable regardless of the electric permittivities of the tissue or frequency of operation, providing an application-agnostic computational framework for EM-biomedical analysis. Use of the general purpose and robust ICVSIE permits streamlining the development, deployment, and safety analysis of EM-biomedical technologies.

  6. A software for parameter estimation in dynamic models

    Directory of Open Access Journals (Sweden)

    M. Yuceer

    2008-12-01

Full Text Available A common problem in dynamic systems is to determine the parameters in an equation used to represent experimental data. The goal is to determine the values of the model parameters that provide the best fit to measured data, generally based on some type of least-squares or maximum-likelihood criterion. In the most general case, this requires the solution of a nonlinear and frequently non-convex optimization problem. Some of the available software packages lack generality, while others do not provide ease of use. A user-interactive parameter estimation software tool was needed for identifying kinetic parameters. In this work we developed an integration-based optimization approach to provide a solution to such problems. For easy implementation of the technique, a parameter estimation software package (PARES) has been developed in the MATLAB environment. When tested with extensive example problems from the literature, the suggested approach is shown to provide good agreement between predicted and observed data with relatively little computing time and few iterations.
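    The integration-based approach is not spelled out in the abstract; as a hedged illustration of the general problem it addresses, the Python sketch below (not PARES itself, which is a MATLAB tool) fits two kinetic parameters of a toy ODE model to hypothetical measurements by nonlinear least squares:

```python
# Minimal sketch (not PARES): least-squares estimation of kinetic parameters
# in a simple series reaction A -> B -> C, using hypothetical measurements.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def model(t, y, k1, k2):
    # dA/dt = -k1*A, dB/dt = k1*A - k2*B
    A, B = y
    return [-k1 * A, k1 * A - k2 * B]

def residuals(params, t_obs, b_obs):
    k1, k2 = params
    sol = solve_ivp(model, (t_obs[0], t_obs[-1]), [1.0, 0.0],
                    t_eval=t_obs, args=(k1, k2))
    return sol.y[1] - b_obs          # misfit between predicted and observed B(t)

# Hypothetical observations of species B at a few time points.
t_obs = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
b_obs = np.array([0.0, 0.28, 0.40, 0.42, 0.25])

fit = least_squares(residuals, x0=[1.0, 1.0], args=(t_obs, b_obs),
                    bounds=(0.0, np.inf))
print("estimated k1, k2:", fit.x)
```

    The inner ODE integration inside the residual function is the "integration-based" part of such an approach; the outer optimizer then adjusts the parameters until predicted and observed curves agree.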

  7. View of software for HEP experiments

    Energy Technology Data Exchange (ETDEWEB)

    Johnstad, H.; Lebrun, P.; Lessner, E.S.; Montgomery, H.E.

    1986-05-01

A view of the software structure typical of a High Energy Physics experiment is given and the availability of general software modules in most of the important regions is discussed. The aim is to provide a framework for discussion of capabilities and inadequacies and thereby define areas where effort should be assigned, and perhaps also to serve as a useful source document for the newcomer to High Energy Physics. 74 refs.

  8. View of software for HEP experiments

    International Nuclear Information System (INIS)

    Johnstad, H.; Lebrun, P.; Lessner, E.S.; Montgomery, H.E.

    1986-05-01

A view of the software structure typical of a High Energy Physics experiment is given and the availability of general software modules in most of the important regions is discussed. The aim is to provide a framework for discussion of capabilities and inadequacies and thereby define areas where effort should be assigned, and perhaps also to serve as a useful source document for the newcomer to High Energy Physics. 74 refs

  9. 12 CFR 41.1 - Purpose.

    Science.gov (United States)

    2010-01-01

    ... COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY FAIR CREDIT REPORTING General Provisions § 41.1 Purpose. (a) Purpose. The purpose of this part is to establish standards for national banks regarding consumer report information. In addition, the purpose of this part is to specify the extent to which...

  10. The ATLAS Trigger Simulation with Legacy Software

    CERN Document Server

    Bernius, Catrin; The ATLAS collaboration

    2017-01-01

Physics analyses at the LHC require accurate simulations of the detector response and the event selection processes, generally done with the most recent software releases. The trigger response simulation is crucial for the determination of overall selection efficiencies and signal sensitivities and should be done with the same software release with which the data were recorded. This potentially requires running software dating many years back, the so-called legacy software. Therefore, having a strategy for running legacy software in a modern environment becomes essential when data simulated for past years start to represent a sizeable fraction of the total. The requirements and possibilities for such a simulation scheme within the ATLAS software framework were examined and a proof-of-concept simulation chain has been successfully implemented. One of the greatest challenges was the choice of a data format which promises long-term compatibility with old and new software releases. Over the time periods envisaged, data...

  11. Elucidation of covariant proofs in general relativity: example of the use of algebraic software in the shear-free conjecture in MAPLE

    Science.gov (United States)

    Huf, P. A.; Carminati, J.

    2018-01-01

In this paper we explore the use of a new algebraic software package in providing independent covariant proof of a conjecture in general relativity. We examine the proof of two sub-cases of the shear-free conjecture, σ = 0 ⇒ ωΘ = 0, by Senovilla et al. (Gen. Relativ. Gravit 30:389-411, 1998): case 1, for dust; case 2, for acceleration parallel to vorticity. We use TensorPack, a software package recently released for the Maple environment. In this paper, we briefly summarise the key features of the software and then demonstrate its use by providing and discussing examples of independent proofs of the paper in question. A full set of our completed proofs is available online at http://www.bach2roq.com/science/maths/GR/ShearFreeProofs.html. We are in agreement with the equations provided in the original paper, noting that the proofs often require many steps. Furthermore, in our proofs we provide fully worked algebraic steps in such a way that the proofs can be examined systematically while avoiding hand calculation. It is hoped that the elucidated proofs may be of use to other researchers in verifying the algebraic consistency of the expressions in the paper in question, as well as in related literature. Furthermore, we suggest that the appropriate use of algebraic software in covariant formalism could be useful for developing research and teaching in GR theory.

  12. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure. Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture–recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided. The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease of use is generally getting better, it does not replace the need for understanding the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)

  13. 37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.

    Science.gov (United States)

    2010-07-01

    ... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... packaging that contains the computer program which is lent by a nonprofit library for nonprofit purposes. (b...

  14. Factors negatively influencing knowledge sharing in software development

    Directory of Open Access Journals (Sweden)

    Lucas T. Khoza

    2017-07-01

Objective: This study seeks to identify factors that negatively influence knowledge sharing in software development in the developing country context. Method: Expert sampling, a subcategory of purposive sampling, was employed to extract information, views and opinions from experts in the field of information and communication technology, more specifically from those who are involved in software development projects. Four Johannesburg-based software developing organisations listed on the Johannesburg Stock Exchange (JSE), South Africa, participated in this research study. Quantitative data were collected using an online questionnaire with closed-ended questions. Results: The findings of this research reveal that job security, motivation, time constraints, physiological factors, communication, resistance to change and rewards are the core factors negatively influencing knowledge sharing in software developing organisations. Conclusions: An improved understanding of the factors negatively influencing knowledge sharing is expected to assist software developing organisations in closing the gap for software development projects failing to meet the triple constraint of time, cost and scope.

  15. Open Source Software Acquisition

    DEFF Research Database (Denmark)

    Holck, Jesper; Kühn Pedersen, Mogens; Holm Larsen, Michael

    2005-01-01

Lately we have seen a growing interest from both public and private organisations to adopt Open Source Software (OSS), not only for a few specific applications but also on a more general level throughout the organisation. As a consequence, the organisations' decisions on adoption of OSS are becoming...

  16. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    Science.gov (United States)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exists with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording in a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow-driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework. The Actor Framework provides a level of code reuse and extensibility that has previously been difficult
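    The Actor Framework mentioned above is LabVIEW-specific; as a rough, hedged analogue of the message-passing modularity it provides, the Python sketch below shows independent components (hypothetical Recorder and Display modules, not NDAS code) that communicate only through queued messages, so a facility can enable just the modules it needs:

```python
# Rough analogue in Python (not LabVIEW) of actor-style modularity: each
# component runs on its own thread and reacts only to messages in its inbox.
import queue
import threading

class Actor:
    def __init__(self):
        self.inbox = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self._thread.start()

    def send(self, msg):
        self.inbox.put(msg)

    def stop(self):
        self.inbox.put(None)          # shutdown sentinel
        self._thread.join()

    def _run(self):
        while True:
            msg = self.inbox.get()
            if msg is None:
                break
            self.handle(msg)

    def handle(self, msg):
        raise NotImplementedError

class Recorder(Actor):               # hypothetical recording module
    def handle(self, msg):
        print("recording sample:", msg)

class Display(Actor):                # hypothetical real-time display module
    def handle(self, msg):
        print("plotting sample:", msg)

if __name__ == "__main__":
    modules = [Recorder(), Display()]        # choose modules per facility
    for m in modules:
        m.start()
    for sample in (1.20, 1.31, 1.27):        # stand-in for an acquisition loop
        for m in modules:
            m.send(sample)
    for m in modules:
        m.stop()
```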

  17. JMorph: Software for performing rapid morphometric measurements on digital images of fossil assemblages

    Science.gov (United States)

    Lelièvre, Peter G.; Grey, Melissa

    2017-08-01

    Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
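    JMorph itself is a Java application; as an illustration of the kind of size measurements that become possible once an outline has been digitized manually, the following Python sketch (not JMorph code) computes the perimeter and enclosed area of a digitized outline using the shoelace formula, with a hypothetical pixel-to-millimetre scale:

```python
# Illustration only (not JMorph): basic size measurements from a manually
# digitized outline given as (x, y) points in image coordinates.
import numpy as np

def outline_metrics(points, scale=1.0):
    """Perimeter and area of a closed outline.

    points : (N, 2) sequence of digitized outline coordinates (pixels)
    scale  : physical length per pixel (e.g. mm/pixel from a scale bar)
    """
    p = np.asarray(points, dtype=float)
    nxt = np.roll(p, -1, axis=0)                 # close the polygon
    perimeter = np.sum(np.hypot(*(nxt - p).T)) * scale
    # Shoelace formula for the enclosed area.
    area = 0.5 * abs(np.sum(p[:, 0] * nxt[:, 1] - nxt[:, 0] * p[:, 1])) * scale**2
    return perimeter, area

# Hypothetical outline: a 10 x 5 pixel rectangle digitized at its corners.
pts = [(0, 0), (10, 0), (10, 5), (0, 5)]
print(outline_metrics(pts, scale=0.1))           # -> (3.0, 0.5) in mm and mm^2
```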

  18. Integration of software for scenario exploration

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    1999-03-01

The scenario exploration methodology using shadow models is a variation of the environmental simulation method. A key aspect of the scenario exploration is the use of shadow models, which do not correspond to any specific assumptions about physical processes but instead abstract their general features relevant to the effects on nuclide transport, so that the benefit of using a simulation approach can be maximized. In developing the shadow models, all the modelling options that have not yet been ruled out by the experts are kept and parametrized in a very general framework. This, in turn, enables one to treat the various types of uncertainty in performance assessment, i.e., scenario uncertainty, conceptual model uncertainty, mathematical model uncertainty and parameter uncertainty, in a common framework of uncertainty / sensitivity analysis. The objective of the current study is to review and modify the tools, which have been developed separately and are therefore not fully consistent with one another, and to integrate them into a unified methodology and software. The tasks for this are: (1) modification / integration of tools for scenario exploration of nuclide transport in the EBS and the near-field host rock; (2) verification of the modified and integrated software; (3) installation of the software at JNC. (author)

  19. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    Science.gov (United States)

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http://melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and LINUX. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
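    As a textual companion to the graphical tool, the Python sketch below shows the global-alignment (Needleman-Wunsch) recurrence from which such a DP scoring matrix is built; the simple match/mismatch scores and linear gap penalty stand in for the tool's configurable similarity matrix and gap opening/extension penalties:

```python
# Minimal global alignment (Needleman-Wunsch) score matrix, illustrating the
# DP recurrence the tool visualizes. Scores and gap penalty are placeholders
# for the configurable similarity matrix and gap costs.
def nw_score_matrix(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    F = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        F[i][0] = i * gap                        # leading gaps in b
    for j in range(1, m + 1):
        F[0][j] = j * gap                        # leading gaps in a
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i][j] = max(F[i - 1][j - 1] + s,   # align a[i-1] with b[j-1]
                          F[i - 1][j] + gap,     # gap in b
                          F[i][j - 1] + gap)     # gap in a
    return F

F = nw_score_matrix("GATTACA", "GCATGCU")
print("optimal alignment score:", F[-1][-1])
```

    Tracing back from the bottom-right cell through the maximizing choices recovers the optimal alignment(s) that the application displays.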

  20. Bilingual Language Control and General Purpose Cognitive Control among Individuals with Bilingual Aphasia: Evidence Based on Negative Priming and Flanker Tasks

    Science.gov (United States)

    Dash, Tanya; Kar, Bhoomika R.

    2014-01-01

    Background. Bilingualism results in an added advantage with respect to cognitive control. The interaction between bilingual language control and general purpose cognitive control systems can also be understood by studying executive control among individuals with bilingual aphasia. Objectives. The current study examined the subcomponents of cognitive control in bilingual aphasia. A case study approach was used to investigate whether cognitive control and language control are two separate systems and how factors related to bilingualism interact with control processes. Methods. Four individuals with bilingual aphasia performed a language background questionnaire, picture description task, and two experimental tasks (nonlinguistic negative priming task and linguistic and nonlinguistic versions of flanker task). Results. A descriptive approach was used to analyse the data using reaction time and accuracy measures. The cumulative distribution function plots were used to visualize the variations in performance across conditions. The results highlight the distinction between general purpose cognitive control and bilingual language control mechanisms. Conclusion. All participants showed predominant use of the reactive control mechanism to compensate for the limited resources system. Independent yet interactive systems for bilingual language control and general purpose cognitive control were postulated based on the experimental data derived from individuals with bilingual aphasia. PMID:24982591

  1. A general purpose subroutine for fast fourier transform on a distributed memory parallel machine

    Science.gov (United States)

    Dubey, A.; Zubair, M.; Grosch, C. E.

    1992-01-01

    One issue which is central in developing a general purpose Fast Fourier Transform (FFT) subroutine on a distributed memory parallel machine is the data distribution. It is possible that different users would like to use the FFT routine with different data distributions. Thus, there is a need to design FFT schemes on distributed memory parallel machines which can support a variety of data distributions. An FFT implementation on a distributed memory parallel machine which works for a number of data distributions commonly encountered in scientific applications is presented. The problem of rearranging the data after computing the FFT is also addressed. The performance of the implementation on a distributed memory parallel machine Intel iPSC/860 is evaluated.
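    The paper's implementation targets the Intel iPSC/860; the single-process numpy sketch below only illustrates the transpose-based idea that makes a row-block data distribution workable: each node transforms the rows it holds, the array is globally transposed, and the remaining dimension is then also transformed row-wise, so no node ever needs a complete column:

```python
# Single-process numpy illustration of the transpose-based scheme that
# underlies many distributed-memory 2D FFTs with a row-block distribution.
# This is a sketch of the idea, not the paper's iPSC/860 implementation.
import numpy as np

def fft2_transpose_based(x):
    y = np.fft.fft(x, axis=1)   # each "node" transforms its own local rows
    y = y.T.copy()              # stands in for the all-to-all transpose step
    y = np.fft.fft(y, axis=1)   # transform the other dimension, again row-wise
    return y.T                  # undo the transpose to restore the layout

a = np.random.rand(8, 8)
assert np.allclose(fft2_transpose_based(a), np.fft.fft2(a))
```

    On a real distributed-memory machine, the transpose step becomes an all-to-all communication, and supporting different input data distributions amounts to choosing where that redistribution happens and whether the rearranged output is transposed back.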

  2. OCRWM procedure for reporting software baseline change information

    International Nuclear Information System (INIS)

    1994-07-01

The purpose of this procedure is to establish a requirement and method for participant organizations to report software baseline change information to the M&O Configuration Management (CM) organization for inclusion in the OCRWM Configuration Information System (CIS). (The requirements for performing software configuration management (SCM) are found in the OCRWM Quality Assurance Requirements and Description (QARD) document and in applicable DOE orders, and not in this procedure.) This procedure provides a linkage between each participant's SCM system and the CIS, which may be accessed for identification, descriptive, and contact information pertaining to software released by a participant. Such information from the CIS will enable retrieval of details and copies of software code and documentation from the participant SCM system

  3. Presentation of the ASTRAL software; Presentation du logiciel ASTRAL

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

This report presents the ASTRAL software (ASTRAL stands for technical assistance in post-accidental radioprotection), which is intended to be used as a decision-aid tool in the case of an important release of radionuclides into the environment, by assessing radionuclide concentrations in different environments and food products, determining the potential exposure to irradiation of the concerned populations, foreseeing the evolution of the situation, and proposing different scenarios for the management of contaminated areas. The report describes the general operation of the software, presents the calculation module (main functionalities, concentration index calculation, dose calculation or radiological impact calculation, how countermeasures are taken into account), the databases (contextual data, data for radio-ecological calculations and for radiological calculations), and the software ergonomics (general principles, result selection and display, result printing and input data). It briefly evokes the development quality assurance and describes the software implementation architecture

  4. Aligning flexibility with uncertainty in software development arrangements through a contractual typology

    OpenAIRE

    Lichtenstein, Y.; Finkelstein, L.; Wyss, S.

    2018-01-01

Purpose: The purpose of this study is to identify a typology of procurement contracts in the context of software development projects that allows firms to align design flexibility with design uncertainty at the project level. The theoretical lenses of contract theory and software engineering are used to explain why the five archetypes in the proposed typology provide gradually increasing levels of design flexibility and to develop hypotheses about the associations between design flexibility...

  5. Innovation Initiatives in Large Software Companies

    DEFF Research Database (Denmark)

    Edison, Henry; Wang, Xiaofeng; Jabangwe, Ronald

    2018-01-01

Context: To keep the competitive advantage and adapt to changes in the market and technology, companies need to innovate in an organised, purposeful and systematic manner. However, due to their size and complexity, large companies tend to focus on the structure in maintaining their business, which can potentially lower their agility to innovate. Objective: The aims of this study are to provide an overview of the current research on innovation initiatives and to identify the challenges of implementing those initiatives in the context of large software companies. Method: The investigation ... empirical studies on innovation initiative in the context of large software companies. A total of 7 studies are conducted in the context of large software companies, which reported 5 types of initiatives: intrapreneurship, bootlegging, internal venture, spin-off and crowdsourcing. Our study offers three...

  6. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  7. TU-D-201-01: Definition and Purpose of End-Of-Life for Brachytherapy Devices and Software

    International Nuclear Information System (INIS)

    Melhus, C.

    2015-01-01

Brachytherapy devices and software are designed to last for a certain period of time. Due to a number of considerations, such as material factors, wear-and-tear, backwards compatibility, and others, they all reach a date when they are no longer supported by the manufacturer. Most of these products have a limited duration for their use, and the information is provided to the user at time of purchase. Because of issues or concerns determined by the manufacturer, certain products are retired sooner than the anticipated date, and the user is immediately notified. In these situations, the institution faces some difficult choices: remove these products from the clinic, or perform tests and continue their usage. Both of these choices come with a financial burden: replacing the product or assuming a potential medicolegal liability. This session will provide attendees with the knowledge and tools to make better decisions when facing these issues. Learning Objectives: Understand the meaning of “end-of-life” or “life expectancy” for brachytherapy devices and software. Review items (devices and software) affected by “end-of-life” restrictions. Learn how to effectively formulate “end-of-life” policies at your institution. Learn about possible implications of an “end-of-life” policy. Review other possible approaches to the “end-of-life” issue.

  8. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation because in the near future we will need to develop software for SATEX-II (the Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for SATEX-II; we also supported these methodologies with the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 Standards. Our outcomes, in general, may be used by teams who need to build small satellites and, in particular, they will be used when we build the on-board software applications for SATEX-II.

  9. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  10. ScienceSoft: Open software for open science

    CERN Document Server

    Di Meglio, Alberto

    2012-01-01

Most of the software developed today by research institutes, universities, research projects, etc. is typically stored in local source and binary repositories and available for the duration of a project's lifetime only. Finding software based on given functional characteristics is almost impossible, and binary packages are mostly available from local university or project repositories rather than from open source community repositories like Fedora/EPEL or Debian. Furthermore, general information about who develops, contributes to and, most importantly, uses a given software program is very difficult to find out, and yet the widespread availability of such information would give more visibility and credibility to the software products. The creation of links or relationships not only among pieces of software, but equally among the people interacting with the software across and beyond specific projects and communities, would foster a more active community and create the conditions for sharing ideas and skills, a ...

  11. General-purpose heat source project and space nuclear safety fuels program. Progress report, February 1980

    International Nuclear Information System (INIS)

    Maraman, W.J.

    1980-05-01

This formal monthly report covers the studies related to the use of 238PuO2 in radioisotopic power systems carried out for the Advanced Nuclear Systems and Projects Division of the Los Alamos Scientific Laboratory. The two programs involved are: General-Purpose Heat Source Development and Space Nuclear Safety and Fuels. Most of the studies discussed here are of a continuing nature. Results and conclusions described may change as the work continues. Published reference to the results cited in this report should not be made without the explicit permission of the person in charge of the work

  12. Screening tests in toxicity or drug effect studies with use of centrifichem general-purpose spectrophotometeric analyzer

    International Nuclear Information System (INIS)

    Nagy, B.; Bercz, J.P.

    1986-01-01

The CentrifiChem System 400 general-purpose spectrophotometric analyzer, which can process 30 samples simultaneously and read reactions within milliseconds, was used for toxicity studies. Organic and inorganic chemicals were screened for inhibitory action on the hydrolytic activity of sarcoplasmic reticulum (SR) Ca,Mg-ATPase and that of the sarcolemmal (SL) Na,K-ATPase or mitochondrial ATPase (M). SR and SL were prepared from rabbit muscles, Na,K-ATPase from pig kidneys, and M from pig hearts. The pseudosubstrates paranitrophenyl phosphate and 2,4-dinitrophenyl phosphate, both proven high-energy phosphate substitutes for ATPase-coupled ion transfer, were used. The reaction rates were followed spectrophotometrically at 405 nm by measuring the accumulation of yellow nitrophenolate ions. The reported calcium-transfer coupling ratio to hydrolysis of 2:1 was ascertained with the use of 45Ca in the case of SR. Inhibition constants (pI) on SR, SL, and M for the pseudosubstrate hydrolysis will be given for over 20 chemicals tested. The applicability of the system to general toxicity testing and to general cardio-effective drug screening will be presented

  13. Cross-border software development of health information system: A case study on project between India and Pakistan based on open source software

    OpenAIRE

    Sabir, Uzma

    2017-01-01

Global software development is a phenomenon that has received considerable interest from researchers during the past two decades. Several challenges have been identified, and approaches to deal with these challenges have been developed. Typically, western companies outsource their projects to countries where costs are lower and skilled professionals are easily available. The majority of these projects are developed for commercial purposes. However, software development projects between India and Pakis...

  14. General-purpose stepping motor-encoder positioning subsystem with standard asynchronous serial-line interface

    International Nuclear Information System (INIS)

    Stubblefield, F.W.; Alberi, J.L.

    1982-01-01

    A general-purpose mechanical positioning subsystem for open-loop control of experiment devices which have their positions established and read out by stepping motor-encoder combinations has been developed. The subsystem is to be used mainly for experiments to be conducted at the National Synchrotron Light Source at Brookhaven National Laboratory. The subsystem unit has been designed to be compatible with a wide variety of stepping motor and encoder types. The unit may be operated by any device capable of driving a standard RS-232-C asynchronous serial communication line. An informal survey has shown that several experiments at the Light Source will use one particular type of computer, operating system, and programming language. Accordingly, a library of subroutines compatible with this combination of computer system elements has been written to facilitate driving the positioning subsystem unit
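    The abstract specifies only that the unit is driven over a standard RS-232-C asynchronous serial line; the pyserial sketch below is purely illustrative, and both the port name and the ASCII command syntax are invented, since the actual command set belongs to the subsystem's own library of subroutines:

```python
# Hypothetical sketch of driving a serial-line positioning unit from a host.
# The port name and the "MOVE <axis> <steps>" command syntax are invented for
# illustration; the real protocol is defined by the subsystem itself.
import serial  # pyserial

with serial.Serial("/dev/ttyS0", baudrate=9600, timeout=1.0) as port:
    port.write(b"MOVE 1 +2000\r\n")      # request 2000 steps on axis 1
    reply = port.readline()              # e.g. a status word or encoder readback
    print("subsystem replied:", reply.decode(errors="replace").strip())
```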

  15. General-purpose heat source safety verification test series: SVT-11 through SVT-13

    International Nuclear Information System (INIS)

    George, T.G.; Pavone, D.

    1986-05-01

The General-Purpose Heat Source (GPHS) is a modular component of the radioisotope thermoelectric generator that will provide power for the Galileo and Ulysses (formerly ISPM) space missions. The GPHS provides power by transmitting the heat of 238Pu α-decay to an array of thermoelectric elements. Because the possibility of an orbital abort always exists, the heat source was designed and constructed to minimize plutonia release in any accident environment. The Safety Verification Test (SVT) series was formulated to evaluate the effectiveness of GPHS plutonia containment after atmospheric reentry and Earth impact. The first two reports (covering SVT-1 through SVT-10) described the results of flat, side-on, and angular module impacts against steel targets at 54 m/s. This report describes flat-on module impacts against concrete and granite targets, at velocities equivalent to or higher than previous SVTs

  16. Revisiting the Global Software Engineering Terminology

    DEFF Research Database (Denmark)

    Tell, Paolo; Giuffrida, Rosalba; Shah, Hina

    2013-01-01

    Even though Global Software Engineering (GSE) has been a research topic of interest for many years, some of its ground terminology is still lacking a unified, coherent, and shared definition and/or classification. The purpose of this report is to collect, outline, and relate several fundamental...

  17. Software Verification and Validation Test Report for the HEPA filter Differential Pressure Fan Interlock System

    International Nuclear Information System (INIS)

    ERMI, A.M.

    2000-01-01

The HEPA Filter Differential Pressure Fan Interlock System PLC ladder logic software was tested using a Software Verification and Validation (V&V) Test Plan as required by the ''Computer Software Quality Assurance Requirements''. The purpose of this document is to report on the results of the software qualification

  18. KASS v.2.2. scheduling software for construction

    OpenAIRE

    Krzemiński Michał

    2016-01-01

The paper presents the fourth version of a specialist scheduling software tool, KASS v.2.2 (Algorithm Scheduling Krzeminski System). The KASS software is designed for construction scheduling, especially for flow shop models. The program is dedicated strictly to the purposes of construction: in contrast to other programs used for tasks of this type, its operational research criteria were designed specifically with construction works and the specificity of the building process in mind.
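    The abstract does not state which scheduling algorithm KASS uses internally; purely as an illustration of flow-shop scheduling, the Python sketch below orders a hypothetical two-machine (two-crew) job set by Johnson's rule and computes the resulting makespan:

```python
# Not KASS itself: an illustrative two-machine flow-shop example. Jobs are
# ordered by Johnson's rule and the makespan of the sequence is computed.
# The durations are hypothetical (e.g. days of work per section for two crews).
def johnson_order(jobs):
    """jobs: list of (name, time_on_machine1, time_on_machine2)."""
    first = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    last = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: j[2], reverse=True)
    return first + last

def makespan(sequence):
    end1 = end2 = 0
    for _, t1, t2 in sequence:
        end1 += t1                      # machine 1 works without idle time
        end2 = max(end2, end1) + t2     # machine 2 may wait for the job to arrive
    return end2

jobs = [("A", 3, 6), ("B", 5, 2), ("C", 1, 2), ("D", 6, 6), ("E", 7, 5)]
seq = johnson_order(jobs)
print([j[0] for j in seq], "makespan =", makespan(seq))   # ['C','A','D','E','B'] makespan = 24
```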

  19. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

In this paper we present a prototype software tool that has been developed to analyse the structural model of automated systems in order to identify redundant information, which is then utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  20. Software engineering with application-specific languages

    Science.gov (United States)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.
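    The paper does not reproduce any of the Space Shuttle ASLs; the toy example below invents a two-line limit-monitoring "language" and a small Python interpreter for it, only to illustrate how an ASL lets application experts state domain logic without writing general-purpose code:

```python
# Toy illustration of the ASL idea (the language shown is invented, not one of
# the Space Shuttle ASLs): a tiny limit-monitoring "language" plus interpreter.
PROGRAM = """
limit cabin_pressure  min 13.9 max 15.2
limit fuel_cell_temp  min 60.0 max 90.0
"""

def parse(program):
    rules = []
    for line in program.strip().splitlines():
        _, name, _, lo, _, hi = line.split()     # "limit <name> min <lo> max <hi>"
        rules.append((name, float(lo), float(hi)))
    return rules

def check(rules, readings):
    for name, lo, hi in rules:
        value = readings[name]
        status = "OK" if lo <= value <= hi else "OUT OF LIMITS"
        print(f"{name}: {value} ({status})")

check(parse(PROGRAM), {"cabin_pressure": 14.7, "fuel_cell_temp": 95.0})
```

    The point of the approach is that the two-line program above is the artifact the domain expert writes and reviews, while the general-purpose interpreter behind it is written once and reused across projects.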