WorldWideScience

Sample records for process automation software

  1. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  2. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks have led to further efforts towards improving the existing automation-based software production methodology. A large number of control system upgrades were successfully performed for th...

  3. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  4. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  5. CATS, continuous automated testing of seismological, hydroacoustic, and infrasound (SHI) processing software.

    Science.gov (United States)

    Brouwer, Albert; Brown, David; Tomuta, Elena

    2017-04-01

    To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.
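
    The abstract describes commit-triggered test scheduling but not CATS's internals. Purely as a minimal illustrative sketch of that scheduling idea, in Python, assuming a local clone of a repository under configuration control and a hypothetical build-and-test entry point (all paths, branch names, and commands are invented for illustration):

      import subprocess, time

      REPO = "/srv/cats/pipeline-src"   # hypothetical clone of the pipeline sources
      TEST_CMD = ["make", "test"]       # hypothetical build-and-test entry point

      def rev(ref):
          """Commit hash that the given ref points to in the checkout."""
          return subprocess.check_output(
              ["git", "-C", REPO, "rev-parse", ref], text=True).strip()

      last_tested = rev("HEAD")
      while True:
          time.sleep(60)                # poll the repository once a minute
          subprocess.run(["git", "-C", REPO, "fetch"], check=True)
          newest = rev("origin/master") # hypothetical branch under configuration control
          if newest != last_tested:     # a commit was made: schedule a test job
              subprocess.run(["git", "-C", REPO, "merge", "--ff-only", newest], check=True)
              ok = subprocess.run(TEST_CMD, cwd=REPO).returncode == 0
              print(f"{newest[:8]}: {'pass' if ok else 'REGRESSION'}")
              last_tested = newest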

  6. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  7. WISE: Automated support for software project management and measurement. M.S. Thesis

    Science.gov (United States)

    Ramakrishnan, Sudhakar

    1995-01-01

    One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of the software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication between users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.

  8. RISK MANAGEMENT AUTOMATION OF SOFTWARE PROJECTS BASED ON FUZZY INFERENCE

    Directory of Open Access Journals (Sweden)

    T. M. Zubkova

    2015-09-01

    The suitability of an intelligent method for risk management of software projects is shown, based on a review of existing fuzzy inference algorithms in the field of applied problems. Information sources in the management of software projects are analyzed, and major and minor risks are highlighted. The most critical parameters have been singled out, making it possible to estimate the occurrence of adverse situations (project duration, the frequency of changes in customer requirements, work deadlines, the developers' experience with such projects, and others). A method of qualitative fuzzy description based on fuzzy logic has been developed for the analysis of these parameters. Evaluation of possible situations and formation of the knowledge base rely on a survey of experts. The main limitations of existing automated systems have been identified in relation to their applicability to risk management in software design. This theoretical research set the stage for a software system that makes it possible to automate the risk management process for software projects. The developed software system automates the process of fuzzy inference in the following stages: formation of the rule base of the fuzzy inference system, fuzzification of input variables, aggregation of sub-conditions, activation and accumulation of conclusions for fuzzy production rules, and defuzzification of output variables. The result of automating the risk management process in software design is a quantitative and qualitative assessment of risks and expert advice for minimizing them. The practical significance of the work lies in the fact that implementation of the developed automated system makes performance improvement of software projects possible.
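
    The stages just listed are those of a classic Mamdani-style fuzzy inference cycle. The following Python sketch illustrates those stages only, with an invented two-rule base and hypothetical membership functions and inputs; the paper's actual rule base is derived from expert surveys:

      import numpy as np

      # Output universe for the variable "risk level" (0..10)
      risk = np.linspace(0, 10, 101)

      def tri(x, a, b, c):
          """Triangular membership function on points a <= b <= c."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      # Fuzzification: map crisp inputs to membership degrees
      duration_months = 14.0        # hypothetical project duration
      changes_per_month = 3.0       # hypothetical requirements-change frequency
      long_project = tri(duration_months, 6, 18, 30)
      volatile_reqs = tri(changes_per_month, 1, 4, 8)

      # Aggregation of sub-conditions (AND realized as min)
      # Rule 1: IF project is long AND requirements are volatile THEN risk is high
      w1 = min(long_project, volatile_reqs)
      # Rule 2: IF requirements are volatile THEN risk is medium
      w2 = volatile_reqs

      # Activation: clip each rule's conclusion at its firing strength
      high = np.minimum(tri(risk, 5, 8, 10), w1)
      medium = np.minimum(tri(risk, 2, 5, 8), w2)

      # Accumulation: combine activated conclusions (OR realized as max)
      combined = np.maximum(high, medium)

      # Defuzzification: centroid of the accumulated fuzzy set
      print(f"risk level: {(risk * combined).sum() / combined.sum():.2f} / 10")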

  9. Software for an automated processing system for radioisotope information from multichannel radiodiagnostic instruments

    International Nuclear Information System (INIS)

    Zelenin, P.E.; Meier, V.P.

    1985-01-01

    The SAORI-01 system for the automated processing of radioisotope information is designed for the collection, processing, and representation of information coming from gamma cameras and multichannel radiodiagnostic instruments (MRI), and is oriented primarily toward the radiodiagnostic laboratories of major multidisciplinary hospitals and scientific research institutes. The functional characteristics of the basic software are discussed; it permits performance of the following functions: collection of information from the MRI; processing and representation of recorded information; storage of patient files on magnetic media; and writing of special processing programs in the FORTRAN and BASIC high-level languages

  10. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing

    Science.gov (United States)

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  11. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  12. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both procedures are performed manually for every sample. Hence, an automatic sample changer system (ASC), consisting of hardware and software, was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
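
    The actual controller is implemented in LabVIEW, as the abstract notes. Purely to illustrate the measure-then-advance control loop, here is a rough Python sketch; the changer command and the GammaVision invocation below are hypothetical stand-ins, not the real interfaces:

      import subprocess, time

      N_SAMPLES = 30           # changer capacity reported in the paper
      COUNT_TIME_S = 3600      # one-hour counting time per sample

      def move_changer_to(position):
          # Hypothetical stand-in for the LabVIEW routine that drives the
          # changer hardware to the requested sample position.
          print(f"moving changer to position {position}")

      for position in range(1, N_SAMPLES + 1):
          move_changer_to(position)
          time.sleep(1)        # let the changer settle before counting starts
          # Hypothetical command line; the real system invokes the GammaVision
          # measurement software through its automation interface.
          subprocess.run(["gammavision_measure", f"--sample={position}",
                          f"--live-time={COUNT_TIME_S}"], check=False)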

  13. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    A presentation is made of the content of program software for the automated processing of gravity and magnetic survey data on the small Mir-1 and Mir-2 computers, as developed by the Voronezh geophysical expedition.

  14. Geocoding uncertainty analysis for the automated processing of Sentinel-1 data using Sentinel-1 Toolbox software

    Science.gov (United States)

    Dostálová, Alena; Naeimi, Vahid; Wagner, Wolfgang; Elefante, Stefano; Cao, Senmao; Persson, Henrik

    2016-10-01

    One of the major advantages of the Sentinel-1 data is its capability to provide very high spatio-temporal coverage, allowing the mapping of large areas as well as the creation of dense time series of Sentinel-1 acquisitions. The SGRT software developed at TU Wien aims at automated processing of Sentinel-1 data for global and regional products. The first step of the processing consists of geocoding the Sentinel-1 data with the help of the S1TBX software and resampling them to a common grid. These resampled images serve as the input for the product derivation. Thus, it is very important to select the most reliable processing settings and to assess the geocoding uncertainty for both backscatter and projected local incidence angle images. Within this study, a selection of Sentinel-1 acquisitions over three test areas in Europe was processed manually in the S1TBX software, testing multiple software versions, processing settings, and digital elevation models (DEM), and the accuracy of the resulting geocoded images was assessed. Secondly, all available Sentinel-1 data over the areas were processed using the selected settings, and a detailed quality check was performed. Overall, a strong influence of the DEM used on geocoding quality was confirmed, with differences of up to 80 m in areas with larger terrain variations. In flat areas, the geocoding accuracy of the backscatter images was good overall, with observed shifts between 0 and 30 m. Larger systematic shifts were identified in the case of the projected local incidence angle images. These results encourage the automated processing of large volumes of Sentinel-1 data.

  15. Software maintenance and evolution and automated software engineering

    NARCIS (Netherlands)

    Carver, Jeffrey C.; Serebrenik, Alexander

    2018-01-01

    This issue's column reports on the 33rd International Conference on Software Maintenance and Evolution and 32nd International Conference on Automated Software Engineering. Topics include flaky tests, technical debt, QA bots, and regular expressions.

  16. Exploration on Automated Software Requirement Document Readability Approaches

    OpenAIRE

    Chen, Mingda; He, Yao

    2017-01-01

    Context. The requirements analysis phase, as the very beginning of the software development process, has been identified as a quite important phase in the software development lifecycle. The Software Requirement Specification (SRS) is the output of the requirements analysis phase, and its quality factors play an important role in evaluation work. Readability is a quite important SRS quality factor, but there are few available automated approaches for readability measurement, because of the tight depend...

  17. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.
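
    MAS itself consists of PERL scripts, as the abstract notes. The Python sketch below, with invented program names and arguments, only illustrates the "link, sequence, and execute" pattern of handing data products from one maneuver-related program to the next:

      import subprocess, sys

      # Hypothetical stand-ins for the maneuver-related application programs
      # that MAS links, sequences, and executes in one pass.
      PIPELINE = [
          ["design_maneuver", "--trajectory", "prime_mission.traj"],
          ["build_command_sequence", "--in", "maneuver.out"],
          ["predict_performance", "--in", "maneuver.out", "--report", "report.txt"],
      ]

      for step in PIPELINE:
          # Each stage consumes the data products of the previous one; stop
          # the hand-off chain as soon as any stage fails.
          if subprocess.run(step).returncode != 0:
              sys.exit(f"stage {step[0]} failed")
      print("maneuver design products ready for review")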

  18. Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing

    Science.gov (United States)

    Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.

    2010-01-01

    The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA's history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development
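
    As a toy illustration of the data-driven configuration idea (not Orion's actual schema; the activity names, transition events, parameters, and file format below are all invented), a sequence definition can live in data and be reloaded without recompiling the software:

      import json

      # Hypothetical sequence definition of the kind such a database tool might
      # export; editing this data reconfigures the automated sequencing without
      # recompiling the flight software.
      SEQUENCE = json.loads("""
      {
        "sequence": [
          {"activity": "coast", "transition_on": "entry_interface",
           "params": {"attitude_mode": "inertial"}},
          {"activity": "entry_guidance", "transition_on": "drogue_deploy",
           "params": {"bank_limit_deg": 60.0}}
        ]
      }
      """)

      for step in SEQUENCE["sequence"]:
          # The software looks activities and parameters up from data rather
          # than from compiled-in constants.
          print(f"{step['activity']}: transition on {step['transition_on']}, "
                f"params {step['params']}")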

  19. Early Validation of Automation Plant Control Software using Simulation Based on Assumption Modeling and Validation Use Cases

    Directory of Open Access Journals (Sweden)

    Veronika Brandstetter

    2015-10-01

    In automation plants, technical processes must be conducted in a way that products, substances, or services are produced reliably, with sufficient quality, and with minimal strain on resources. A key driver in conducting these processes is the automation plant's control software, which controls the technical plant components and thereby affects the physical, chemical, and mechanical processes that take place in automation plants. To this end, the control software of an automation plant must adhere to strict process requirements arising from the technical processes and from the physical plant design. Currently, validation of the control software often starts late in the engineering process, once the automation plant is almost completely constructed. However, as widely acknowledged, the later the control software of the automation plant is validated, the higher the effort for correcting revealed defects, which can lead to serious budget overruns and project delays. In this article we propose an approach that allows the early validation of automation control software against the technical plant processes and assumptions about the physical plant design by means of simulation. We demonstrate the application of our approach on an actual plant project from the automation industry and present its technical implementation

  20. A control system verifier using automated reasoning software

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1985-08-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logical axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions of the combined system

  21. FY1995 study of low power LSI design automation software with parallel processing; 1995 nendo heiretsu shori wo katsuyoshita shodenryoku LSI muke sekkei jidoka software no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The need for low-power LSIs has increased rapidly in recent years. For low-power LSI development, not only new circuit technologies but also new design automation tools supporting those technologies are indispensable. The purpose of this project is to develop new design automation software able to design digital LSIs with much lower power consumption than conventional CMOS LSIs. New design automation software for very-low-power LSIs has been developed targeting the pass-transistor logic SPL, a dedicated low-power circuit technology. The software includes a logic synthesis function for pass-transistor-based macrocells and a macrocell placement function. Several new algorithms have been developed for the software, e.g. BDD construction. Some of them are designed and implemented for parallel processing in order to reduce the processing time. The logic synthesis function was tested on a set of benchmarks and finally applied to a low-power CPU design. The designed 8-bit CPU was fully compatible with the Zilog Z-80. The power dissipation of the CPU was compared with that of a commercial CMOS Z-80: the new CPU reduced power by up to 82%. In addition, the parallel-processing speedup was measured for the macrocell placement function, where a 34-fold speedup was realized. (NEDO)

  22. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management.

  23. Hardware and software for automating the process of studying high-speed gas flows in wind tunnels of short-term action

    Science.gov (United States)

    Yakovlev, V. V.; Shakirov, S. R.; Gilyov, V. M.; Shpak, S. I.

    2017-10-01

    In this paper we propose an approach to constructing automation systems for aerodynamic experiments on the basis of modern, domestically developed hardware and software. The structure of a universal control and data collection system for performing experiments in wind tunnels of continuous, periodic, or short-term action is proposed. The hardware and software tools developed at ICT SB RAS and ITAM SB RAS, as well as subsystems based on them, can be widely applied to scientific and experimental installations and to the automation of technological processes in production.

  24. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  25. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  26. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  27. Factors to Consider When Implementing Automated Software Testing

    Science.gov (United States)

    2016-11-10

    development and integration is a continuous process throughout the acquisition life cycle. Automated Software Testing can improve testing capabilities... requires a lab, conference room, or both, and whether it should be located in-house or at an external facility. 2. Ensure space is adequate to support the team

  28. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and

  29. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact
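
    HAPPE itself is distributed as MATLAB software built on EEGLAB. Purely to illustrate the filter / artifact-rejection / re-reference stages named above, here is a rough Python sketch using the MNE library; the file name and excluded component index are hypothetical, and plain ICA stands in for HAPPE's more elaborate artifact-removal steps:

      import mne

      # Load a raw EEG recording (file name hypothetical)
      raw = mne.io.read_raw_edf("infant_rest.edf", preload=True)

      # Filtering: band-pass to the range kept for later analyses
      raw.filter(l_freq=1.0, h_freq=100.0)

      # Artifact rejection: plain ICA stands in here for HAPPE's more
      # elaborate artifact-removal procedure
      ica = mne.preprocessing.ICA(n_components=20, random_state=97)
      ica.fit(raw)
      ica.exclude = [0]             # hypothetical index of an artifact component
      ica.apply(raw)

      # Re-referencing: average reference over scalp electrodes
      raw.set_eeg_reference("average")

      # The cleaned recording is now suitable for time-frequency-domain analyses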

  30. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  31. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    Energy Technology Data Exchange (ETDEWEB)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Meinel, Felix G.; Geyer, Lucas L. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Apfaltrer, Paul [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Center Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany); Canstein, Christian [Siemens Medical Solutions USA, Inc., Malvern, PA (United States); De Cecco, Carlo Nicola [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome 'Sapienza' - Polo Pontino, Department of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2014-02-15

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). A mean of 3.9 ± 1.9 manual border edits were performed to optimize the automated process. The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  32. Lifecycle, Iteration, and Process Automation with SMS Gateway

    Directory of Open Access Journals (Sweden)

    Fenny Fenny

    2015-12-01

    Producing a better-quality software system requires an understanding of software quality indicators, through defect detection and automated testing. This paper aims to improve the design and automated testing process for an engine water pump in a drinking water plant. It proposes how software developers can improve the maintainability and reliability of an automated testing system that reports abnormal states when an error occurs on the machine. The method uses literature to explain best practices, together with a case study of a drinking water plant. Furthermore, this paper is expected to provide insights into efforts to better handle errors and to perform automated testing and monitoring on a machine.

  33. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    International Nuclear Information System (INIS)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip; Meinel, Felix G.; Geyer, Lucas L.; Schoepf, U.J.; Apfaltrer, Paul; Canstein, Christian; De Cecco, Carlo Nicola

    2014-01-01

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV_M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV_A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV_A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV_A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  34. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  35. A Software Architecture for Simulation Support in Building Automation

    Directory of Open Access Journals (Sweden)

    Sergio Leal

    2014-07-01

    Building automation integrates the active components in a building and, thus, has to connect components of different industries. The goal is to provide reliable and efficient operation. This paper describes how simulation can support building automation and how the deployment process of simulation-assisted building control systems can be structured. We look at the process as a whole and map it to a set of formally described workflows that can partly be automated. A workbench environment supports the process execution by means of improved planning, collaboration, and deployment. This framework allows the integration of existing tools as well as manual tasks, and is therefore far more intricate than regular software deployment tools. The complex environment of building commissioning requires expertise in different domains, especially lighting; heating, ventilation, and air conditioning; measurement and control technology; and energy efficiency. We therefore present a framework for building commissioning and describe a deployment process capable of supporting the various phases of this approach.

  36. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In this age, when celerity is expected of all sectors, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries, and conservation of resources in multiple sectors. To implement RPA, many software ...

  37. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, B.

    1998-01-01

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability of reasoning about the quality of processes and

  38. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Demeyer, S.; Bosch, H.G.P.; Tekinerdogan, B.

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability of reasoning about the quality of processes and

  39. Automated Cryocooler Monitor and Control System Software

    Science.gov (United States)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains algorithms necessary to convert non-linear output voltages from the cryogenic diode-type thermometers and vacuum pressure and helium pressure sensors, to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitor, control, and configuration. No client software from the external user is required.

  40. SAGA: A project to automate the management of software production systems

    Science.gov (United States)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  41. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Background: Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results: In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules implementing the most recently proposed algorithms, as well as powerful manual modules, for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means), and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion: Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  42. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
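
    Automics implements these statistical methods internally. Purely as a rough illustration of the same analysis chain (PCA reduction, k-means clustering, PLS-DA/KNN/SVM classification) on a synthetic data matrix, using scikit-learn in Python; the data, shapes, and parameter choices below are invented:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.svm import SVC

      # Hypothetical data matrix: 40 spectra x 200 binned intensities, with a
      # binary class label (e.g. control vs. type 2 diabetes)
      rng = np.random.default_rng(0)
      X = rng.normal(size=(40, 200))
      y = rng.integers(0, 2, size=40)

      # Data reduction: PCA down to a few latent components
      scores = PCA(n_components=5).fit_transform(X)

      # Unsupervised clustering: k-means on the PCA scores
      clusters = KMeans(n_clusters=2, n_init=10).fit_predict(scores)

      # Supervised regression/classification: PLS-DA (PLS regression against
      # class labels), KNN, and SVM, mirroring the package's method list
      plsda = PLSRegression(n_components=2).fit(X, y)
      knn = KNeighborsClassifier(n_neighbors=3).fit(scores, y)
      svm = SVC(kernel="linear").fit(scores, y)
      print("KNN training accuracy:", knn.score(scores, y))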

  43. A software application for the processing of students results | Ukem ...

    African Journals Online (AJOL)

    A software application for the processing of students results. ... In this work, a computer software application was developed to facilitate the automated processing of the results. ...

  44. Enhancing Seismic Calibration Research Through Software Automation and Scientific Information Management

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S D; Dodge, D A; Ganzberger, M D; Harris, D B; Hauk, T F

    2009-07-07

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. In contrast to previous years, software development work this past year has emphasized development of automation at the data ingestion level. This change reflects a gradually-changing emphasis in our program from processing a few large data sets that result in a single integrated delivery, to processing many different data sets from a variety of sources. The increase in the number of sources has resulted in a large increase in the amount of metadata relative to the final volume of research products. Software developed this year addresses the problems of: (1) Efficient metadata ingestion and conflict resolution; (2) Automated ingestion of bulletin information; (3) Automated ingestion of waveform information from global data centers; and (4) Site Metadata and Response transformation required for certain products. This year, we also made a significant step forward in meeting a long-standing goal of developing and using a waveform correlation framework. Our objective for such a framework is to extract additional calibration data (e.g. mining blasts) and to study the extent to which correlated seismicity can be found in global and regional scale environments.
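
    The abstract does not detail the correlation framework itself. As a minimal sketch of the underlying idea only (matched filtering of a trace against a known-event template by normalized cross-correlation), in Python with an illustrative threshold and synthetic data:

      import numpy as np
      from scipy.signal import correlate

      def matches_template(trace, template, threshold=0.7):
          """True if the normalized cross-correlation between a waveform trace
          and an event template exceeds the threshold at any lag."""
          trace = trace - trace.mean()
          template = template - template.mean()
          cc = correlate(trace, template, mode="valid")
          # Sliding L2 norm of the trace over each template-length window
          window_norm = np.sqrt(np.convolve(trace ** 2,
                                            np.ones(len(template)), mode="valid"))
          return np.max(cc / (window_norm * np.linalg.norm(template))) >= threshold

      # Example: recover a template buried in noise
      rng = np.random.default_rng(1)
      template = np.sin(np.linspace(0, 20, 200))
      trace = rng.normal(scale=0.1, size=2000)
      trace[500:700] += template
      print(matches_template(trace, template))   # True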

  45. Improvement of Computer Software Quality through Software Automated Tools.

    Science.gov (United States)

    1986-08-30

    information that are returned from the tools to the human user, and the forms in which these outputs are presented. ... STAGE OF DEVELOPMENT: What... AUTOMATED SOFTWARE TOOL MONITORING SYSTEM, APPENDIX 2: INTRODUCTION. This document and the Automated Software Tool Monitoring Program (Appendix 1) are... Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types

  46. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuing both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integrating their collections of provided BSs and the corresponding software services. The topic of service interoperability, for which this article introduces a framework, is therefore gaining momentum in research for supporting CNs. The framework contributes to the generation of formal, machine-readable specifications of business processes, aimed at providing the unambiguous definitions needed for developing their equivalent software services. It provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of service behaviour and the automated matchmaking of desired service behaviour, as specified by users, against existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to the service discovery and service integration aspects.
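
    The article's selection mechanism is not reproduced here; a minimal sketch of quality-aspect-based service selection, with illustrative service names, quality aspects, and weights, might look as follows.

        # Sketch of automated service selection by weighted quality aspects,
        # in the spirit of the integration framework described above.
        # Candidate services, aspects, and weights are illustrative assumptions.
        candidates = {
            "ShippingQuoteA": {"availability": 0.99, "response_time": 0.80, "cost": 0.60},
            "ShippingQuoteB": {"availability": 0.95, "response_time": 0.95, "cost": 0.85},
        }
        weights = {"availability": 0.5, "response_time": 0.3, "cost": 0.2}

        def score(qos):
            # Weighted sum over normalized quality scores (higher is better).
            return sum(weights[k] * qos[k] for k in weights)

        best = max(candidates, key=lambda name: score(candidates[name]))
        print(best, round(score(candidates[best]), 3))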

  7. Launch Control System Software Development System Automation Testing

    Science.gov (United States)

    Hwang, Andrew

    2017-01-01

    The Spaceport Command and Control System (SCCS) is the National Aeronautics and Space Administration's (NASA) launch control system for the Orion capsule and Space Launch System, the next-generation crewed rocket currently in development. This system requires high-quality testing that will measure and test the capabilities of the system. For the past two years, the Exploration and Operations Division at Kennedy Space Center (KSC) has assigned a group of interns and full-time engineers to develop automated tests to save the project time and money. The team worked on automating the testing process for the SCCS GUI, which uses simulated data streamed from the testing servers to produce data, plots, statuses, etc. in the GUI. The software used to develop the automated tests included an automated testing framework and an automation library. The automated testing framework has a tabular-style syntax, meaning a line of code must have the appropriate number of tabs for the line to function as intended. The automation library contains functionality to automate anything that appears on a desired screen, using image recognition software to detect and control GUI components. A test file has several sections: the header section contains either paths to custom resources or the names of libraries being used; the data section contains any data values created strictly for the current testing file; the body section holds the tests that are being run; and the function section can include any number of functions that may be used by the current testing file or any other file that resources it. The header (resources) and body sections are required for all test files; the data and function sections can be left empty if the data values and functions being used come from a resourced library or another file. To help equip the automation team with better tools, the Project Lead of the Automated Testing Team, Jason Kapusta, assigned the task of installing and training an optical character recognition (OCR) tool.
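
    The abstract names neither the framework nor the automation library; the sketch below illustrates the image-recognition approach it describes, assuming the pyautogui library and a hypothetical reference screenshot start_button.png.

        import pyautogui

        # Locate a GUI component from a reference screenshot and drive it the
        # way a tester would. Recent pyautogui raises ImageNotFoundException
        # when the image is absent; older versions return None instead.
        try:
            point = pyautogui.locateCenterOnScreen("start_button.png")
        except pyautogui.ImageNotFoundException:
            point = None
        if point is None:
            raise RuntimeError("GUI component not found on screen")
        pyautogui.click(point)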

  8. Automated Software Testing : A Study of the State of Practice

    OpenAIRE

    Rafi, Dudekula Mohammad; Reddy, Kiran Moses Katam

    2012-01-01

    Context: Software testing is expensive, labor intensive, and consumes a lot of time in a software development life cycle. There has always been a need in software testing to decrease testing time. This has also led to a focus on Automated Software Testing (AST), because with automated testing and specific tools this effort can be dramatically reduced and the costs related to testing can decrease [11]. Manual Testing (MT) requires a lot of effort and hard work, if we measure in terms of person ...

  9. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    Science.gov (United States)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

    The market is becoming more and more competitive: many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal with software, whether developing or acquiring it, carefully. One perspective that can help organizations take advantage of software, supporting the business effectively, is to invest in the organization's software processes. This paper presents an approach to evaluating and improving the process assets of software organizations, based on internationally well-known standards and process models. The approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of using the approach and its results.

  10. Spaceport Command and Control System Automated Verification Software Development

    Science.gov (United States)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  11. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2000-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP). This CSWD describes hardware and PFP-developed software for control of stabilization furnaces. The Honeywell software can generate configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, the Thermal Stabilization Cognizant Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell system.

  12. Automated real-time software development

    Science.gov (United States)

    Jones, Denise R.; Walker, Carrie K.; Turkovich, John J.

    1993-01-01

    A Computer-Aided Software Engineering (CASE) system has been developed at the Charles Stark Draper Laboratory (CSDL) under the direction of the NASA Langley Research Center. The CSDL CASE tool provides an automated method of generating source code and hard copy documentation from functional application engineering specifications. The goal is to significantly reduce the cost of developing and maintaining real-time scientific and engineering software while increasing system reliability. This paper describes CSDL CASE and discusses demonstrations that used the tool to automatically generate real-time application code.

  13. Three-dimensional rotation electron diffraction: software RED for automated data collection and data processing.

    Science.gov (United States)

    Wan, Wei; Sun, Junliang; Su, Jie; Hovmöller, Sven; Zou, Xiaodong

    2013-12-01

    Implementation of a computer program package for automated collection and processing of rotation electron diffraction (RED) data is described. The software package contains two computer programs: RED data collection and RED data processing. The RED data collection program controls the transmission electron microscope and the camera. Electron beam tilts at a fine step (0.05-0.20°) are combined with goniometer tilts at a coarse step (2.0-3.0°) around a common tilt axis, which allows a fine relative tilt to be achieved between the electron beam and the crystal in a large tilt range. An electron diffraction (ED) frame is collected at each combination of beam tilt and goniometer tilt. The RED data processing program processes three-dimensional ED data generated by the RED data collection program or by other approaches. It includes shift correction of the ED frames, peak hunting for diffraction spots in individual ED frames and identification of these diffraction spots as reflections in three dimensions. Unit-cell parameters are determined from the positions of reflections in three-dimensional reciprocal space. All reflections are indexed, and finally a list with hkl indices and intensities is output. The data processing program also includes a visualizer to view and analyse three-dimensional reciprocal lattices reconstructed from the ED frames. Details of the implementation are described. Data collection and data processing with the software RED are demonstrated using a calcined zeolite sample, silicalite-1. The structure of the calcined silicalite-1, with 72 unique atoms, could be solved from the RED data by routine direct methods.
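
    The RED programs themselves are not shown here; the tilt scheme they implement can be sketched as follows, with step sizes chosen from the ranges quoted above and an assumed ±40° goniometer range.

        # Sketch of the RED tilt scheme: fine beam tilts nested inside coarse
        # goniometer tilts around a common axis, giving a fine effective tilt
        # step over a large total range. One ED frame is collected per value.
        import numpy as np

        goniometer_step = 2.0   # degrees, coarse (2.0-3.0 in the abstract)
        beam_step = 0.10        # degrees, fine (0.05-0.20 in the abstract)

        frames = []
        for gon in np.arange(-40.0, 40.0 + 1e-9, goniometer_step):
            for beam in np.arange(0.0, goniometer_step, beam_step):
                frames.append(gon + beam)

        print(len(frames), frames[:4])   # 820 frames at 0.1 degree effective steps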

  14. Software engineering and data management for automated payload experiment tool

    Science.gov (United States)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by the University of Alabama in Huntsville (UAH) and provide versions of the software in Macintosh- and Windows-compatible formats. Appendix 1, the Science Requirements Document (SRD) Users Manual, is attached.

  15. Development of an automated asbestos counting software based on fluorescence microscopy.

    Science.gov (United States)

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While the full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
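
    The published software is not reproduced here; a minimal sketch of the underlying fiber-recognition idea, thresholding a field of view and keeping elongated objects, assuming OpenCV and a hypothetical image file, is:

        import cv2

        img = cv2.imread("fm_field.png", cv2.IMREAD_GRAYSCALE)  # hypothetical field of view
        if img is None:
            raise SystemExit("image not found")
        _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

        fibers = 0
        for c in contours:
            (_, _), (w, h), _ = cv2.minAreaRect(c)   # oriented bounding box
            if min(w, h) > 0 and max(w, h) / min(w, h) >= 3.0:
                fibers += 1                          # elongated object: count as a fiber
        print(fibers)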

  16. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    In production processes the use of image processing systems is widespread, and hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task based on the combined features. This software is one crucial element in the automation of a manually operated production process.
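
    The defect-classification step maps naturally onto a standard SVM workflow; the sketch below uses scikit-learn with synthetic feature vectors standing in for the article's combined image features.

        # Sketch of defect classification from combined image features with an
        # SVM. Feature vectors and labels are synthetic placeholders; sklearn's
        # SVC stands in for the actual tool described in the article.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 12))           # 12 combined features per surface patch
        y = (X[:, 0] + X[:, 3] > 0).astype(int)  # 0 = good surface, 1 = defect (toy rule)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = SVC(kernel="rbf").fit(X_tr, y_tr)
        print(clf.score(X_te, y_te))             # held-out classification accuracy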

  17. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)

    2015-04-29

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually. Samples were recorded manually in a logbook and given ID numbers; all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. The sample registration software was developed as part of the IAEA/CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective method to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  18. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Science.gov (United States)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures, including sample registration, were performed manually. Samples were recorded manually in a logbook and given ID numbers; all samples, standards, SRMs and blanks were then recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. The sample registration software was developed as part of the IAEA/CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective method to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.
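
    A minimal sketch of the registration step both records describe, auto-generating a sample code per batch entry and writing a printable form, is given below. The real tool is written in LabVIEW; this Python rendering, the ID format, and the file name are illustrative assumptions.

        import csv
        from datetime import date

        def register_batch(batch_no, samples):
            # Auto-generate one sample code per entry in the batch.
            records = []
            for i, name in enumerate(samples, start=1):
                code = f"NAA{date.today():%y}-{batch_no:03d}-{i:03d}"
                records.append({"code": code, "name": name, "batch": batch_no})
            return records

        batch = register_batch(7, ["soil-A", "soil-B", "SRM-1633c", "blank"])
        # Write a printable registration form; the same records can be passed
        # on to the sample analysis program.
        with open("registration_form.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["code", "name", "batch"])
            writer.writeheader()
            writer.writerows(batch)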

  19. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system, which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, pressures of schedules, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational, it should greatly reduce the time necessary to test POCC/MOC software deliveries.

  20. Diversity requirements for safety critical software-based automation systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1998-03-01

    System vendors nowadays propose software-based systems even for the most critical safety functions in nuclear power plants. Due to the nature and mechanisms of influence of software faults, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', various safety assessment methods and tools for software-based systems are developed and evaluated. This report first discusses the (common cause) failure mechanisms in software-based systems, then defines fault-tolerant system architectures to avoid common cause failures, then studies the various alternatives for applying diversity and their influence on system reliability. Finally, a method for the assessment of diversity is described. Other recently published reports in the OHA report series handle the statistical reliability assessment of software-based systems (STUK-YTO-TR 119), usage models in the reliability assessment of software-based systems (STUK-YTO-TR 128) and the handling of programmable automation in plant PSA studies (STUK-YTO-TR 129)

  1. Flexible software architecture for user-interface and machine control in laboratory automation.

    Science.gov (United States)

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  2. AUTOMATION OF CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2016-09-01

    We consider the automation of the business process of publishing scientific journals, with the Coordinating Centre of Scientific Journals' Publishing of the Odessa National Academy of Food Technologies (ONAFT) as the focal point. A complex of business-process models for publishing scientific journals is described. The organizational structure of the Coordinating Centre was analyzed and a model of it created. Process models were simulated in the eEPC and BPMN notations. Database design, creation of the file structure, and creation of the AIS interface were also carried out, and interaction with a webcam was implemented. From the justification of the feasibility of the software development and the performance results presented as a radar chart, it is safe to say that the automated mode is much more efficient than the manual mode. The developed software will accelerate the development of the scientific periodicals of ONAFT, which in turn will improve the academy's ratings at the global level and improve its image and credibility.

  3. An Automated Weather Research and Forecasting (WRF)-Based Nowcasting System: Software Description

    Science.gov (United States)

    2013-10-01

    A Web service/Web interface software package has been engineered to address the need for an automated means to run the Weather Research and Forecasting (WRF) model as a nowcasting system. By Stephen F. Kirby, Brian P. Reen, and Robert E. Dumais Jr. (Computational and Information Sciences).

  4. Honeywell modular automation system computer software documentation

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-21I

  5. PRISM Software: Processing and Review Interface for Strong‐Motion Data

    Science.gov (United States)

    Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter

    2017-01-01

    A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.
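
    PRISM's processing engine is organized as one module per processing step; the sketch below shows that pattern in miniature. The step names are illustrative, not PRISM's actual API, and Python is used for brevity although PRISM itself is coded in Java.

        # Each processing step is a small, pluggable module; the engine chains
        # them over a record.
        def demean(samples):
            mean = sum(samples) / len(samples)
            return [s - mean for s in samples]

        def counts_to_g(samples, factor=1.0 / 981.0):
            # e.g., convert cm/s^2 to units of g (981 cm/s^2 per g)
            return [s * factor for s in samples]

        PIPELINE = [demean, counts_to_g]

        def process(record):
            for step in PIPELINE:
                record = step(record)
            return record

        print(process([1.0, 2.0, 3.0]))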

  6. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort.

  7. Managing Automation: A Process, Not a Project.

    Science.gov (United States)

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  8. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies

  9. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    Science.gov (United States)

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.

  10. Progress on standardization and automation in software development on W7X

    International Nuclear Information System (INIS)

    Kühner, Georg; Bluhm, Torsten; Heimann, Peter; Hennig, Christine; Kroiss, Hugo; Krom, Jon; Laqua, Heike; Lewerentz, Marc; Maier, Josef; Schacht, Jörg; Spring, Anett; Werner, Andreas; Zilker, Manfred

    2012-01-01

    Highlights: ► For W7X software development the use of ISO/IEC15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction on priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X being subject to changes all along its projected lifetime the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC-12207 provide a guideline for structuring of development work and improving process and product quality. A considerable number of tools has emerged supporting and automating parts of development work. On W7X progress has been made during the last years in exploiting the benefit of automation and management during software development: –Continuous build, integration and automated test of software artefacts. ∘Syntax checks and code quality metrics. ∘Documentation generation. ∘Feedback for developers by temporal statistics. –Versioned repository for build products (libraries, executables). –Separate snapshot and release repositories and automatic deployment. –Semi-automatic provisioning of applications. –Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized interfaces providing the capability to construct arbitrary function aggregates for dedicated tests of quality attributes as availability, reliability

  11. Progress on standardization and automation in software development on W7X

    Energy Technology Data Exchange (ETDEWEB)

    Kuehner, Georg, E-mail: kuehner@ipp.mpg.de [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Bluhm, Torsten [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Heimann, Peter [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Hennig, Christine [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Kroiss, Hugo [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Krom, Jon; Laqua, Heike; Lewerentz, Marc [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Maier, Josef [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany); Schacht, Joerg; Spring, Anett; Werner, Andreas [Max-Planck-Institut fuer Plasmaphysik, Wendelsteinstrasse 1, D-17491 Greifswald (Germany); Zilker, Manfred [Max-Planck-Institut fuer Plasmaphysik, Boltzmannstrasse 2, D-85748 Garching (Germany)

    2012-12-15

    Highlights: ► For W7X software development the use of ISO/IEC15504-5 is further extended. ► The standard provides a basis to manage software multi-projects for a large system project. ► Adoption of a scrum-like management allows for quick reaction on priority changes. ► A high degree of software build automation allows for quick responses to user requests. ► It provides additional resources to concentrate work on product quality (ISO/IEC 25000). - Abstract: For a complex experiment like W7X being subject to changes all along its projected lifetime the advantages of a formalized software development method have already been stated. Quality standards like ISO/IEC-12207 provide a guideline for structuring of development work and improving process and product quality. A considerable number of tools has emerged supporting and automating parts of development work. On W7X progress has been made during the last years in exploiting the benefit of automation and management during software development: –Continuous build, integration and automated test of software artefacts. ∘Syntax checks and code quality metrics. ∘Documentation generation. ∘Feedback for developers by temporal statistics. –Versioned repository for build products (libraries, executables). –Separate snapshot and release repositories and automatic deployment. –Semi-automatic provisioning of applications. –Feedback from testers and feature requests by ticket system. This toolset is working efficiently and allows the team to concentrate on development. The activity there is presently focused on increasing the quality of the existing software to become a dependable product. Testing of single functions and qualities must be simplified. So a restructuring is underway which relies more on small, individually testable components with standardized interfaces providing the capability to construct arbitrary function aggregates for dedicated tests of quality attributes as availability, reliability

  12. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the analyzed samples.
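
    One of the steps named above, theoretical isotope distribution modelling, can be sketched in a few lines. The version below considers only carbon (13C natural abundance ≈ 1.07%) and an assumed peptide size, which is far simpler than Decon2LS's actual modelling.

        # Sketch of a theoretical isotope pattern from the carbon count of a
        # peptide, using the binomial distribution of 13C. A real deisotoper
        # fits such patterns against observed peaks.
        from math import comb

        P_C13 = 0.0107   # natural abundance of 13C

        def carbon_isotope_pattern(n_carbons, n_peaks=5):
            return [comb(n_carbons, k) * P_C13**k * (1 - P_C13)**(n_carbons - k)
                    for k in range(n_peaks)]

        pattern = carbon_isotope_pattern(n_carbons=60)   # ~60 C atoms: small peptide
        print([round(p, 3) for p in pattern])            # monoisotopic, +1, +2, ... peaks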

  13. Development of a fully automated software system for rapid analysis/processing of the falling weight deflectometer data.

    Science.gov (United States)

    2009-02-01

    The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects FWD data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully automated software system for rapid analysis/processing of the falling weight deflectometer (FWD) data.

  14. E-GRANTHALAYA: LIBRARY INFORMATION SCIENCE OPEN SOURCE AUTOMATION SOFTWARE: AN OVERVIEW

    OpenAIRE

    Umaiyorubagam, R.; JohnAnish, R; Jeyapragash, B

    2015-01-01

    The paper describes the online availability of free library software. The open source software falls into three categories: library automation software, digital library software, and integrated library packages. The paper discusses these aspects in detail.

  15. Development of Hardware and Software for Automated Ultrasonic Testing

    International Nuclear Information System (INIS)

    Choi, Sung Nam; Lee, Hee Jong; Yang, Seung Ok

    2012-01-01

    Nondestructive testing (NDT) for the construction and operation of NPPs plays an important role in confirming the integrity of the plants. In particular, automated ultrasonic testing (AUT) is one of the primary nondestructive examination methods for in-service inspection of welds in major NPP components. AUT is reliable because its data are saved and can be reviewed with other examiners. Korea Hydro and Nuclear Power-Central Research Institute (KHNP-CRI) has developed an automated ultrasonic testing (AUT) system based on a high-speed pulser-receiver. In combination with the designed software and hardware architecture, this new system permits user configurations for a wide range of user-specific applications through fully automated inspections, using compact portable systems with up to eight channels. This paper gives an overview of the hardware (H/W) and software (S/W) of the AUT system for inspecting welds in NPPs.

  16. Honeywell Modular Automation System Computer Software Documentation

    International Nuclear Information System (INIS)

    CUNNINGHAM, L.T.

    1999-01-01

    This document provides a Computer Software Documentation for a new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This system will be used to control new thermal stabilization furnaces in HA-211 and vertical denitration calciner in HC-230C-2

  17. AUTOMATION OF TRACEABILITY PROCESS AT GRAIN TERMINAL LLC “ UKRTRANSAGRO"

    Directory of Open Access Journals (Sweden)

    F. A. TRISHYN

    2017-07-01

    A positive trend of growth in both grain production and export is indicated: in the current marketing year, the export potential of the Ukrainian grain market is close to the record level. However, high positions in the rating of world exporters are achieved not only through high export potential, but also through higher quality and logistics. These factors depend directly on the quality of enterprise management and of all the processes occurring at the enterprise. One of the perspective ways of enterprise development is the implementation of a traceability system and the further automation of the traceability process. European integration laws oblige Ukrainian enterprises to have a traceability system. Traceability is the ability to follow the movement of a feed or food through specified stages of production, processing and distribution. The traceability process is managed by people, which implies a human factor. Automation will, to a greater extent, exclude the human factor, which means fewer errors in documentation and a faster grain transshipment process. Research work on the process was carried out at one of the most modern grain terminals, LLC “UkrTransAgro”, located in the Ukrainian water area of the Azov Sea (Mariupol, Ukraine). Characteristics of the terminal: capacity of simultaneous storage, 48,120 tons; acceptance of crops from transport, 4,500 tons/day; acceptance of crops from railway transport, 3,000 tons/day; transshipment capacity, up to 1.2 million tons per year; shipment to sea vessels, 7,000 tons/day. An analysis of the automation level of the grain terminal is carried out. The company uses software from 1C, «1C: Enterprise 8. Accounting for grain elevator, mill, and feed mill for Ukraine». This software is used for quantitative and qualitative registration at the elevator in accordance with industry guidelines and standards. The software product has many

  18. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  19. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, system automation was developed in order to provide an effective method to replace redundant manual data entries and produce a faster sample analysis and calculation process. This report explains the NAA process in Nuclear Malaysia and describes the automation development in detail, which includes sample registration software; an automatic sample changer system consisting of hardware and software; and sample analysis software. (author)

  20. Automation of Flight Software Regression Testing

    Science.gov (United States)

    Tashakkor, Scott B.

    2016-01-01

    NASA is developing the Space Launch System (SLS) to be a heavy-lift launch vehicle supporting human and scientific exploration beyond Earth orbit. SLS will have a common core stage, an upper stage, and different permutations of boosters and fairings to perform various crewed or cargo missions. Marshall Space Flight Center (MSFC) is writing the Flight Software (FSW) that will operate the SLS launch vehicle. The FSW is developed in an incremental manner based on "Agile" software techniques. As the FSW is incrementally developed, the functionality of the code needs to be tested continually to ensure that the integrity of the software is maintained. Manually testing the functionality of an ever-growing set of requirements and features is not an efficient solution, so testing needs to be done automatically to ensure it is comprehensive. To support test automation, a framework for a regression test harness has been developed and used on SLS FSW. The test harness provides a modular design approach that can compile or read in the required information specified by the developer of the test. The modularity provides independence between groups of tests and the ability to add and remove tests without disturbing others. This provides the SLS FSW team a time-saving feature that is essential to meeting SLS Program technical and programmatic requirements. During development of SLS FSW, this technique has proved to be a useful tool to ensure all requirements have been tested and that desired functionality is maintained as changes occur. It also provides a mechanism for developers to check the functionality of the code that they have developed. With this system, automation of regression testing is accomplished through a scheduling tool and/or commit hooks. Key advantages of this test harness capability include execution support for multiple independent test cases, the ability for developers to specify precisely what they are testing and how, and the ability to add and remove test cases without disturbing others.
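
    The FSW test harness itself is not public; the sketch below shows the generic shape of scheduled regression execution, discovering and running independent test modules and returning an exit code that a scheduler or commit hook can act on. The directory layout and naming pattern are assumptions.

        import sys
        import unittest

        # Discover every test module under tests/ and run the whole suite; the
        # exit code lets a scheduler or commit hook gate changes on the result.
        suite = unittest.TestLoader().discover("tests", pattern="test_*.py")
        result = unittest.TextTestRunner(verbosity=2).run(suite)
        sys.exit(0 if result.wasSuccessful() else 1)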

  1. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.

  2. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    This article discusses the current capabilities of automated processing of image data, using the PhotoScan software by Agisoft as an example. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or, more often, on UAVs) are used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied; they can create a model of a block in a local coordinate system or, using initial exterior orientation and measured control points, provide image georeference in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. Image matching algorithms are also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM, and a photorealistic solid model of an object. All the aforementioned processing steps are implemented in a single program, in contrast to standard commercial software, which divides the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential execution of the processing steps with predetermined control parameters. The paper presents practical results of fully automatic generation of orthomosaics both for images obtained by a metric Vexcel camera and for a block of images acquired by a non-metric UAV system.
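
    PhotoScan's matching code is proprietary; the sketch below shows the kind of automatic image-matching step the article describes, assuming OpenCV's ORB detector and hypothetical image files.

        import cv2

        img1 = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("frame_002.jpg", cv2.IMREAD_GRAYSCALE)
        if img1 is None or img2 is None:
            raise SystemExit("input frames not found")

        # Detect ORB features in both frames and keep cross-checked matches;
        # these tie points feed block orientation and dense matching.
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
        print(len(matches))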

  3. Licensing process for safety-critical software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland); Pulkkinen, U. [VTT Automation, Espoo (Finland)

    2000-12-01

    System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As a part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed, and to make the decision about granting the construction and operation permissions based on this evidence. The parties to the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), who can also acquire additional evidence using their own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and the licensee when planning the licensing of individual applications.

  4. Licensing process for safety-critical software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Korhonen, J.; Pulkkinen, U.

    2000-12-01

    System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As a part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed, and to make the decision about granting the construction and operation permissions based on this evidence. The parties to the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), who can also acquire additional evidence using their own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and the licensee when planning the licensing of individual applications. Many of the

  5. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Background: Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results: With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two-dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the analyzed samples.

  6. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault Detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri-partite graph that represents the structural model of the system. A component-based approach has been used to address issues such as system complexity and reconfigurability possibilities.
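
    To make the idea concrete, the following is a minimal Python sketch of structural redundancy detection, assuming a structural model given as constraint-to-variable incidences (all names hypothetical). The paper's tool uses a tri-partite, component-based graph; the sketch shows only the simpler bipartite core, where constraints left over after a maximum matching are candidates for residual generation in FDI.

```python
# Minimal sketch of structural redundancy detection, assuming a structural
# model given as constraint -> unknown-variable incidences (hypothetical data).
import networkx as nx
from networkx.algorithms import bipartite

# Constraints c1..c4 and the unknown variables they involve.
structural_model = {
    "c1": ["x1"],
    "c2": ["x1", "x2"],
    "c3": ["x2"],
    "c4": ["x2"],          # more constraints than needed to solve for x1, x2
}

G = nx.Graph()
constraints = list(structural_model)
for c, variables in structural_model.items():
    for v in variables:
        G.add_edge(c, v)

# A maximum matching pairs each unknown with one constraint that solves it.
matching = bipartite.maximum_matching(G, top_nodes=constraints)

# Constraints left unmatched are redundant: they can be turned into
# analytical redundancy relations (residuals) for fault detection.
redundant = [c for c in constraints if c not in matching]
print("redundant constraints usable for FDI:", redundant)
```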

  7. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from Interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety critical software component will not lead to a fault in a more safety critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  8. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods with an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has created the need to apply the FMEA methodology to software-based systems as well. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  9. Automated transportation management system (ATMS) software project management plan (SPMP)

    Energy Technology Data Exchange (ETDEWEB)

    Weidert, R.S., Westinghouse Hanford

    1996-05-20

    The Automated Transportation Management System (ATMS) Software Project Management Plan (SPMP) is the lead planning document governing the life cycle of the ATMS and its integration into the Transportation Information Network (TIN). This SPMP defines the project tasks, deliverables, and high-level schedules involved in developing the client/server ATMS software.

  10. Automating the Fireshed Assessment Process with ArcGIS

    Science.gov (United States)

    Alan Ager; Klaus Barber

    2006-01-01

    A library of macros was developed to automate the Fireshed process within ArcGIS. The macros link a number of vegetation simulation and wildfire behavior models (FVS, SVS, FARSITE, and FlamMap) with ESRI geodatabases, desktop software (Access, Excel), and ArcGIS. The macros provide for (1) an interactive linkage between digital imagery, vegetation data, FVS-FFE, and...

  11. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    Computer-Aided Software Engineering (CASE) is a new software engineering discipline: a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and gives a history of CASE. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  12. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a set level of usability. The research uses the methodology of structural analysis, methods of multicriteria optimization and decision-making theory, the convolution method, and scientific methods of analysis and analogy. The result of the work is a model for automated evaluation and assurance of software usability that allows not only estimating the current level of usability during every iteration of agile development but also managing the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.
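
    The abstract does not give the model itself; purely as an illustration of the convolution (weighted-sum) idea it names, here is a minimal Python sketch with hypothetical criteria, weights and target level.

```python
# Illustrative sketch of a convolution (weighted-sum) usability score with a
# target-level check; criteria, weights and the threshold are hypothetical.
criteria = {                      # normalized scores in [0, 1] per iteration
    "learnability": 0.82,
    "efficiency":   0.74,
    "error_rate":   0.91,         # already inverted: higher is better
    "satisfaction": 0.68,
}
weights = {"learnability": 0.3, "efficiency": 0.3,
           "error_rate": 0.2, "satisfaction": 0.2}

usability = sum(weights[k] * criteria[k] for k in criteria)  # convolution
required_level = 0.75             # target set for the current iteration

print(f"usability = {usability:.3f}")
if usability < required_level:
    # flag the weakest weighted criterion for the next agile iteration
    worst = min(criteria, key=lambda k: weights[k] * criteria[k])
    print("below target; improve:", worst)
```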

  13. Possibilities for using software tools in the process of secuirty design

    Directory of Open Access Journals (Sweden)

    Ladislav Mariš

    2013-07-01

    The authors deal with the use of software to support the process of security design. The article proposes a theoretical basis for introducing software tools into design activities, based on selected design standards for electrical safety systems and their application to design solutions, especially in drawing documentation. The article is intended to help project team members use selected software tools and thereby increase the degree of automation of design activities.

  14. Towards automated construction of dependable software/hardware systems

    Energy Technology Data Exchange (ETDEWEB)

    Yakhnis, A.; Yakhnis, V. [Pioneer Technologies & Rockwell Science Center, Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on the automated construction of dependable computer architecture systems. The outline of this report is: examples of software/hardware systems; dependable systems; partial delivery of dependability; proposed approach; removing obstacles; advantages of the approach; criteria for success; current progress of the approach; and references.

  15. ENHANCING SEISMIC CALIBRATION RESEARCH THROUGH SOFTWARE AUTOMATION AND SCIENTIFIC INFORMATION MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S; Dodge, D A; Ganzberger, M D; Hauk, T F; Matzel, E M

    2008-07-03

    The National Nuclear Security Administration (NNSA) Ground-Based Nuclear Explosion Monitoring Research and Development (GNEMRD) Program at LLNL continues to make significant progress enhancing the process of deriving seismic calibrations and performing scientific integration, analysis, and information management with software automation tools. Our tool efforts address the problematic issues of very large datasets and varied formats encountered during seismic calibration research. New information management and analysis tools have resulted in demonstrated gains in efficiency of producing scientific data products and improved accuracy of derived seismic calibrations. The foundation of a robust, efficient data development and processing environment is comprised of many components built upon engineered versatile libraries. We incorporate proven industry 'best practices' throughout our code and apply source code and bug tracking management as well as automatic generation and execution of unit tests for our experimental, development and production lines. Significant software engineering and development efforts have produced an object-oriented framework that provides database centric coordination between scientific tools, users, and data. Over a half billion parameters, signals, measurements, and metadata entries are all stored in a relational database accessed by an extensive object-oriented multi-technology software framework that includes stored procedures, real-time transactional database triggers and constraints, as well as coupled Java and C++ software libraries to handle the information interchange and validation requirements. Significant resources were applied to schema design to enable management of processing methods and station parameters, responses and metadata. This allowed for the development of merged ground-truth (GT) data sets compiled by the NNSA labs and AFTAC that include hundreds of thousands of events and tens of millions of arrivals. The

  16. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
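
    As an illustration of what data-driven sequencing can mean in practice, the sketch below steps through a GN&C mode table loaded as data rather than hard-coded logic; the segments, modes and events are hypothetical and not taken from Orion.

```python
# A minimal sketch of table-driven mode sequencing in the spirit described
# above; mission segments, GN&C modes, and events here are hypothetical.
MISSION_PLAN = [
    # (segment, GN&C mode, event that ends the segment)
    ("ascent",       "ASCENT_GUIDANCE", "MECO"),
    ("orbit_insert", "ORBIT_BURN",      "BURN_COMPLETE"),
    ("coast",        "ATTITUDE_HOLD",   "DEORBIT_CMD"),
    ("entry",        "ENTRY_GUIDANCE",  "CHUTE_DEPLOY"),
]

class Sequencer:
    """Steps through GN&C modes as loaded from data, not hard-coded logic."""
    def __init__(self, plan):
        self.plan, self.index = plan, 0

    @property
    def mode(self):
        return self.plan[self.index][1]

    def on_event(self, event):
        # Advance only on the event the data table expects; crew or ground
        # could also command a jump, preserving human control over automation.
        if event == self.plan[self.index][2] and self.index + 1 < len(self.plan):
            self.index += 1

seq = Sequencer(MISSION_PLAN)
seq.on_event("MECO")
print(seq.mode)   # -> ORBIT_BURN
```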

  17. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  18. AUTOMATED PROCESSING OF DAIRY PRODUCT MICROPHOTOS USING IMAGEJ AND STATISTICA

    Directory of Open Access Journals (Sweden)

    V. K. Bitiukov

    2014-01-01

    The article discusses the construction of algorithms for the automated processing of microphotos of dairy products. Automated processing of microphotos of dairy products is relevant in studies of the degree of homogenization. Microphotos of dairy products contain information about the distribution of fat globules by mass fraction. Today there are software products offering image processing that relieve researchers from routine manual data processing, but they need to be adapted to process microphotos of dairy products. In this paper we propose to use the application package ImageJ to process image files taken with a digital microscope, and the software package Statistica to calculate the statistical characteristics. The processing algorithm consists of successive stages of conversion to gray scale, scaling, filtering, binarization, object recognition, and statistical processing of the recognition results. The result of the implemented data processing algorithm is the distribution function of the fat globules in terms of volume or mass fraction, as well as the statistical parameters of the distribution (the mathematical expectation, variance, and skewness and kurtosis coefficients). For the inspection and debugging of the algorithm, experimental studies were carried out: farm milk was homogenized at different homogenization pressures, microphotos were made for each sample, and image processing was carried out in accordance with the proposed algorithm. The studies have shown the effectiveness and feasibility of the proposed algorithm, implemented as a script for ImageJ that then sends the data to a file for the software package Statistica.
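
    The pipeline above was implemented as an ImageJ script; purely as an illustration of the same stages, here is a Python equivalent using scikit-image and SciPy (the file name and parameters are assumptions, not the authors' code).

```python
# Illustrative Python equivalent of the pipeline described above
# (grayscale -> filter -> binarize -> recognize -> statistics).
import numpy as np
from scipy.stats import skew, kurtosis
from skimage import color, filters, io, measure

# Hypothetical input file; any RGB micrograph of homogenized milk would do.
image = io.imread("milk_microphoto.png")
gray = color.rgb2gray(image)                      # conversion to gray scale
smooth = filters.gaussian(gray, sigma=1.0)        # filtering
binary = smooth < filters.threshold_otsu(smooth)  # binarization: dark globules

labels = measure.label(binary)                    # object recognition
regions = measure.regionprops(labels)
diameters = np.array([r.equivalent_diameter for r in regions])

# Statistical characteristics of the fat-globule size distribution.
print("mean:", diameters.mean(), "variance:", diameters.var())
print("skewness:", skew(diameters), "kurtosis:", kurtosis(diameters))
```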

  19. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  20. Flexible test automation a software framework for easily developing measurement applications

    CERN Document Server

    Arpaia, Pasquale; De Matteis, Ernesto

    2014-01-01

    In the laboratory management of an industrial test division, a test laboratory, or a research center, one of the main activities is producing suitable software for automatic benches that satisfies a given set of requirements. This activity is particularly costly and burdensome when test requirements are variable over time. If the batches of objects are small and occur frequently, the activity of measurement automation becomes predominant with respect to test execution. Flexible Test Automation shows the development of a software framework as a useful solution to meet this need. The framework supports the user in producing measurement applications for a wide range of requirements with low effort and development time.

  1. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    Science.gov (United States)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with
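
    As a toy illustration of the format QA/QC step with automated fix attempts described above, the following Python sketch renames commonly mislabelled columns and reports problems that require resubmission; the column names and rules are invented, not the actual AmeriFlux checks.

```python
# Toy sketch of a format QA/QC pass with an automated fix attempt, in the
# spirit of the pipeline above; column names and rules are hypothetical.
import pandas as pd

RENAMES = {"timestamp": "TIMESTAMP_START", "fc": "FC", "Ta": "TA"}

def format_qaqc(path):
    df = pd.read_csv(path)
    issues = []
    # Minor problems are fixed automatically and reported.
    fixable = {c: RENAMES[c] for c in df.columns if c in RENAMES}
    if fixable:
        df = df.rename(columns=fixable)
        issues.append(f"auto-fixed column names: {fixable}")
    # Problems needing the data provider are reported, never guessed at.
    if "TIMESTAMP_START" not in df.columns:
        issues.append("missing TIMESTAMP_START: resubmission required")
    elif df["TIMESTAMP_START"].duplicated().any():
        issues.append("duplicated timestamps: resubmission required")
    return df, issues

# df, report = format_qaqc("US-Xyz_HH_202001010000.csv")  # hypothetical file
```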

  2. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of a process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus-based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe-based fast data acquisition system for the equipped diagnostics; (2) a Modbus-based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
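
    For readers unfamiliar with the Modbus backbone mentioned above, the sketch below builds a raw Modbus RTU "read holding registers" request with its CRC-16; the slave address and register are hypothetical, and a real deployment would use the LabVIEW Modbus API or a library rather than hand-built frames.

```python
# Minimal sketch of Modbus RTU framing as used on an RS485 multi-drop bus;
# the slave address and register below are hypothetical.
import struct

def crc16_modbus(frame: bytes) -> bytes:
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return struct.pack("<H", crc)          # CRC is sent low byte first

def read_holding_registers(slave: int, addr: int, count: int) -> bytes:
    pdu = struct.pack(">BBHH", slave, 0x03, addr, count)  # function 0x03
    return pdu + crc16_modbus(pdu)

# Query 2 registers from a (hypothetical) filament power supply at slave 5.
request = read_holding_registers(slave=5, addr=0x0000, count=2)
print(request.hex(" "))
```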

  4. Automated Theorem Proving in High-Quality Software Design

    Science.gov (United States)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car brakes to plant control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the areas of aviation, (nuclear) power plants or (chemical) plant control, where even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the areas of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in these areas are not only subject to safety considerations; security issues are important as well. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in software engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification, and uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal 'cloning' of the smart cards of D2 GSM handsets, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure the high quality of the software and lead to truly safe and secure systems. In this paper, we look at how far automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  5. An Automation Survival Guide for Media Centers.

    Science.gov (United States)

    Whaley, Roger E.

    1989-01-01

    Reviews factors that should affect the decision to automate a school media center and offers suggestions for the automation process. Topics discussed include getting the library collection ready for automation, deciding what automated functions are needed, evaluating software vendors, selecting software, and budgeting. (CLB)

  6. TreeRipper web application: towards a fully automated optical tree recognition software

    Directory of Open Access Journals (Sweden)

    Hughes Joseph

    2011-05-01

    Background: Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results: TreeRipper is a simple website for the fully automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then determined to detect the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Conclusions: Although the diversity of ways phylogenies have been illustrated makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.

  7. The Business Case for Automated Software Engineering

    Science.gov (United States)

    Menzies, Tim; Elrawas, Oussama; Hihn, Jairus M.; Feather, Martin S.; Madachy, Ray; Boehm, Barry

    2007-01-01

    Adoption of advanced automated SE (ASE) tools would be more favored if a business case could be made that these tools are more valuable than alternate methods. In theory, software prediction models can be used to make that case. In practice, this is complicated by the 'local tuning' problem: normally, predictors for software effort, defects, and threats use local data to tune their predictions, and such local tuning data is often unavailable. This paper shows that assessing the relative merits of different SE methods need not require precise local tunings. STAR1 is a simulated annealer plus a Bayesian post-processor that explores the space of possible local tunings within software prediction models. STAR1 ranks project decisions by their effects on effort, defects, and threats. In experiments with NASA systems, STAR1 found one project where ASE tools were essential for minimizing effort, defects, and threats, and another project where ASE tools were merely optional.
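
    The STAR1 internals are not given in the abstract; the following is a minimal simulated-annealing sketch over a toy effort model with hypothetical tuning ranges, meant only to illustrate searching a space of local tunings.

```python
# Conceptual sketch of exploring a space of local tunings with simulated
# annealing; the model and its coefficient ranges are hypothetical stand-ins,
# not the actual STAR1 predictors.
import math, random

def predicted_effort(tunings):
    a, b = tunings                      # hypothetical tuning parameters
    return a * (100 ** b)               # toy effort model, 100-KLOC project

def anneal(steps=5000, t0=10.0):
    state = (random.uniform(2, 4), random.uniform(0.9, 1.3))
    best = state
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9
        cand = (min(4, max(2, state[0] + random.gauss(0, 0.1))),
                min(1.3, max(0.9, state[1] + random.gauss(0, 0.02))))
        delta = predicted_effort(cand) - predicted_effort(state)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            state = cand                # accept downhill, sometimes uphill
        if predicted_effort(state) < predicted_effort(best):
            best = state
    return best

print("effort-minimizing tuning:", anneal())
```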

  8. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a simulation-based method using the software Autodesk TruPlan and TruFiber is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data have been taken into consideration prior to tool path generation to achieve a high success rate in manufacturing.

  9. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  10. Software development based on high speed PC oscilloscope for automated pulsed magnetic field measurement system

    International Nuclear Information System (INIS)

    Sun Yuxiang; Shang Lei; Li Ji; Ge Lei

    2011-01-01

    This paper introduces the development of software based on a high-speed PC oscilloscope for an automated pulsed magnetic field measurement system. The design improves on its predecessor; a high-speed virtual oscilloscope has been used in this field for the first time. In the design, automatic data acquisition, data processing, data analysis and storage have been realized. Automated point checking reduces the workload, and the use of a precise motion bench increases the positioning accuracy. The software gets the data from the PC oscilloscope by calling DLLs and includes the oscilloscope functions, such as trigger, range, and sample rate settings. Spline interpolation and a bandstop filter are used to denoise the signals. The core of the software is a state machine which controls the motion of the stepper motors and acquires and stores the data automatically. NI Vision Acquisition Software and the Database Connectivity Toolkit provide video surveillance of the laboratory and MySQL database connectivity. The raw and processed signals are compared in this paper; the waveform is greatly improved by the signal processing. (authors)
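
    As an illustration of the two denoising steps named above, here is a small SciPy sketch (standing in for the LabVIEW code) that applies a bandstop filter around an assumed 50 Hz interference band and then a smoothing spline; the sample rate and band edges are assumptions.

```python
# Sketch of the denoising steps named above: a bandstop (notch) filter
# followed by a smoothing spline; all signal parameters are assumed.
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.signal import butter, filtfilt

fs = 10_000.0                                  # assumed sample rate, Hz
t = np.arange(0, 0.2, 1 / fs)
signal = np.exp(-t / 0.05) + 0.2 * np.sin(2 * np.pi * 50 * t)  # pulse + hum

# Bandstop around 50 Hz mains interference.
b, a = butter(4, [45, 55], btype="bandstop", fs=fs)
filtered = filtfilt(b, a, signal)

# Smoothing spline to suppress residual broadband noise.
spline = UnivariateSpline(t, filtered, s=len(t) * 1e-4)
smoothed = spline(t)
```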

  11. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

    The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving a software development process by shortening the testing phase in the software development life cycle. In this project, the existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  12. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that automates the preparation of educational process accounting documents. The article describes the problem of preparing such documents. It was decided to design and create the information system in the Microsoft Access environment. The result is four types of reports obtained by using the developed system. The use of this system allows automating the process and reducing the effort required to prepare accounting documents. All reports were implemented in the Microsoft Excel software product and can be used for further analysis and processing.

  13. An automation of design and modelling tasks in NX Siemens environment with original software - generator module

    Science.gov (United States)

    Zbiciak, M.; Grabowik, C.; Janik, W.

    2015-11-01

    Nowadays the constructional design process is almost exclusively aided with CAD/CAE/CAM systems. It is estimated that nearly 80% of design activities have a routine nature. These routine design tasks are highly susceptible to automation. Design automation is usually achieved with API tools which allow building original software responsible for automating different engineering activities. In this paper, original software worked out to automate engineering tasks at the stage of designing a product's geometrical shape is presented. The elaborated software works exclusively in the NX Siemens CAD/CAM/CAE environment and was prepared in Microsoft Visual Studio with application of the .NET technology and the NX SNAP library. The software functionality allows designing and modelling of spur and helicoidal involute gears; moreover, it is possible to estimate relative manufacturing costs. With the Generator module it is possible to design and model both standard and non-standard gear wheels. The main advantage of a model generated in this way is its better representation of the involute curve in comparison to those drawn in the standard tools of specialized CAD systems. This comes from the fact that in CAD systems an involute curve is usually drawn through 3 points, corresponding to points located on the addendum circle, the reference diameter of the gear and the base circle respectively. In the Generator module the involute curve is drawn through 11 involute points which are located on and above the base and addendum circles, so the 3D gear wheel models are highly accurate. Application of the Generator module makes the modelling process very rapid: the gear wheel modelling time is reduced to several seconds. During the conducted research, an analysis of the differences between the standard 3-point and the 11-point involutes was made. The results and the conclusions drawn from the analysis are shown in detail.
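
    The multi-point involute construction described above can be sketched in a few lines; the following Python snippet generates n points of an involute flank between the base and addendum circles (the radii are hypothetical example values, and the Generator module itself also places points above these circles).

```python
# Sketch of generating an involute flank by n points between the base and
# addendum circles, as described above; radii are hypothetical.
import numpy as np

def involute_points(r_base, r_addendum, n=11):
    """Points (x, y) on an involute of the base circle, up to the addendum."""
    t_max = np.sqrt((r_addendum / r_base) ** 2 - 1)   # roll angle at addendum
    t = np.linspace(0.0, t_max, n)
    x = r_base * (np.cos(t) + t * np.sin(t))
    y = r_base * (np.sin(t) - t * np.cos(t))
    return np.column_stack([x, y])

pts = involute_points(r_base=46.98, r_addendum=55.0, n=11)
print(pts.round(3))
```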

  14. PRISM, Processing and Review Interface for Strong Motion Data Software

    Science.gov (United States)

    Kalkan, E.; Jones, J. M.; Stephens, C. D.; Ng, P.

    2016-12-01

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the U.S., calls for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. PRISM automates the processing of strong-motion records by providing batch-processing capabilities. The PRISM software is platform-independent (coded in Java), open-source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a graphical user interface (GUI) for manual review and processing. To facilitate the use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and GUI components) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X and Windows. PRISM was designed to be flexible and extensible in order to accommodate implementation of new processing techniques. Input to PRISM currently is limited to data files in the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) V0 format, so that all retrieved acceleration time series need to be converted to this format. Output products include COSMOS V1, V2 and V3 files as: (i) raw acceleration time series in physical units with mean removed (V1), (ii) baseline-corrected and filtered acceleration, velocity, and displacement time series (V2), and (iii) response spectra, Fourier amplitude spectra and common earthquake-engineering intensity measures (V3). A thorough description of the record processing features supported by PRISM is presented with examples and validation results. All computing features have been

  15. Automating Test Activities: Test Cases Creation, Test Execution, and Test Reporting with Multiple Test Automation Tools

    OpenAIRE

    Loke Mun Sei

    2015-01-01

    Software testing has become a mandatory process in assuring software product quality. Hence, test management is needed to manage the test activities conducted in the software test life cycle. This paper discusses the challenges faced in the software test life cycle and how the test processes and test activities, mainly test case creation, test execution, and test reporting, are managed and automated using several test automation tools, i.e. Jira, ...

  16. Preparing for Automated Derivation of Products in a Software Product Line

    National Research Council Canada - National Science Library

    McGregor, John D

    2005-01-01

    ... to bring a product to market, or through other production improvements. Business goals such as these make automated product derivation an appealing strategy to a software product line organization...

  17. Software package to automate the design and production of translucent building structures made of pvc

    Directory of Open Access Journals (Sweden)

    Petrova Irina Yur’evna

    2016-08-01

    The article describes the features of the design and production of translucent building structures made of PVC. An analysis of the automation systems for this process currently on the market is carried out, and their advantages and disadvantages are identified. Based on this analysis, a set of requirements for automation systems for the design and production of translucent building structures made of PVC is formulated, together with the basic entities involved in those business processes. The necessary functions for the main application and for the dealers' application are specified. The main application is based on the technological platform 1C:Enterprise 8.2. The dealers' module is a .NET application developed with Microsoft Visual Studio and Microsoft SQL Server, because these software products have client versions free for end users (.NET Framework 4.0 Client Profile and Microsoft SQL Server 2008 Express). The features of the developed software complex's implementation are described and the relevant charts are given. The scheme of system deployment and the protocols of data exchange between the 1C server, the 1C client and the dealer are presented, as are the functions supported by the 1C module and the .NET module. The article describes the content of the class library developed for the .NET module and specifies how the two applications are integrated in a single software package. The features of the GUI organization are described with corresponding screenshots. Possible ways of further developing the described software complex are presented, and a conclusion is drawn about its competitiveness and the expediency of further research.

  18. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    Science.gov (United States)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, light-weight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.

  19. Modularity and Architecture of PLC-based Software for Automated Production Systems: An analysis in industrial companies

    OpenAIRE

    B. Vogel-Heuser, J. Fischer, S. Feldmann, S. Ulewicz, S. Rösch

    2018-01-01

    Adaptive and flexible production systems require modular and reusable software, especially considering their long-term life cycle of up to 50 years. SWMAT4aPS, an approach to measure Software Maturity for automated Production Systems, is introduced. The approach identifies weaknesses and strengths of various companies' solutions for the modularity of software in the design of automated Production Systems (aPS). At first, a self-assessed questionnaire is used to evaluate a large number of companies ...

  20. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export the analysis. The automated (and key) component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies, ...) or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
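
    The key automated step, matching detected peaks against catalogued frequencies, can be illustrated with a toy Python sketch; the frequencies, tolerance and species labels below are invented and are not SPECdata's actual database queries.

```python
# Toy version of the database query step described above: assign observed
# peaks to the nearest catalogued transition within a frequency tolerance.
import numpy as np

catalog = np.array([9172.3, 18344.6, 23694.5])        # MHz, invented lines
species = ["HC3N J=1-0", "HC3N J=2-1", "NH3 (1,1)"]   # invented labels

peaks = np.array([9172.28, 15012.77, 23694.54])        # detected features
tolerance = 0.05                                       # MHz

idx = np.searchsorted(catalog, peaks)
for f, i in zip(peaks, idx):
    # Compare against the nearest catalogue neighbours on either side.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(catalog)]
    j = min(candidates, key=lambda j: abs(catalog[j] - f))
    if abs(catalog[j] - f) <= tolerance:
        print(f"{f:10.2f} MHz -> {species[j]}")
    else:
        print(f"{f:10.2f} MHz -> unassigned (candidate new species)")
```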

  1. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  2. AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data

    Directory of Open Access Journals (Sweden)

    Daniel Scheffler

    2017-07-01

    Geospatial co-registration is a mandatory prerequisite when dealing with remote sensing data. Inter- or intra-sensoral misregistration will negatively affect any subsequent image analysis, specifically when processing multi-sensoral or multi-temporal data. In recent decades, many algorithms have been developed to enable manual, semi- or fully automatic displacement correction. Especially in the context of big data processing and the development of automated processing chains that aim to be applicable to different remote sensing systems, there is a strong need for efficient, accurate and generally usable co-registration. Here, we present AROSICS (Automated and Robust Open-Source Image Co-Registration Software), a Python-based open-source software package including an easy-to-use user interface for automatic detection and correction of sub-pixel misalignments between various remote sensing datasets. It is independent of spatial or spectral characteristics and robust against high degrees of cloud coverage and spectral and temporal land cover dynamics. The co-registration is based on phase correlation for sub-pixel shift estimation in the frequency domain, utilizing the Fourier shift theorem in a moving-window manner. A dense grid of spatial shift vectors can be created and automatically filtered by combining various validation and quality estimation metrics. Additionally, the software supports the masking of, e.g., clouds and cloud shadows to exclude such areas from spatial shift detection. The software has been tested on more than 9000 satellite images acquired by different sensors. The results are evaluated exemplarily for two inter-sensoral and two intra-sensoral use cases and show registration results in the sub-pixel range, with root mean square error fits around 0.3 pixels and better.
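
    A minimal sketch of the phase-correlation core named above: estimate the integer pixel shift between two windows from the cross-power spectrum. AROSICS adds sub-pixel refinement, tie-point grids, filtering and masking on top of this.

```python
# Minimal phase-correlation sketch: recover the shift between two image
# windows via the normalized cross-power spectrum (Fourier shift theorem).
import numpy as np

def phase_correlation_shift(reference, target):
    F, G = np.fft.fft2(reference), np.fft.fft2(target)
    R = G * np.conj(F)
    R /= np.abs(R) + 1e-12                      # keep phase, drop magnitude
    corr = np.abs(np.fft.ifft2(R))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the centre correspond to negative shifts (FFT wrap-around).
    if dy > reference.shape[0] // 2:
        dy -= reference.shape[0]
    if dx > reference.shape[1] // 2:
        dx -= reference.shape[1]
    return dy, dx

rng = np.random.default_rng(0)
ref = rng.random((128, 128))
shifted = np.roll(ref, shift=(3, -5), axis=(0, 1))   # known misregistration
print(phase_correlation_shift(ref, shifted))         # -> (3, -5)
```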

  3. A software for computer automated radioactive particle tracking

    International Nuclear Information System (INIS)

    Vieira, Wilson S.; Brandao, Luis E.; Braz, Delson

    2008-01-01

    TRACO-1 is the first software developed in Brazil for the optimization and diagnosis of multiphase chemical reactors employing the technique known as Computer Automated Radioactive Particle Tracking, whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be reached. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MicroShield, version 5.05, to obtain values of photon counting rates at four detector surfaces. These counting rates were related to the emission of gamma radiation from a radioactive source because they are the main TRACO-1 input variables. Although the results found so far are incipient, their analysis suggests that the tracking of a radioactive source using TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)
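
    The abstract does not show the reconstruction step, but the idea behind computer automated radioactive particle tracking can be sketched as follows: find the source position whose predicted detector count rates best match the measured ones. The detector layout and the simple inverse-square response model below are assumptions for illustration; real systems (including the abstract's MicroShield study) derive the count-rate response far more carefully.

```python
# Conceptual sketch of particle position reconstruction from detector count
# rates; the geometry and the 1/r^2 response model are illustrative only.
import numpy as np

detectors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # m
k = 1e4                                       # hypothetical source strength

def predicted(pos):
    r2 = ((detectors - pos) ** 2).sum(axis=1)
    return k / r2                             # inverse-square response model

true_pos = np.array([0.3, 0.7])
measured = predicted(true_pos)                # noiseless demonstration data

# Grid search over the vessel cross-section for the best-fitting position.
xs = ys = np.linspace(0.05, 0.95, 181)
best, best_err = None, np.inf
for x in xs:
    for y in ys:
        err = ((predicted(np.array([x, y])) - measured) ** 2).sum()
        if err < best_err:
            best, best_err = (x, y), err
print("reconstructed position:", best)        # -> close to (0.3, 0.7)
```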

  4. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open-source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction. This includes mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one
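
    Of the reduction steps listed above, radial averaging is the central one; here is a minimal NumPy sketch that bins detector pixels by their distance from the beam centre. The geometry and data are placeholders, and real reduction also applies the mask, normalization and error propagation mentioned above.

```python
# Minimal radial-averaging sketch: collapse a 2-D detector image into a 1-D
# profile by binning pixels on distance from the (assumed) beam centre.
import numpy as np

def radial_average(image, center, n_bins=200):
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - center[0], y - center[1])   # pixel distance to centre
    bins = np.linspace(0.0, r.max(), n_bins + 1)
    which = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
    sums = np.bincount(which, weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(which, minlength=n_bins)
    radii = 0.5 * (bins[:-1] + bins[1:])         # bin centres
    return radii, sums / np.maximum(counts, 1)   # mean intensity per ring

detector = np.random.poisson(100, size=(512, 512)).astype(float)
radii, intensity = radial_average(detector, center=(256.0, 256.0))
```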

  5. Software complex AS (automation of spectrometry). User interface of experiment automation system implementation

    International Nuclear Information System (INIS)

    Astakhova, N.V.; Beskrovnyj, A.I.; Bogdzel', A.A.; Butorin, P.E.; Vasilovskij, S.G.; Gundorin, N.A.; Zlokazov, V.B.; Kutuzov, S.A.; Salamatin, I.M.; Shvetsov, V.N.

    2003-01-01

    An instrumental software complex for the automation of spectrometry (AS) has been developed that enables prompt realization of experiment automation systems for spectrometers which use data buffering. In the development, new methods of programming and building automation systems, together with novel network technologies, were employed. It is suggested that programs to schedule and conduct experiments be based on a parametric model of the spectrometer, an approach that makes it possible to write programs suitable for any FLNP (Frank Laboratory of Neutron Physics) spectrometer and experimental technique applied, and to use different hardware interfaces for introducing the spectrometric data into the data acquisition system. The article describes the possibilities provided to the user in the fields of scheduling and control of the experiment, data viewing, and control of the spectrometer parameters. The current spectrometer state, programs and experimental data can be presented on the Internet in the form of dynamically formed protocols and graphs, and the experiment can be controlled via the Internet. To use the means of the Internet on the client side, applied programs are not needed; it suffices to know how to use the two programs to carry out experiments in the automated mode. The package is designed for experiments in condensed matter and nuclear physics and is ready for use. (author)

  6. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  7. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  8. BOA: Framework for Automated Builds

    CERN Document Server

    Ratnikova, N

    2003-01-01

    Managing large-scale software products is a complex software engineering task. The automation of the software development, release and distribution process is most beneficial in the large collaborations, where the big number of developers, multiple platforms and distributed environment are typical factors. This paper describes Build and Output Analyzer framework and its components that have been developed in CMS to facilitate software maintenance and improve software quality. The system allows to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of the existing versions.

  10. Application of a path sensitizing method on automated generation of test specifications for control software

    International Nuclear Information System (INIS)

    Morimoto, Yuuichi; Fukuda, Mitsuko

    1995-01-01

    An automated generation method for test specifications has been developed for sequential control software in plant control equipment. Sequential control software can be represented as sequential circuits, and the control software implemented in control equipment is designed from these circuit diagrams. In logic testing of VLSIs, path sensitizing methods are widely used to generate test specifications, but these methods generate test specifications for a single point in time only and cannot be applied directly to sequential control software. The basic idea of the proposed method is as follows. Specifications of each logic operator in the diagrams are defined in the software design process. Test specifications for each operator in the control software can therefore be determined from these specifications, and the validity of the software can be judged by inspecting all of the operators in the logic circuit diagrams. Candidates for sensitized paths, along which test data for each operator propagate, can be generated by the path sensitizing method. To confirm the feasibility of the method, it was applied experimentally to control software in digital control equipment. The program generated test specifications correctly, confirming the feasibility of the method. (orig.) (3 refs., 7 figs.)
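
    To illustrate the core idea, the following minimal Python sketch (a toy under invented names and circuit structure, not the authors' implementation) sensitizes a path through an AND and an OR gate so that a test value placed on one input propagates unmasked to the output:

        def sensitize_and(side_inputs):
            # To propagate a value through an AND gate, every side input
            # must take the non-controlling value 1.
            return {name: 1 for name in side_inputs}

        def sensitize_or(side_inputs):
            # For an OR gate the non-controlling side value is 0.
            return {name: 0 for name in side_inputs}

        # Toy circuit: out = OR(AND(a, b), c). Sensitize the path a -> out.
        assignment = {}
        assignment.update(sensitize_and(["b"]))   # b = 1
        assignment.update(sensitize_or(["c"]))    # c = 0

        # With b and c fixed, the output follows 'a' directly, so this pair
        # of vectors exercises the sensitized path for both input values.
        for a in (0, 1):
            vector = dict(assignment, a=a)
            out = (vector["a"] and vector["b"]) or vector["c"]
            print(vector, "->", out)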

  11. Accuracy of automated software-guided detection of significant coronary artery stenosis by CT angiography: comparison with invasive catheterisation

    International Nuclear Information System (INIS)

    Anders, Katharina; Uder, Michael; Achenbach, Stephan; Petit, Isabel; Daniel, Werner G.; Pflederer, Tobias

    2013-01-01

    True automated detection of coronary artery stenoses might be useful whenever expert evaluation is not available, or as a "second reader" to enhance diagnostic confidence. We evaluated the accuracy of a PC-based stenosis detection tool alone and combined with expert interpretation. One hundred coronary CT angiography datasets were evaluated with the automated software alone, by manual interpretation (axial images, multiplanar reformations and maximum intensity projections in free double-oblique planes), and by expert interpretation aware of the automated findings. Stenoses ≥ 50 % were noted per-vessel and per-patient, and compared with invasive angiography. Automated post-processing was successful in 90 % of patients (88 % of vessels). When excluding uninterpretable datasets, per-patient sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) were 89 %, 79 %, 74 % and 92 % (per-vessel: 82 %, 85 %, 48 % and 96 %). All 100 datasets were evaluable by expert interpretation. Per-patient sensitivity, specificity, PPV and NPV were 95 %, 95 %, 93 % and 97 % (per-vessel: 89 %, 98 %, 88 % and 98 %). Knowing the results of automated interpretation did not improve the performance of expert readers. Automated off-line post-processing of coronary CT angiography shows adequate sensitivity, but relatively low specificity in coronary stenosis detection. It does not increase accuracy of expert interpretation. Failure of post-processing in 10 % of all patients necessitates additional manual image work-up. (orig.)

  12. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters, outer wall to outer wall, were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center-line adjustment for the semi-automated method, and the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to make the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels, with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is accurate and considerably less time-consuming.

  13. A service oriented architecture for the implementation of the Personal Software Process

    Directory of Open Access Journals (Sweden)

    Erick Salinas

    2011-06-01

    Full Text Available This work describes a service-oriented architecture for a software application whose objective is to facilitate the implementation of the Personal Software Process by a development team or an organization. Among the relevant characteristics of this software are extensibility and independence: new tools can be added to the software development process and integrated with the Personal Software Process with maximum independence from operating systems and programming languages. The implemented software collects the data needed for the Personal Software Process almost completely automatically; the administrator only has to classify the errors that may occur when a particular programming language is used, among other small tasks. This ease of use helps make the implementation of the Personal Software Process successful with little effort required from the members of the development team.
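
    As a hedged illustration of the near-automatic data collection described above, a defect-logging routine might look like the following Python sketch; the record fields, file name and error-type table are assumptions for illustration, not the paper's actual design:

        import datetime
        import json

        # Administrator-maintained classification of language-specific errors;
        # the mapping below is invented for illustration.
        ERROR_TYPES = {"NullPointerException": "function defect",
                       "SyntaxError": "syntax defect"}

        def log_defect(exc_name, phase, log_path="defects.jsonl"):
            record = {
                "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
                "type": ERROR_TYPES.get(exc_name, "unclassified"),
                "phase": phase,                      # e.g. "code" or "test"
            }
            with open(log_path, "a") as fh:
                fh.write(json.dumps(record) + "\n")  # one defect per line

        log_defect("SyntaxError", "code")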

  14. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  15. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    Natarajan Meghanathan

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  16. Computer software configuration management plan for the Honeywell modular automation system

    International Nuclear Information System (INIS)

    Cunningham, L.T.

    1997-01-01

    This document provides a computer software configuration management plan for the new Honeywell Modular Automation System (MAS) being installed in the Plutonium Finishing Plant (PFP). This type of system will be used to control new thermal stabilization furnaces, a vertical denitrator calciner, and a pyrolysis furnace.

  17. Development of a phantom to test fully automated breast density software – A work in progress

    International Nuclear Information System (INIS)

    Waade, G.G.; Hofvind, S.; Thompson, J.D.; Highnam, R.; Hogg, P.

    2017-01-01

    Objectives: Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role in stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Methods: Several different configurations of polyvinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw mammographic image data were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm3) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values of real breasts. Results: Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Conclusion: Our results are promising, as all phantoms yielded valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density and can be compressed to the same level as female breasts. Advances in knowledge: We are the first group to have produced deformable phantoms that are recognized as breasts by the Volpara software. - Highlights: • Several phantoms of different configurations were created. • Three methods to assess phantom density were implemented. • All phantoms were identified as breasts by the Volpara software. • Reducing phantom thickness caused a change in phantom density.

  18. Frameworks for Performing on Cloud Automated Software Testing Using Swarm Intelligence Algorithm: Brief Survey

    Directory of Open Access Journals (Sweden)

    Mohammad Hossain

    2018-04-01

    Full Text Available This paper surveys cloud-based automated software testing tools that can perform black-box testing, white-box testing, and unit and integration testing as a whole. We discuss a few of the available automated software testing frameworks on the cloud. These frameworks are found to be more efficient and cost effective because they execute test suites over a distributed cloud infrastructure. One framework's effectiveness was attributed to a module that accepts manual test cases from users and prioritizes them accordingly. Software testing, in general, accounts for as much as 50% of the total effort of a software development project. To lessen this effort, one of the frameworks discussed in this paper uses swarm intelligence algorithms: the Ant Colony Algorithm for complete path coverage to minimize time, and Bee Colony Optimization (BCO) for regression testing to ensure backward compatibility.
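
    As a generic illustration of how an ant-colony heuristic can order test cases so that coverage accumulates early (a sketch under invented test data, not the surveyed frameworks' actual algorithm), consider:

        import random

        # Hypothetical branches covered by each test case.
        coverage = {"t1": {1, 2}, "t2": {2, 3, 4}, "t3": {4, 5}, "t4": {1, 5, 6}}
        tests = list(coverage)
        pheromone = {t: 1.0 for t in tests}

        def build_order():
            remaining, order = list(tests), []
            while remaining:
                weights = [pheromone[t] for t in remaining]
                pick = random.choices(remaining, weights=weights)[0]
                remaining.remove(pick)
                order.append(pick)
            return order

        def early_coverage(order):
            # Sum of cumulative coverage rewards orders that cover branches early.
            seen, score = set(), 0
            for t in order:
                seen |= coverage[t]
                score += len(seen)
            return score

        best = max((build_order() for _ in range(50)), key=early_coverage)
        for _ in range(20):
            for t in tests:                        # pheromone evaporation
                pheromone[t] *= 0.9
            for rank, t in enumerate(best):        # reinforce early positions
                pheromone[t] += (len(tests) - rank) / len(tests)
            candidate = max((build_order() for _ in range(50)), key=early_coverage)
            best = max(best, candidate, key=early_coverage)
        print(best, early_coverage(best))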

  19. A software for computer automated radioactive particle tracking

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Wilson S.; Brandao, Luis E. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mails: wilson@ien.gov.br; brandao@ien.gov.br; Braz, Delson [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE)]. E-mail: delson@smb.lin.ufrj.br

    2008-07-01

    TRACO-1 is the first software developed in Brazil for optimization and diagnosis of multiphase chemical reactors employing the technique known as 'Computer Automated Radioactive Particle Tracking', whose main idea is to follow the movement of a point-like radioactive particle inside a vessel. Considering that this particle behaves similarly to the phase under investigation, important conclusions can be drawn. As a preliminary evaluation of TRACO-1, a simulation was carried out with the aid of a commercial software package called MICROSHIELD, version 5.05, to obtain photon counting rates at four detector surfaces. These counting rates, related to the emission of gamma radiation from a radioactive source, are the main TRACO-1 input variables. Although the results found so far are preliminary, their analysis suggests that the tracking of a radioactive source using TRACO-1 can succeed, but a better evaluation of the capabilities of this software will only be achieved after its application in real experiments. (author)

  20. Software systems for processing and analysis at the NOVA high-energy laser facility

    International Nuclear Information System (INIS)

    Auerbach, J.M.; Montgomery, D.S.; McCauley, E.W.; Stone, G.F.

    1986-01-01

    A typical laser interaction experiment at the NOVA high-energy laser facility produces in excess of 20 Mbytes of digitized data. Extensive processing and analysis of this raw data from a wide variety of instruments is necessary to produce results that can be readily used to interpret the experiment. Using VAX-based computer hardware, software systems have been set up to convert the digitized instrument output to physics quantities describing the experiment. A relational data-base management system is used to coordinate all levels of processing and analysis. Software development emphasizes structured design, flexibility, automation, and ease of use

  1. Experimental research control software system

    International Nuclear Information System (INIS)

    Cohn, I A; Kovalenko, A G; Vystavkin, A N

    2014-01-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and supervisors can review the scripts without trouble. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.

  2. Experimental research control software system

    Science.gov (United States)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, thus significantly reducing the effort of automating an experimental setup. In particular, minimal programming skills are required, and supervisors can review the scripts without trouble. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces. While the most widely used interfaces are already implemented, a simple framework is provided for fast implementation of new software and hardware interfaces. While the software is in continuous development, with new features being implemented, it is already used in our laboratory for automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU Public License.
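
    The modular interface layer described above might, as a minimal Python sketch (class and method names are illustrative assumptions, not the system's actual API), look like this:

        from abc import ABC, abstractmethod

        class Interface(ABC):
            # Minimal contract every hardware/software interface must satisfy.
            @abstractmethod
            def write(self, command): ...
            @abstractmethod
            def query(self, command): ...

        class DummyInterface(Interface):
            # Stand-in transport that lets scripts run without hardware.
            def write(self, command):
                print("->", command)
            def query(self, command):
                self.write(command)
                return "0.0"

        REGISTRY = {"dummy": DummyInterface}       # new transports register here

        def open_interface(kind):
            return REGISTRY[kind]()

        iface = open_interface("dummy")
        iface.write("SET_TEMP 0.3")                # e.g. a helium-3 stage setpoint
        print(iface.query("READ_TEMP?"))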

  3. Comparison of Automated Atlas Based Segmentation Software for postoperative prostate cancer radiotherapy

    Directory of Open Access Journals (Sweden)

    Grégory Delpon

    2016-08-01

    Full Text Available Automated atlas-based segmentation algorithms have the potential to reduce the variability in volume delineation. Several vendors offer software packages that are mainly used for cranial, head and neck, and prostate cases. The present study compares the contours produced by a radiation oncologist with the contours computed by different automated atlas-based segmentation algorithms for prostate bed cases, including femoral heads, bladder and rectum. Contour comparison was evaluated with different metrics, such as volume ratio, Dice coefficient and Hausdorff distance. Results depended on the volume of interest and showed some discrepancies between the different software packages. Automatic contours could be a good starting point for the delineation of organs, since efficient editing tools are provided by several vendors; they should become an important aid for organ-at-risk delineation in the next few years.
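
    The comparison metrics named above are straightforward to compute on binary masks; the following Python sketch (with invented toy masks standing in for a manual and an automated contour) shows all three:

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        # Toy binary masks standing in for a manual and an automated contour.
        a = np.zeros((64, 64), dtype=bool); a[20:40, 20:40] = True
        b = np.zeros((64, 64), dtype=bool); b[22:42, 18:38] = True

        volume_ratio = b.sum() / a.sum()
        dice = 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        # Symmetric Hausdorff distance between the two point sets (in pixels).
        pa, pb = np.argwhere(a), np.argwhere(b)
        hausdorff = max(directed_hausdorff(pa, pb)[0],
                        directed_hausdorff(pb, pa)[0])

        print(f"volume ratio {volume_ratio:.2f}, Dice {dice:.2f}, "
              f"Hausdorff {hausdorff:.1f} px")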

  4. Automated data acquisition technology development:Automated modeling and control development

    Science.gov (United States)

    Romine, Peter L.

    1995-01-01

    This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rack-mounted PCs. This research was initiated because the Metal Processing Branch of NASA Marshall Space Flight Center identified a need for a mobile data acquisition and data analysis system customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can be used to develop a RAIL function to control welding startup and shutdown without torch crashing.

  5. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

    Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and meaningful impact was obtained.

  6. Improving the software fault localization process through testability information

    NARCIS (Netherlands)

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such a way that their pass or fail information will narrow down the set of fault candidates.

  7. Using automated environmental management information systems to enable compliance: Ten questions to answer before selecting a software system

    Energy Technology Data Exchange (ETDEWEB)

    Gilbert, J.B.

    1999-07-01

    As technology invades the arena of environmental information management, hundreds of software packages have become available in the marketplace. How does the already overwhelmed environmental manager or IT professional decide what is right for the organization? Is there a software package that meets the needs of the organization, and is there a successful way to implement the system? Does this require abandoning existing systems with which users are comfortable? Can the system really save time and/or money? This paper discusses three topics: what drives the need for a system; ten questions to aid in selecting a system that is right for your organization; and the enabling technology and software systems available today, along with the future application of technology to environmental data management. Motivating factors for an EMIS include regulatory, business and IT drivers. Because of the ever-increasing regulatory burden, the need to demonstrate compliance is often the strongest driver, but business and IT drivers cannot be excluded from the discussion, especially with issues such as Enterprise Resource Planning and the Year 2000 impacting many systems projects. Before selecting a system, the organization should address, at a minimum, the following ten issues: (1) organization objectives; (2) organization readiness; (3) high-level processes to be automated; (4) integration and interfaces; (5) user community and needs; (6) technical requirements; (7) degree of customization; (8) project timing; (9) implementation resource needs; and (10) system justification. Today, there are hundreds of EH&S software packages available to help automate daily business processes. Only a few are multimedia packages, and all require significant implementation efforts. The EMIS market is still evolving, and software vendors continue to enhance product features and usability.

  8. Ankle-Foot Orthosis Made by 3D Printing Technique and Automated Design Software

    Directory of Open Access Journals (Sweden)

    Yong Ho Cha

    2017-01-01

    Full Text Available We describe a 3D printing technique and automated design software, together with clinical results after application of the resulting AFO to a patient with foot drop. After acquiring a 3D model of the lower leg of a patient with peroneal neuropathy using a 3D scanner, we loaded this file into the automated orthosis software and created the STL file. The designed AFO was printed on a fused filament fabrication 3D printer, and a mechanical stress test was performed. The patient alternated between the 3D-printed and conventional AFOs for 2 months. There was no crack or damage, and the shape and stiffness of the AFO did not change after the durability test. Gait speed increased when wearing the conventional AFO (56.5 cm/sec) and the 3D-printed AFO (56.5 cm/sec) compared to walking without an AFO (42.2 cm/sec). The patient was more satisfied with the 3D-printed AFO than with the conventional AFO in terms of weight and ease of use. The 3D-printed AFO exhibited functionality similar to the conventional AFO and satisfied the patient considerably in terms of weight and ease of use. We suggest the possibility of individualized AFOs made with 3D printing techniques and automated design software.

  9. Bottom-Up Technologies for Reuse: Automated Extractive Adoption of Software Product Lines

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Bissyandé, Tegawendé; Klein, Jacques; Le Traon, Yves

    2017-01-01

    Adopting Software Product Line (SPL) engineering principles demands a high up-front investment. Bottom-Up Technologies for Reuse (BUT4Reuse) is a generic and extensible tool aimed at leveraging existing similar software products in order to help in extractive SPL adoption. The envisioned users are 1) SPL adopters and 2) integrators of techniques and algorithms to provide automation in SPL adoption activities. We present the methodology it implies for both types of users ...

  10. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  11. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    The application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modification. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process differs significantly from a conventional process in its rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying the code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also gives a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)
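
    A flavour of the 'highly reviewable tabular format' can be given with a small decision table in Python; the trip conditions below are invented for illustration and are not taken from the Wolsung shutdown system:

        from itertools import product

        # (pressure_high, temperature_high) -> required output; invented rules.
        TRIP_TABLE = {
            (False, False): "RUN",
            (False, True):  "TRIP",
            (True,  False): "TRIP",
            (True,  True):  "TRIP",
        }

        def trip_decision(pressure_high, temperature_high):
            return TRIP_TABLE[(pressure_high, temperature_high)]

        # Completeness check: every input combination appears exactly once,
        # which is what makes the tabular form easy to review.
        assert set(TRIP_TABLE) == set(product((False, True), repeat=2))
        print(trip_decision(True, False))          # -> TRIP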

  12. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

    Full Text Available Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and meaningful impact was obtained.

  13. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  14. Data acquisition and processing software for linear PSD based neutron diffractometers

    International Nuclear Information System (INIS)

    Pande, S.S.; Borkar, S.P.; Ghodgaonkar, M.D.

    2003-01-01

    As part of the data acquisition system for various single- and multi-PSD diffractometers, software has been developed to acquire the data and support the requirements of diffraction experiments. The software consists of a front-end Windows 98 application on a PC and a transputer program on the MPSD card. The front-end application provides the entire user interface required for data acquisition, control, presentation and system setup. Data are acquired and the diffraction spectra are generated in the transputer program. All the required hardware control is also implemented in the transputer program. The two programs communicate using a device driver named VTRANSPD. The software plays a vital role in customizing and integrating the data acquisition system for various diffractometer setups. The experiments are also effectively automated in the software, which has helped in making the best use of available beam time. These and other features of the data acquisition and processing software are presented here. This software is being used along with the data acquisition system at a few single-PSD and multi-PSD diffractometers. (author)

  15. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree that (1) ensures it can be run without user intervention and (2) ensures that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library's release is described in detail.

  16. Implementation Of Carlson Survey Software2009 In Survey Works And Comparison With CDS Software

    Directory of Open Access Journals (Sweden)

    Mohamed Faraj EL Megrahi

    2017-02-01

    Full Text Available Automation is one of the most influential changes the surveying concept and profession have had to go through. It has taken effect along two major courses: hardware (the instrumentation used in data collection and presentation) and software (the applications used in data processing and manipulation). Automation is largely computer based and, like all such systems, is subject to frequent improvement; this is manifested in new kinds of instrumentation every few years, such as the total station, and in newer versions of software. Software with the potential to profoundly affect survey automation is Carlson Surveying Software. When coupled with a total station for data collection and processing, it is capable of greatly improving productivity while reducing the time and cost required in the long run. However, it is only natural for users to desire competent software and to be able to choose from what is available on the market, based on guided research and credible information from previous studies. Such studies not only help in the choice of software but are also handy for testing approaches and recommending improvements to the manufacturers, based on observed advantages and disadvantages, to help advance the software industry towards better and more comfortable use. The expected outcome of the research is a successful implementation of Carlson Survey 2009 software in survey works and a comparison with other existing software, such as Civil Design Software (CDS), highlighting its advantages and disadvantages.

  17. Software bots -The next frontier for shared services and functional excellence

    NARCIS (Netherlands)

    Suri, Vipin K.; Elia, Marianne; van Hillegersberg, Jos

    2017-01-01

    A Software Bot is a fundamental element of Robotic Process Automation (RPA). RPA can be deployed to automate repeatable, mundane, rules-based, workflow-driven process tasks across multiple functions in an organization, including Shared Services. While RPA holds high promise, using Software Bots for

  18. Software engineering processes principles and applications

    CERN Document Server

    Wang, Yingxu

    2000-01-01

    Contents: Fundamentals of the Software Engineering Process; Introduction; A Unified Framework of the Software Engineering Process; Process Algebra; Process-Based Software Engineering; Software Engineering Process System Modeling; The CMM Model; The ISO 9001 Model; The BOOTSTRAP Model; The ISO/IEC 15504 (SPICE) Model; The Software Engineering Process Reference Model: SEPRM; Software Engineering Process System Analysis; Benchmarking the SEPRM Processes; Comparative Analysis of Current Process Models; Transformation of Capability Levels Between Current Process Models; Software Engineering Process Establishment; Software Process Establish

  19. Time efficiency and diagnostic accuracy of new automated myocardial perfusion analysis software in 320-row CT cardiac imaging

    Energy Technology Data Exchange (ETDEWEB)

    Rief, Matthias; Stenzel, Fabian; Kranz, Anisha; Schlattmann, Peter; Dewey, Marc [Dept. of Radiology, Charité - Universitätsmedizin Berlin, Berlin (Germany)

    2013-01-15

    We aimed to evaluate the time efficiency and diagnostic accuracy of automated myocardial computed tomography perfusion (CTP) image analysis software. 320-row CTP was performed in 30 patients, and analyses were conducted independently by three blinded readers using two recent software releases (version 4.6 and the novel version 4.71GR001, Toshiba, Tokyo, Japan). Analysis times were compared, and automated epi- and endocardial contour detection was subjectively rated in five categories (excellent, good, fair, poor and very poor). As semi-quantitative perfusion parameters, myocardial attenuation and the transmural perfusion ratio (TPR) were calculated for each myocardial segment, and agreement was tested using the intraclass correlation coefficient (ICC). Conventional coronary angiography served as the reference standard. The analysis time was significantly reduced with the novel automated software version as compared with the former release (Reader 1: 43:08 ± 11:39 min vs. 09:47 ± 04:51 min, Reader 2: 42:07 ± 06:44 min vs. 09:42 ± 02:50 min, and Reader 3: 21:38 ± 3:44 min vs. 07:34 ± 02:12 min; p < 0.001 for all). Epi- and endocardial contour detection with the novel software was rated significantly better (p < 0.001) than with the former software. ICCs demonstrated strong agreement (≥ 0.75) for myocardial attenuation in 93% and for TPR in 82%. Diagnostic accuracy for the two software versions was not significantly different (p = 0.169) as compared with conventional coronary angiography. The novel automated CTP analysis software offers enhanced time efficiency, with an improvement by a factor of about four, while maintaining diagnostic accuracy.
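
    For illustration, the transmural perfusion ratio can be computed per segment as the ratio of subendocardial to subepicardial attenuation, a common definition assumed here; the HU values and the cut-off in the Python sketch below are invented:

        # Mean attenuation (HU) per myocardial layer; values are invented.
        segments = {
            "basal anterior": {"endo": 95.0, "epi": 102.0},
            "mid inferior":   {"endo": 61.0, "epi": 98.0},
        }

        for name, hu in segments.items():
            tpr = hu["endo"] / hu["epi"]           # transmural perfusion ratio
            flag = "suspect" if tpr < 0.85 else "normal"   # illustrative cut-off
            print(f"{name}: TPR = {tpr:.2f} ({flag})")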

  20. Automated control of the laser welding process of heart valve scaffolds

    Directory of Open Access Journals (Sweden)

    Weber Moritz

    2016-09-01

    Full Text Available With the electrospinning process, the geometry of a heart valve cannot be replicated in a single manufacturing step. To produce heart valve scaffolds, the heart valve leaflets and the vessel have to be produced in separate spinning processes and mated afterwards to form the final heart valve. In this work an existing three-axis laser was enhanced to laser-weld these scaffolds. The automation control software is based on the Robot Operating System (ROS). The mechatronic control is handled by an Arduino Mega, and a graphical user interface (GUI) is written in Python with Kivy.
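
    A minimal rospy node in the spirit of the ROS-based control described above might look as follows; the node and topic names and the power setpoint are illustrative assumptions, not the authors' actual interface to the Arduino-driven laser:

        #!/usr/bin/env python
        import rospy
        from std_msgs.msg import Float32

        def run():
            rospy.init_node("laser_weld_control")
            power = rospy.Publisher("laser/power", Float32, queue_size=10)
            rate = rospy.Rate(10)                  # 10 Hz command stream
            while not rospy.is_shutdown():
                power.publish(Float32(12.5))       # watts, a placeholder setpoint
                rate.sleep()

        if __name__ == "__main__":
            run()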

  1. SpcAudace: Spectroscopic processing and analysis package of Audela software

    Science.gov (United States)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines carry out all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed, for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions: from line profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in Tcl scripts for automation. SpcAudace is based on the Audela open source software.

  2. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    International Nuclear Information System (INIS)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M.; Rahmat, K.; Ariffin, H.

    2012-01-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  3. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)
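
    The two scalar maps quantified here follow directly from the diffusion-tensor eigenvalues; a minimal Python sketch (the eigenvalues below are merely of typical white-matter magnitude, chosen for illustration) is:

        import numpy as np

        def fa_md(eigenvalues):
            lam = np.asarray(eigenvalues, dtype=float)
            md = lam.mean()                        # mean diffusivity
            # Fractional anisotropy from the standard eigenvalue formula.
            fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
            return fa, md

        fa, md = fa_md([1.7e-3, 0.4e-3, 0.3e-3])   # mm^2/s
        print(f"FA = {fa:.2f}, MD = {md:.2e} mm^2/s")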

  4. SmartUnit: Empirical Evaluations for Automated Unit Testing of Embedded Software in Industry

    OpenAIRE

    Zhang, Chengyu; Yan, Yichen; Zhou, Hanru; Yao, Yinbo; Wu, Ke; Su, Ting; Miao, Weikai; Pu, Geguang

    2018-01-01

    In this paper, we aim at automated coverage-based unit testing for embedded software. To achieve this goal, by analyzing industrial requirements and our previous work on the automated unit testing tool CAUT, we rebuilt a new tool, SmartUnit, to meet the engineering requirements that arise in our partner companies. SmartUnit is a dynamic symbolic execution implementation, which supports statement, branch, boundary value and MC/DC coverage. SmartUnit has been used to test more than one...

  5. Automated ECU software tests on hardware-in-the-loop test benches; Automatisierte ECU-Software-Tests an Hardware-in-the-Loop-Pruefstaenden

    Energy Technology Data Exchange (ETDEWEB)

    Voegl, R. [AVL List GmbH, Graz (Austria). Abteilung Kalibrierung Ottomotoren; Duerager, Ch. [AVL List GmbH, Graz (Austria). Abteilung Kalibriermethodik; Beer, W.; Martini, E. [AVL List GmbH, Graz (Austria)

    2005-08-01

    Due to the continuous increase in the complexity of engine applications, AVL List decided to develop methods for automated ECU software commissioning on HiL test benches. This required intensive co-operation between the departments for calibration methodology, calibration, and electrical/electronic engineering. The result is a practice-oriented collection of methods that significantly increases the test coverage for a new software version without extending the commissioning time. (orig.)

  6. Sloan Digital Sky Survey photometric telescope automation and observing software

    International Nuclear Information System (INIS)

    Neilsen, Eric H., Jr. (neilsen@fnal.gov)

    2002-01-01

    The photometric telescope (PT) provides observations necessary for the photometric calibration of the Sloan Digital Sky Survey (SDSS). Because the attention of the observing staff is occupied by the operation of the 2.5 meter telescope which takes the survey data proper, the PT must reliably take data with little supervision. In this paper we describe the PT's observing program, MOP, which automates most tasks necessary for observing. MOP's automated target selection is closely modeled on the actions a human observer might take, and is built upon a user interface that can be (and has been) used for manual operation. This results in an interface that makes it easy for an observer to track the activities of the automating procedures and intervene with minimum disturbance when necessary. MOP selects targets from the same list of standard star and calibration fields presented to the user, and chooses standard star fields covering ranges of airmass, color, and time necessary to monitor atmospheric extinction and produce a photometric solution. The software determines when additional standard star fields are unnecessary, and selects survey calibration fields according to availability and priority. Other automated features of MOP, such as maintaining the focus and keeping a night log, are also built around still functional manual interfaces, allowing the observer to be as active in observing as desired; MOP's automated features may be used as tools for manual observing, ignored entirely, or allowed to run the telescope with minimal supervision when taking routine data
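
    As a heavily hedged sketch of the kind of choice MOP automates (the field data and the scoring rule below are invented; MOP's real selection logic is more involved), a greedy pick that spreads observations over airmass and colour could look like:

        # Invented standard-star fields; MOP's real selection logic differs.
        fields = [
            {"name": "field A", "airmass": 1.1, "color": 0.2},
            {"name": "field B", "airmass": 1.8, "color": 0.9},
            {"name": "field C", "airmass": 1.4, "color": 1.5},
        ]
        observed = []

        def gain(field):
            # Reward fields far in airmass/colour from those already observed.
            if not observed:
                return 1.0
            return min(abs(field["airmass"] - o["airmass"]) +
                       abs(field["color"] - o["color"]) for o in observed)

        for _ in range(len(fields)):
            best = max((f for f in fields if f not in observed), key=gain)
            observed.append(best)
        print([f["name"] for f in observed])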

  7. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules: Final Subcontract Report, April 1998 - April 2002

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.

    2002-11-01

    This report describes the automated systems developed for PV module assembly and testing processes after lamination. These processes are applicable to a broad range of module types, including those made with wafer-based and thin-film solar cells. Survey data and input from module manufacturers gathered during site visits were used to define system capabilities and process specifications. Spire completed mechanical, electrical, and software engineering for four automation systems: a module edge trimming system, the SPI-TRIM 350; an edge sealing and framing system, the SPI-FRAMER 350; an integrated module testing system, the SPI-MODULE QA 350; and a module buffer storage system, the SPI-BUFFER 350. A fifth system for junction-box installation, the SPI-BOXER 350, was nearly completed during the program. A new-size solar simulator, the SPI-SUN SIMULATOR 350i, was designed as part of the SPI-MODULE QA 350. This simulator occupies minimal production floor space, and its test area is large enough to handle most production modules. The automated systems developed in this program are designed for integration to create automated production lines.

  8. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  9. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline of automated tasks. Both physical tasks, such as manipulation, assembly and actuation, and cognitive tasks, such as visual inspection, monitoring and diagnosis, and task planning, are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed, and processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive, or standard, tasks have been developed both for manual crew processing and for automated machine processing.

  10. Automated analysis of slitless spectra. II. Quasars

    International Nuclear Information System (INIS)

    Edwards, G.; Beauchemin, M.; Borra, F.

    1988-01-01

    Automated software has been developed to process slitless spectra. The software, described in a previous paper, automatically separates stars from extended objects and quasars from stars. This paper describes the quasar search techniques and discusses the results. The performance of the software is compared and calibrated with a plate taken in a region of SA 57 that has been extensively surveyed by others using a variety of techniques: the proposed automated software performs very well. It is found that an eye search of the same plate is less complete than the automated search: surveys that rely on eye searches suffer from incompleteness starting at least a magnitude brighter than the plate limit. It is shown how the complete automated analysis of a plate and computer simulations are used to calibrate and understand the characteristics of the present data. 20 references

  11. ActionMap: A web-based software that automates loci assignments to framework maps.

    Science.gov (United States)

    Albini, Guillaume; Falque, Matthieu; Joets, Johann

    2003-07-01

    Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).

  12. Instrument Control (iC) - An Open-Source Software to Automate Test Equipment.

    Science.gov (United States)

    Pernstich, K P

    2012-01-01

    It has become common practice to automate data acquisition from programmable instrumentation, and a range of different software solutions fulfill this task. Many routine measurements require sequential processing of certain tasks, for instance to adjust the temperature of a sample stage, take a measurement, and repeat that cycle for other temperatures. This paper introduces an open-source Java program that processes a series of text-based commands that define the measurement sequence. These commands are in an intuitive format which provides great flexibility and allows quick and easy adaptation to various measurement needs. For each of these commands, the iC-framework calls a corresponding Java method that addresses the specified instrument to perform the desired task. The functionality of iC can be extended with minimal programming effort in Java or Python, and new measurement equipment can be addressed by defining new commands in a text file without any programming.
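
    The sequential, text-based command processing described above can be sketched in a few lines of Python; the command names and the device stub are illustrative stand-ins, not iC's actual command set or API:

        class TemperatureStage:
            # Stand-in device driver used for illustration only.
            def set_temp(self, kelvin):
                print(f"stage -> {kelvin} K")
            def measure(self):
                print("acquiring data point")

        def run_script(script, device):
            # Each script line maps to one method call on the device.
            handlers = {"setTemp": lambda arg: device.set_temp(float(arg)),
                        "measure": lambda arg: device.measure()}
            for line in script.strip().splitlines():
                name, _, arg = line.partition(" ")
                handlers[name](arg)

        run_script("setTemp 4.2\nmeasure\nsetTemp 1.5\nmeasure",
                   TemperatureStage())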

  13. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The commitment to prescriptive approaches in software process improvement theory over the last one or two decades may contribute to the emergence of a gulf dividing theorists and practitioners. It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice, with a focus on the software process policymaking and process control aspects of improvement efforts.

  14. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline.

    Science.gov (United States)

    Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H

    2012-07-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. • Diffusion tensor imaging outperforms conventional MRI in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high-throughput DTI data.

  15. Implementing Kanban for agile process management within the ALMA Software Operations Group

    Science.gov (United States)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, each of these tasks presents a wide diversity of importance levels, lifespans and complexities. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology into our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, and the solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.

  16. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    Directory of Open Access Journals (Sweden)

    Pontarotti Pierre

    2005-08-01

Background: Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting the position and structure of genes and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous different software tools, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still require an important contribution from biologists for supervising and controlling the results at various steps. Results: Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which can, for example, make key decisions, check intermediate results or refine the dataset). The quality of the results produced by FIGENIX is comparable to that obtained by expert biologists, with a drastic gain in time costs and avoidance of errors due to human manipulation of data. Conclusion: The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could easily be adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, annotation of regulatory elements and other genomic features of interest.

  17. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Ebersberger, Ullrich [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Marcus, Roy P.; Nikolaou, Konstantin; Bamberg, Fabian [University of Munich, Institute of Clinical Radiology, Munich (Germany); Schoepf, U.J.; Gray, J.C.; McQuiston, Andrew D. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Lo, Gladys G. [Hong Kong Sanatorium and Hospital, Department of Diagnostic and Interventional Radiology, Hong Kong (China); Wang, Yining [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Department of Radiology, Beijing (China); Blanke, Philipp [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University Hospital Freiburg, Department of Diagnostic Radiology, Freiburg (Germany); Geyer, Lucas L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University of Munich, Institute of Clinical Radiology, Munich (Germany); Cho, Young Jun [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Konyang University College of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Scheuering, Michael; Canstein, Christian [Siemens Healthcare, CT Division, Forchheim (Germany); Hoffmann, Ellen [Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany)

    2014-01-15

To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min respectively (P < 0.01). There was strong agreement between the two approaches for the measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01), and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. (orig.)

  18. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Cap...

  19. SERPent: Automated reduction and RFI-mitigation software for e-MERLIN

    Science.gov (United States)

    Peck, Luke W.; Fenech, Danielle M.

    2013-08-01

The Scripted E-merlin Rfi-mitigation PipelinE for iNTerferometry (SERPent) is an automated reduction and RFI-mitigation procedure utilising the SumThreshold methodology (Offringa et al., 2010a), originally developed for the LOFAR pipeline. SERPent is written in the Parseltongue language, enabling interaction with the Astronomical Image Processing Software (AIPS) program. Moreover, SERPent is a simple 'out of the box' Python script, which is easy to set up and is free of compilers. In addition to the flagging of RFI-affected visibilities, the script also flags antenna zero-amplitude dropouts and Lovell telescope phase-calibrator stationary scans inherent to the e-MERLIN system. Both the flagging and computational performance of SERPent are presented here for e-MERLIN commissioning datasets for both L-band (1.3-1.8 GHz) and C-band (4-8 GHz) observations. The excision of RFI is also relevant to future interferometers such as the SKA and the associated pathfinders (MeerKAT and ASKAP), where the vast data sizes (>TB) make traditional astronomer interactions unfeasible.

  20. Software complex for developing dynamically packed program system for experiment automation

    International Nuclear Information System (INIS)

    Baluka, G.; Salamatin, I.M.

    1985-01-01

A software complex for developing dynamically packed program systems for experiment automation is considered. The complex includes a general-purpose programming system, represented by the standard RT-11 operating system, and specially developed problem-oriented modules providing execution of certain jobs. The described complex is realized in the PASCAL and MACRO-2 languages and is rather flexible with respect to variations in the experimental technique.

  1. Rapid prototyping of an automated video surveillance system: a hardware-software co-design approach

    Science.gov (United States)

    Ngo, Hau T.; Rakvic, Ryan N.; Broussard, Randy P.; Ives, Robert W.

    2011-06-01

FPGA devices with embedded DSP and memory blocks and high-speed interfaces are ideal for real-time video processing applications. In this work, a hardware-software co-design approach is proposed to effectively utilize FPGA features for a prototype of an automated video surveillance system. Time-critical steps of the video surveillance algorithm are designed and implemented in the FPGA's logic elements to maximize parallel processing. Other non-time-critical tasks are achieved by executing a high-level language program on an embedded Nios-II processor. Pre-tested and verified video and interface functions from a standard video framework are utilized to significantly reduce development and verification time. Custom and parallel processing modules are integrated into the video processing chain by Altera's Avalon Streaming video protocol. Other data control interfaces are achieved by connecting hardware controllers to a Nios-II processor using Altera's Avalon Memory Mapped protocol.

  2. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, a software to automate the post-counting tasks in comparative INAA has been developed that aims to become more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully-automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later on if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
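
    Two of the four statistical tools mentioned, the weighted average and the normalized residuals, reduce to standard formulas. A sketch with illustrative values (not the software's actual code):

        import numpy as np

        def weighted_mean(values, sigmas):
            """Inverse-variance weighted mean and its standard uncertainty."""
            w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
            mean = np.sum(w * np.asarray(values)) / np.sum(w)
            return mean, 1.0 / np.sqrt(np.sum(w))

        def normalized_residuals(values, sigmas, mean):
            """Each result's deviation in units of its own uncertainty;
            values beyond about 2-3 flag discrepant measurements."""
            return (np.asarray(values) - mean) / np.asarray(sigmas)

        conc = [12.1, 11.8, 13.0]   # concentration from three runs (mg/kg)
        sig = [0.4, 0.5, 0.9]       # associated 1-sigma uncertainties
        m, u = weighted_mean(conc, sig)
        print(f"mean = {m:.2f} +/- {u:.2f} mg/kg")
        print("normalized residuals:", normalized_residuals(conc, sig, m).round(2))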

  3. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

This article describes the design and implementation of a system that performs queries for various parameters against a web server using a GSM modem, exchanging information over the Internet, together with radio-frequency identification (RFID). The application is validated for use in automating the process of generating student insurance, and the hardware and software, developed by the Research Group in Robotics and Industrial Automation (GIRA) of UPTC, are used as a platform.

  4. Aptaligner: automated software for aligning pseudorandom DNA X-aptamers from next-generation sequencing data.

    Science.gov (United States)

    Lu, Emily; Elizondo-Riojas, Miguel-Angel; Chang, Jeffrey T; Volk, David E

    2014-06-10

    Next-generation sequencing results from bead-based aptamer libraries have demonstrated that traditional DNA/RNA alignment software is insufficient. This is particularly true for X-aptamers containing specialty bases (W, X, Y, Z, ...) that are identified by special encoding. Thus, we sought an automated program that uses the inherent design scheme of bead-based X-aptamers to create a hypothetical reference library and Markov modeling techniques to provide improved alignments. Aptaligner provides this feature as well as length error and noise level cutoff features, is parallelized to run on multiple central processing units (cores), and sorts sequences from a single chip into projects and subprojects.

  5. Managing Software Process Evolution

    DEFF Research Database (Denmark)

This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  6. The accuracy of a designed software for automated localization of craniofacial landmarks on CBCT images

    International Nuclear Information System (INIS)

    Shahidi, Shoaleh; Bahrampour, Ehsan; Soltanimehr, Elham; Zamani, Ali; Oshagh, Morteza; Moattari, Marzieh; Mehdizadeh, Alireza

    2014-01-01

    Two-dimensional projection radiographs have been traditionally considered the modality of choice for cephalometric analysis. To overcome the shortcomings of two-dimensional images, three-dimensional computed tomography (CT) has been used to evaluate craniofacial structures. However, manual landmark detection depends on medical expertise, and the process is time-consuming. The present study was designed to produce software capable of automated localization of craniofacial landmarks on cone beam (CB) CT images based on image registration and to evaluate its accuracy. The software was designed using MATLAB programming language. The technique was a combination of feature-based (principal axes registration) and voxel similarity-based methods for image registration. A total of 8 CBCT images were selected as our reference images for creating a head atlas. Then, 20 CBCT images were randomly selected as the test images for evaluating the method. Three experts twice located 14 landmarks in all 28 CBCT images during two examinations set 6 weeks apart. The differences in the distances of coordinates of each landmark on each image between manual and automated detection methods were calculated and reported as mean errors. The combined intraclass correlation coefficient for intraobserver reliability was 0.89 and for interobserver reliability 0.87 (95% confidence interval, 0.82 to 0.93). The mean errors of all 14 landmarks were <4 mm. Additionally, 63.57% of landmarks had a mean error of <3 mm compared with manual detection (gold standard method). The accuracy of our approach for automated localization of craniofacial landmarks, which was based on combining feature-based and voxel similarity-based methods for image registration, was acceptable. Nevertheless we recommend repetition of this study using other techniques, such as intensity-based methods
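
    The feature-based step, principal axes registration, amounts to matching the centroids and covariance eigenvectors of the two images. A simplified sketch on 3-D point sets (the demo uses a pure translation, since a robust implementation must also resolve the sign ambiguity of the eigenvectors):

        import numpy as np

        def principal_axes(points):
            """Centroid and principal axes (eigenvectors of the covariance)."""
            c = points.mean(axis=0)
            _, axes = np.linalg.eigh(np.cov((points - c).T))
            return c, axes

        def register(moving, fixed):
            """Map 'moving' onto 'fixed' by aligning centroids and axes."""
            cm, vm = principal_axes(moving)
            cf, vf = principal_axes(fixed)
            R = vf @ vm.T                    # rotate moving axes onto fixed axes
            return (moving - cm) @ R.T + cf

        rng = np.random.default_rng(0)
        fixed = rng.normal(size=(500, 3)) * [3.0, 1.5, 0.5]   # anisotropic cloud
        moving = fixed + [10.0, -5.0, 2.0]                    # translated copy
        print("max residual:", np.abs(register(moving, fixed) - fixed).max())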

  7. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  8. Recent development in software and automation tools for high-throughput discovery bioanalysis.

    Science.gov (United States)

    Shou, Wilson Z; Zhang, Jun

    2012-05-01

    Bioanalysis with LC-MS/MS has been established as the method of choice for quantitative determination of drug candidates in biological matrices in drug discovery and development. The LC-MS/MS bioanalytical support for drug discovery, especially for early discovery, often requires high-throughput (HT) analysis of large numbers of samples (hundreds to thousands per day) generated from many structurally diverse compounds (tens to hundreds per day) with a very quick turnaround time, in order to provide important activity and liability data to move discovery projects forward. Another important consideration for discovery bioanalysis is its fit-for-purpose quality requirement depending on the particular experiments being conducted at this stage, and it is usually not as stringent as those required in bioanalysis supporting drug development. These aforementioned attributes of HT discovery bioanalysis made it an ideal candidate for using software and automation tools to eliminate manual steps, remove bottlenecks, improve efficiency and reduce turnaround time while maintaining adequate quality. In this article we will review various recent developments that facilitate automation of individual bioanalytical procedures, such as sample preparation, MS/MS method development, sample analysis and data review, as well as fully integrated software tools that manage the entire bioanalytical workflow in HT discovery bioanalysis. In addition, software tools supporting the emerging high-resolution accurate MS bioanalytical approach are also discussed.

  9. EpHLA: an innovative and user-friendly software automating the HLAMatchmaker algorithm for antibody analysis.

    Science.gov (United States)

    Sousa, Luiz Cláudio Demes da Mata; Filho, Herton Luiz Alves Sales; Von Glehn, Cristina de Queiroz Carrascosa; da Silva, Adalberto Socorro; Neto, Pedro de Alcântara dos Santos; de Castro, José Adail Fonseca; do Monte, Semíramis Jamil Hadad

    2011-12-01

    The global challenge for solid organ transplantation programs is to distribute organs to the highly sensitized recipients. The purpose of this work is to describe and test the functionality of the EpHLA software, a program that automates the analysis of acceptable and unacceptable HLA epitopes on the basis of the HLAMatchmaker algorithm. HLAMatchmaker considers small configurations of polymorphic residues referred to as eplets as essential components of HLA-epitopes. Currently, the analyses require the creation of temporary files and the manual cut and paste of laboratory tests results between electronic spreadsheets, which is time-consuming and prone to administrative errors. The EpHLA software was developed in Object Pascal programming language and uses the HLAMatchmaker algorithm to generate histocompatibility reports. The automated generation of reports requires the integration of files containing the results of laboratory tests (HLA typing, anti-HLA antibody signature) and public data banks (NMDP, IMGT). The integration and the access to this data were accomplished by means of the framework called eDAFramework. The eDAFramework was developed in Object Pascal and PHP and it provides data access functionalities for software developed in these languages. The tool functionality was successfully tested in comparison to actual, manually derived reports of patients from a renal transplantation program with related donors. We successfully developed software, which enables the automated definition of the epitope specificities of HLA antibodies. This new tool will benefit the management of recipient/donor pairs selection for highly sensitized patients. Copyright © 2011 Elsevier B.V. All rights reserved.
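
    The HLAMatchmaker logic that the software automates is essentially set algebra over eplets. A toy sketch with invented eplet strings (real analyses use the verified HLAMatchmaker eplet tables):

        # Toy eplet-compatibility logic; the allele-to-eplet map is invented.
        ALLELE_EPLETS = {
            "A*01:01": {"44KM", "62QE", "150AH"},
            "A*02:01": {"62GE", "107W", "144TKH"},
            "B*07:02": {"65QIA", "69AA", "163EW"},
        }

        def mismatched_eplets(donor, recipient):
            """Eplets carried by donor alleles but absent from the recipient."""
            d = set().union(*(ALLELE_EPLETS[a] for a in donor))
            r = set().union(*(ALLELE_EPLETS[a] for a in recipient))
            return d - r

        def classify(donor, recipient, antibody_reactive):
            """Unacceptable = mismatched eplet also bound by the patient's antibodies."""
            mm = mismatched_eplets(donor, recipient)
            return {"unacceptable": mm & antibody_reactive,
                    "acceptable": mm - antibody_reactive}

        print(classify(["A*02:01", "B*07:02"], ["A*01:01"], {"107W"}))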

  10. Benefit of modelling regarding the quality and efficiency of PLC-programming in process automation; Nutzen von Modellierung fuer die Qualitaet und Effizienz der Steuerungsprogrammierung in der Automatisierungstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, D; Vogel-Heuser, B [Bergische Univ. Wuppertal (Germany). Lehrstuhl fuer Automatisierungstechnik/Prozessinformatik

    2006-03-15

Software development in process automation exhibits many deficiencies in procedures, notations and tool support. As a result, modern software engineering concepts and notations, such as object-oriented approaches or UML, are not widespread in this field, and the resulting penalties in terms of start-up times, additional costs and low software quality are immense. This paper evaluates the benefit of modelling as a design step prior to coding, with regard to cognitive paradigms. Two modelling notations (UML and ICL) are compared by analyzing their impact on the quality of automation software written in IEC 61131. (orig.)

  11. Automated software development tools in the MIS (Management Information Systems) environment

    Energy Technology Data Exchange (ETDEWEB)

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  12. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that the mixing process itself will be automated. This is a concept-demonstration task, proving that propellant production can be automated reliably.

  13. Automated software system for checking the structure and format of ACM SIG documents

    Science.gov (United States)

    Mirza, Arsalan Rahman; Sah, Melike

    2017-04-01

Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents are automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents in OWL (Web Ontology Language). The metadata is then extracted automatically into RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether the documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, system evaluation and user study evaluations.
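
    The extraction step relies on OOXML being ordinary XML inside a zip archive. A minimal sketch of reading paragraph style names from a .docx follows; the file name and the format rule are invented, and ADFCS itself reasons through an OWL ontology and RDF rather than this direct approach:

        import zipfile
        import xml.etree.ElementTree as ET

        W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

        def paragraph_styles(docx_path):
            """Yield the style name of every paragraph in a .docx document."""
            with zipfile.ZipFile(docx_path) as z:
                root = ET.fromstring(z.read("word/document.xml"))
            for p in root.iter(f"{W}p"):
                style = p.find(f"{W}pPr/{W}pStyle")
                yield style.get(f"{W}val") if style is not None else "Normal"

        # Toy format rule (invented, not an actual ACM SIG requirement):
        styles = list(paragraph_styles("paper.docx"))   # placeholder file name
        print("first paragraph uses Title style:",
              bool(styles) and styles[0] == "Title")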

  14. Development of an automation software for reconciliation of INIS/ETDE thesauruses

    International Nuclear Information System (INIS)

    Singh, Manoj; Gupta, Rajiv; Prakasan, E.R.; Vijai Kumar

    1999-01-01

ETDE (Energy Technology Data Exchange) and INIS (International Nuclear Information System) thesauruses contain nearly twenty thousand descriptors and are not necessarily identical. A project has been undertaken by the international organisations to create a common thesaurus for both INIS and ETDE, to facilitate better exchange and retrieval of information between and from these databases. This paper describes the automation implemented during our participation in the project to reconcile the structures of the word blocks in the ETDE and INIS thesauruses, with respect to the descriptors currently in the two thesauruses, through PC-based RDBMS software. The software, THEMERGE, was developed in the FoxPro 2.5 relational database management system. It handles all possible reconciliation recommendations suggested by specialists, printing the recommendation sheets for later uploading. This has not only widened the scope of flexibility, portability and convertibility of recommendations, but also helped to achieve quicker project completion. (author)

  15. Automation in a material processing/storage facility

    International Nuclear Information System (INIS)

    Peterson, K.; Gordon, J.

    1997-01-01

    The Savannah River Site (SRS) is currently developing a new facility, the Actinide Packaging and Storage Facility (APSF), to process and store legacy materials from the United States nuclear stockpile. A variety of materials, with a variety of properties, packaging and handling/storage requirements, will be processed and stored at the facility. Since these materials are hazardous and radioactive, automation will be used to minimize worker exposure. Other benefits derived from automation of the facility include increased throughput capacity and enhanced security. The diversity of materials and packaging geometries to be handled poses challenges to the automation of facility processes. In addition, the nature of the materials to be processed underscores the need for safety, reliability and serviceability. The application of automation in this facility must, therefore, be accomplished in a rational and disciplined manner to satisfy the strict operational requirements of the facility. Among the functions to be automated are the transport of containers between process and storage areas via an Automatic Guided Vehicle (AGV), and various processes in the Shipping Package Unpackaging (SPU) area, the Accountability Measurements (AM) area, the Special Isotope Storage (SIS) vault and the Special Nuclear Materials (SNM) vault. Other areas of the facility are also being automated, but are outside the scope of this paper

  16. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorcing of SPM from software engineering that can undermine any successful software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional method when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coupling...

  17. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-01-01

The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete. A standard set of commands and events has been established to ready the SLMs for transport operations; the Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and the required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  18. Laboratory automation: trajectory, technology, and tactics.

    Science.gov (United States)

    Markin, R S; Whalen, S A

    2000-05-01

Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a...

  19. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    Science.gov (United States)

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
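
    The quantification stage of an ASL pipeline rests on a kinetic model. A sketch of the widely used single-compartment (p)CASL formula with typical 3-T parameter values follows; this is a generic model, not necessarily ASAP's exact implementation:

        import numpy as np

        def casl_cbf(dM, M0, alpha=0.85, T1b=1.65, tau=1.8, PLD=1.8, lam=0.9):
            """Cerebral blood flow (ml/100g/min) from the single-compartment
            pCASL model: control-label difference dM, proton-density image M0,
            labeling efficiency alpha, blood T1 (s), label duration tau (s),
            post-labeling delay PLD (s), blood-brain partition coefficient lam."""
            num = 6000.0 * lam * dM * np.exp(PLD / T1b)
            den = 2.0 * alpha * T1b * M0 * (1.0 - np.exp(-tau / T1b))
            return num / den

        # Grey-matter-like example: a 1% control-label signal difference
        print(f"{casl_cbf(dM=0.01, M0=1.0):.1f} ml/100g/min")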

  20. Software Development Standard Processes (SDSP)

    Science.gov (United States)

Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL s Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL s software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

1. Learning Hydraulic and Pneumatic Systems Using Automation Studio; Pembelajaran Sistem Hidrolik dan Pneumatik dengan Menggunakan Automation Studio

    Directory of Open Access Journals (Sweden)

    Adi Dewanto

    2015-02-01

Students find it difficult to master hydraulic and pneumatic systems due to a lack of imagination regarding component movement, which affects their learning of hydraulic and pneumatic system applications. To solve this problem, the lecturer of the Mechatronics course used the Automation Studio application. This software helps to design various automations, such as combinations of hydraulic systems, pneumatic systems, electric systems, and PLCs. The lecturing process and design simulation were conducted using Automation Studio. In general, the students were much helped by this program in mastering the theory and practice of hydraulics and pneumatics. On the other hand, some problems were found in applying Automation Studio in the classroom; these concerned the limited menu options as well as technical aspects related to the number of computers. The implications of the writers' experience in using Automation Studio are that there is an opportunity for computer programmers to create learning media/software for specific competences that is relevant, accessible and applicable, and that software preparation should be carried out by the lecturers and the students before the learning process. Keywords: automation studio program, learning process, pneumatic and hydraulic learning

  2. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    OpenAIRE

    Alexandra Maria Ioana FLOREA

    2013-01-01

In order to maintain a competitive edge in a very active banking market, a company requires the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales and to generate new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for...

  3. Software Process Improvement: Blueprints versus Recipes

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2003-01-01

Viewing software processes as blueprints emphasizes that design is separate from use, and thus that software process designers and users are independent. In the approach presented here, software processes are viewed as recipes; developers individually and collectively design their own software processes through facilitation, reflection, and improvisation. Publication date: SEP-OCT

  4. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  5. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski

    2016-09-01

This paper presents a description of an agile testing process in a medium-size software project developed using Scrum. The research method used was a case study; surveys, quantifiable project data sources and the qualitative opinions of project members were used for data collection. Challenges related to the testing process regarding a complex project environment and unscheduled releases were identified. Based on the obtained results, we concluded that the described approach addresses these issues well. Recommendations were therefore made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation and test-driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  6. Automated software for hydraulic simulation of pipeline operation

    Directory of Open Access Journals (Sweden)

    Hurgin Roman

    2018-01-01

The design of modern water supply systems for large cities, as well as their management via renovation of hydraulic models, poses time-consuming tasks to researchers, and coping with them requires specific approaches. When tackling these tasks, water services companies come across a great deal of information about various objects of the water infrastructure, the majority of which are located underground. In these cases, modern computer-aided design (CAD) systems containing various components come to help. These systems help to solve a wide array of problems using existing information on pipelines and the analysis and optimization of their basic parameters. CAD software is becoming an integral part of water supply system management in large cities, and its capabilities allow engineering and operating companies not only to collect all the necessary data concerning water supply systems in any given city, but also to conduct research aimed at improving various parameters of these systems, including optimization of their hydraulic properties, which directly determine the quality of water. This paper contains an analysis of automated CAD software for the hydraulic design and management of city water supply systems, intended to provide safe and efficient operation of these systems. The authors select the most suitable software for providing hydraulic compatibility of old and new sections of water supply ring mains after selective or continuous draw-in renovation, and for decreasing the diameter of distribution networks against the background of decreasing water consumption in cities.
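
    Hydraulic simulation of this kind builds on standard head-loss relations, for instance the Hazen-Williams formula. A sketch in SI units with illustrative pipe data (not taken from the paper):

        def hazen_williams_headloss(L, Q, C, D):
            """Head loss (m) in a pipe: length L (m), flow Q (m^3/s),
            Hazen-Williams roughness coefficient C, inner diameter D (m)."""
            return 10.67 * L * Q**1.852 / (C**1.852 * D**4.87)

        # 500 m of 300 mm cast-iron main (C ~ 100) carrying 80 l/s
        print(f"{hazen_williams_headloss(500, 0.080, 100, 0.300):.2f} m")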

  7. Entropy based software processes improvement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Kriek, D.; Siemons, P.

    2009-01-01

    Actual results of software process improvement projects show different levels of success. Although many software development organisations have adopted improvement models such as CMMI, it appears to be difficult to improve software development processes in the right way, e.g. tuned to the actual

  8. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

A strategy for assuring the quality of electronics is considered of primary importance. To provide quality, the sequence of processes is considered and modeled as a Markov chain. The improvement is distinguished by simple database means of design for manufacturing, allowing future step-by-step development. Phased automation of the design and digital manufacturing of electronics is proposed. MATLAB modelling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the sequence of processes, from individual processes up to the whole life cycle.
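
    Modelling the process sequence as a Markov chain makes quantities such as the expected number of steps to completion computable from the fundamental matrix. A sketch with hypothetical transition probabilities:

        import numpy as np

        # States: 0=design, 1=verification, 2=manufacture, 3=done (absorbing).
        # Hypothetical transition probabilities; rework loops send work back.
        P = np.array([
            [0.10, 0.90, 0.00, 0.00],   # design may be redone, else to verification
            [0.20, 0.00, 0.80, 0.00],   # verification may reject back to design
            [0.00, 0.05, 0.05, 0.90],   # manufacture may trigger re-verification
            [0.00, 0.00, 0.00, 1.00],
        ])

        # Expected steps to absorption: t = (I - Q)^-1 * 1, Q = transient block.
        Q = P[:3, :3]
        t = np.linalg.solve(np.eye(3) - Q, np.ones(3))
        print(f"expected steps from 'design' to completion: {t[0]:.2f}")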

  9. Application of knowledge based software to industrial automation in Japan

    International Nuclear Information System (INIS)

    Matsumoto, Yoshihiro

    1985-01-01

In Japan, large industrial undertakings such as electric utilities or steel works are taking their first steps towards knowledge engineering, testing the applicability of knowledge-based software to industrial automation. The goal is to achieve more intelligent, computer-aided assistance for personnel and thus to enhance safety, reliability, and maintenance efficiency in large industrial plants. The article presents various examples showing the advantages and drawbacks of such systems, and potential applications, among others, in nuclear or fossil-fuelled power plants and in electricity supply control systems. (orig./HP) [de]

  10. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and category complexity, and the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes category domains which are still under active development. Therefore they require different treatment, also in terms of improvement of the development cycle, system testing and user support. This paper describes some of the software processes as they are applied in GEANT4 for development, testing and maintenance of the software.

  11. JTst - An Automated Unit Testing Tool for Java Program

    OpenAIRE

    Kamal Z.  Zamli; Nor A. M.  Isa

    2008-01-01

Software testing is an integral part of the software development lifecycle. Lack of testing can often lead to disastrous consequences including loss of data, fortunes, and even lives. Despite its importance, current software testing practice lacks automation, and is still primarily based on highly manual processes from the generation of test cases up to the actual execution of the test. Although the emergence of helpful automated testing tools in the market is blooming, their adoptions are lackin...

  12. A system for automated quantification of cutaneous electrogastrograms

    DEFF Research Database (Denmark)

    Paskaranandavadivel, Niranchan; Bull, Simon Henry; Parsell, Doug

    2015-01-01

Clinical evaluation of cutaneous electrogastrograms (EGG) is important for understanding the role of slow waves in functional motility disorders and may be a useful diagnostic aid. An automated software package has been developed which computes metrics of interest from EGG and from slow wave... recordings; estimates of frequency and amplitude were compared to automated estimates. The methods were packaged into a software executable which processes the data and presents the results in intuitive graphical and spreadsheet formats. Automated EGG analysis allows for clinical translation of bio-electrical analysis for potential...

  13. Instrument Control (iC) – An Open-Source Software to Automate Test Equipment

    Science.gov (United States)

    Pernstich, K. P.

    2012-01-01

    It has become common practice to automate data acquisition from programmable instrumentation, and a range of different software solutions fulfill this task. Many routine measurements require sequential processing of certain tasks, for instance to adjust the temperature of a sample stage, take a measurement, and repeat that cycle for other temperatures. This paper introduces an open-source Java program that processes a series of text-based commands that define the measurement sequence. These commands are in an intuitive format which provides great flexibility and allows quick and easy adaptation to various measurement needs. For each of these commands, the iC-framework calls a corresponding Java method that addresses the specified instrument to perform the desired task. The functionality of iC can be extended with minimal programming effort in Java or Python, and new measurement equipment can be addressed by defining new commands in a text file without any programming. PMID:26900522

  14. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    Science.gov (United States)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

In this article the process of verification (calibration) of the secondary equipment of oil metering units is considered. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a software and hardware system that provides automated verification and calibration. The hardware part of this complex carries out the commutation of the measuring channels of the verified controller and the reference channels of the calibrator in accordance with the introduced algorithm. The developed software allows controlling the commutation of channels, setting values on the calibrator, reading the measured data from the controller, calculating errors and compiling protocols. This system can be used for checking the controllers of the secondary equipment of oil metering units in the automatic verification mode (with an open communication protocol) or in the semi-automatic verification mode (without it). The distinctive feature of the approach used is the development of a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of controllers of metering units' secondary equipment. The use of automatic verification with the help of the hardware and software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
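
    The error calculation in such verification is typically a reduced error: the deviation expressed as a percentage of the channel span, compared against an accuracy class. A sketch with invented check points and limits:

        def reduced_error(reference, measured, span_lo, span_hi):
            """Error as a percentage of the channel span, a common acceptance
            criterion when verifying measurement channels."""
            return 100.0 * (measured - reference) / (span_hi - span_lo)

        accuracy_class = 0.25   # assumed permissible error, % of span
        points = [(0.0, 0.03), (50.0, 50.08), (100.0, 99.90)]  # (set, read)
        for ref, meas in points:
            err = reduced_error(ref, meas, 0.0, 100.0)
            verdict = "PASS" if abs(err) <= accuracy_class else "FAIL"
            print(f"{ref:6.1f} -> {meas:6.2f}  err={err:+.3f}%  {verdict}")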

  15. Will the future of knowledge work automation transform personalized medicine?

    OpenAIRE

    Gauri Naik; Sanika S. Bhide

    2014-01-01

Today, we live in a world of "information overload" which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate "routine cognitive tasks" for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments and have the ability to learn "on the fly" are a significant step towards the automation of knowledge work. The applications of this t...

  16. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University...

  17. Application of Novel Software Algorithms to Spectral-Domain Optical Coherence Tomography for Automated Detection of Diabetic Retinopathy.

    Science.gov (United States)

    Adhi, Mehreen; Semy, Salim K; Stein, David W; Potter, Daniel M; Kuklinski, Walter S; Sleeper, Harry A; Duker, Jay S; Waheed, Nadia K

    2016-05-01

    To present novel software algorithms applied to spectral-domain optical coherence tomography (SD-OCT) for automated detection of diabetic retinopathy (DR). Thirty-one diabetic patients (44 eyes) and 18 healthy, nondiabetic controls (20 eyes) who underwent volumetric SD-OCT imaging and fundus photography were retrospectively identified. A retina specialist independently graded DR stage. Trained automated software generated a retinal thickness score signifying macular edema and a cluster score signifying microaneurysms and/or hard exudates for each volumetric SD-OCT. Of 44 diabetic eyes, 38 had DR and six eyes did not have DR. Leave-one-out cross-validation using a linear discriminant at missed detection/false alarm ratio of 3.00 computed software sensitivity and specificity of 92% and 69%, respectively, for DR detection when compared to clinical assessment. Novel software algorithms applied to commercially available SD-OCT can successfully detect DR and may have potential as a viable screening tool for DR in future. [Ophthalmic Surg Lasers Imaging Retina. 2016;47:410-417.]. Copyright 2016, SLACK Incorporated.
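
    The validation scheme described, a linear discriminant on the two scores assessed by leave-one-out cross-validation, can be sketched on synthetic data as follows (the score distributions are invented and only the class sizes mimic the study; scikit-learn is assumed):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(1)
        # Synthetic (thickness score, cluster score) pairs: 38 DR vs 26 non-DR eyes
        X = np.vstack([rng.normal([1.5, 1.0], 0.8, size=(38, 2)),
                       rng.normal([0.0, 0.0], 0.8, size=(26, 2))])
        y = np.array([1] * 38 + [0] * 26)

        pred = np.empty_like(y)
        for train, test in LeaveOneOut().split(X):
            lda = LinearDiscriminantAnalysis().fit(X[train], y[train])
            pred[test] = lda.predict(X[test])

        sens = np.mean(pred[y == 1] == 1)   # fraction of DR eyes detected
        spec = np.mean(pred[y == 0] == 0)   # fraction of non-DR eyes cleared
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")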

  18. Automated pre-processing and multivariate vibrational spectra analysis software for rapid results in clinical settings

    Science.gov (United States)

    Bhattacharjee, T.; Kumar, P.; Fillipe, L.

    2018-02-01

Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. The potential of these techniques for detecting varied pathological conditions is regularly reported. However, to prove their applicability in clinics, large multi-center, multi-national studies need to be undertaken, and these will result in enormous amounts of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to the required multivariate analysis, is warranted in order to obtain results in real time. This study reports a MATLAB-based script that can automatically import data, preprocess spectra (interpolation, derivatives, normalization) and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs, all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to the Minitab 16 software. The software can import a variety of file extensions (.asc, .txt, .xls, and many others). Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data, but also in obtaining rapid results when these tools are translated into clinics.
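
    The pre-processing chain named in the abstract (interpolation, derivatives, normalization) can be sketched on a synthetic spectrum; parameter choices such as the Savitzky-Golay window are illustrative, not the script's actual settings:

        import numpy as np
        from scipy.interpolate import interp1d
        from scipy.signal import savgol_filter

        def preprocess(wavenumbers, spectrum, grid):
            """Resample to a common grid, take a Savitzky-Golay first
            derivative (suppresses baseline offsets), then vector-normalize."""
            s = interp1d(wavenumbers, spectrum, kind="linear")(grid)
            s = savgol_filter(s, window_length=9, polyorder=3, deriv=1)
            return s / np.linalg.norm(s)

        wn = np.linspace(800, 1800, 700)                        # synthetic axis, cm^-1
        raw = np.exp(-((wn - 1450.0) / 30.0) ** 2) + 2e-4 * wn  # peak + sloping baseline
        grid = np.linspace(850, 1750, 512)
        out = preprocess(wn, raw, grid)
        print(out.shape, float(out.max()))   # one spectrum, ready for PCA / LDA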

  19. Software process improvement in the NASA software engineering laboratory

    Science.gov (United States)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  20. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  1. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  2. How to Evaluate Integrated Library Automation Systems.

    Science.gov (United States)

    Powell, James R.; Slach, June E.

    1985-01-01

    This paper describes methodology used in compiling a list of candidate integrated library automation systems at a corporate technical library. Priorities for automation, identification of candidate systems, the filtering process, information for suppliers, software and hardware considerations, on-site evaluations, and final system selection are…

  3. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  4. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  5. SEL's Software Process-Improvement Program

    Science.gov (United States)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. After studying over 125 projects of the FDD, the results have guided the standards, management practices, technologies, and the training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement and feedback to projects within the FDD environment. The SEL supports the understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools and training.

  6. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of the reading, review and analysis of books, journals and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  7. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  8. Improving Flight Software Module Validation Efforts : a Modular, Extendable Testbed Software Framework

    Science.gov (United States)

    Lange, R. Connor

    2012-01-01

    Ever since Explorer-1, the United States' first Earth satellite, was developed and launched in 1958, JPL has developed many more spacecraft, including landers and orbiters. While these spacecraft vary greatly in their missions, capabilities, and destinations, they all have something in common: all of their components had to be comprehensively tested. While thorough testing is important to mitigate risk, it is also a very expensive and time-consuming process. Thankfully, since virtually all of the software testing procedures for SMAP are computer controlled, these procedures can be automated. Most people testing SMAP flight software (FSW) would only need to write tests that exercise specific requirements and then check the filtered results to verify everything occurred as planned. This gives developers the ability to automatically launch tests on the testbed, distill the resulting logs into only the important information, generate validation documentation, and then deliver the documentation to management. With many of the steps in FSW testing automated, developers can use their limited time more effectively, validate SMAP FSW modules quicker, and test them more rigorously. As a result of the various benefits of automating much of the testing process, management is considering the use of these automated tools in future FSW validation efforts.
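
    The log-distillation step described above is easy to picture in code. The sketch below is purely illustrative (the log format, keywords and file layout are invented, not SMAP's): it keeps only the lines a reviewer would need and drops the rest.

        # Hypothetical log distiller: copy only significant lines (errors,
        # warnings, pass/fail verdicts, requirement IDs) into a summary file.
        import re

        IMPORTANT = re.compile(r"(ERROR|WARN|FAIL|PASS|REQ-\d+)")

        def distill(log_path, summary_path):
            kept = 0
            with open(log_path) as src, open(summary_path, "w") as dst:
                for line in src:
                    if IMPORTANT.search(line):
                        dst.write(line)
                        kept += 1
            return kept   # number of lines judged worth a reviewer's time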

  9. Automation of plasma-process fulltext bibliography databases. An on-line data-collection, data-mining and data-input system

    International Nuclear Information System (INIS)

    Suzuki, Manabu; Pichl, Lukas; Murakami, Izumi; Kato, Takako; Sasaki, Akira

    2006-01-01

    Searching for relevant data, information retrieval, data extraction and data input are time- and resource-consuming activities in most data centers. Here we develop a Linux system automating the process in case of bibliography, abstract and fulltext databases. The present system is an open-source free-software low-cost solution that connects the target and provider databases in cyberspace through various web publishing formats. The abstract/fulltext relevance assessment is interfaced to external software modules. (author)

  10. Test Automation Process Improvement A case study of BroadSoft

    OpenAIRE

    Gummadi, Jalendar

    2016-01-01

    This master's thesis research is about improving the test automation process at BroadSoft Finland, as a case study. A test automation project recently started at BroadSoft, but the project is not properly integrated into the existing process. The project is about converting manual test cases to automated test cases. The aim of this thesis is to study the existing BroadSoft test process and different test automation frameworks. In this thesis different test automation processes are studied ...

  11. Leaf-GP: an open and automated software application for measuring growth phenotypes for arabidopsis and wheat.

    Science.gov (United States)

    Zhou, Ji; Applegate, Christopher; Alonso, Albor Dobon; Reynolds, Daniel; Orford, Simon; Mackiewicz, Michal; Griffiths, Simon; Penfield, Steven; Pullen, Nick

    2017-01-01

    Plants demonstrate dynamic growth phenotypes that are determined by genetic and environmental factors. Phenotypic analysis of growth features over time is a key approach to understand how plants interact with environmental change as well as respond to different treatments. Although the importance of measuring dynamic growth traits is widely recognised, available open software tools are limited in terms of batch image processing, multiple traits analyses, software usability and cross-referencing results between experiments, making automated phenotypic analysis problematic. Here, we present Leaf-GP (Growth Phenotypes), an easy-to-use and open software application that can be executed on different computing platforms. To facilitate diverse scientific communities, we provide three software versions, including a graphic user interface (GUI) for personal computer (PC) users, a command-line interface for high-performance computer (HPC) users, and a well-commented interactive Jupyter Notebook (also known as the iPython Notebook) for computational biologists and computer scientists. The software is capable of extracting multiple growth traits automatically from large image datasets. We have utilised it in Arabidopsis thaliana and wheat (Triticum aestivum) growth studies at the Norwich Research Park (NRP, UK). By quantifying a number of growth phenotypes over time, we have identified diverse plant growth patterns between different genotypes under several experimental conditions. As Leaf-GP has been evaluated with noisy image series acquired by different imaging devices (e.g. smartphones and digital cameras) and still produced reliable biological outputs, we therefore believe that our automated analysis workflow and customised computer vision based feature extraction software implementation can facilitate a broader plant research community for their growth and development studies. Furthermore, because we implemented Leaf-GP based on open Python-based computer vision, image
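
    As an illustration of the kind of trait extraction Leaf-GP automates, the sketch below computes one growth phenotype, projected leaf area, from a single top-down image. It is a generic OpenCV example under stated assumptions (the HSV green range, kernel size and calibration parameter are invented), not Leaf-GP's own routine.

        # Minimal example of one trait: projected leaf area from a top-down
        # image, using a hypothetical HSV threshold for "plant green".
        import cv2
        import numpy as np

        def projected_leaf_area(image_path, px_per_mm):
            bgr = cv2.imread(image_path)
            hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
            mask = cv2.inRange(hsv, (35, 40, 40), (85, 255, 255))  # green pixels
            # Remove speckle noise before measuring.
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
            return np.count_nonzero(mask) / (px_per_mm ** 2)       # area in mm^2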

  12. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  13. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Seivers, R.

    1991-01-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM-developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  14. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  15. Software systems for processing and analysis of experimental data at the Nova laser facility

    International Nuclear Information System (INIS)

    Auerbach, J.M.; McCauley, E.W.; Stone, G.F.; Montgomery, D.S.

    1986-01-01

    A typical laser-plasma interaction experiment at the Nova laser facility produces in excess of 20 megabytes of digitized data. Extensive processing and analysis of this raw data from a wide variety of instruments is necessary to produce data that can be readily used to interpret the experiment. The authors describe how, using VAX-based computer hardware, a software system has been set up to convert the digitized instrument output to physics quantities describing the experiment. A relational database management system is used to coordinate all levels of processing and analysis. Extensive databases of instrument response and set-up parameters are used at all levels of processing and archiving. An extensive set of programs is used to handle the large amounts of X, Y, Z data recorded on film by the bulk of the Nova diagnostics. Software development emphasizes structured design, flexibility, automation and ease of use

  16. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    Directory of Open Access Journals (Sweden)

    Dominic Waithe

    Full Text Available We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for
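
    QuantiFly itself uses a trainable density-estimation approach; as a much simpler illustration of what "automated egg counting" means in code, the sketch below counts dark blobs of plausible egg size with scikit-image. All thresholds are invented for the example and would need tuning per medium.

        # Naive blob-count stand-in for trainable counting (illustrative only).
        from skimage import io, filters, measure, morphology

        def count_eggs(image_path, min_area=30, max_area=500):
            gray = io.imread(image_path, as_gray=True)
            mask = gray < filters.threshold_otsu(gray)        # eggs darker than food
            mask = morphology.remove_small_objects(mask, min_area)
            labels = measure.label(mask)
            blobs = [r for r in measure.regionprops(labels) if r.area <= max_area]
            return len(blobs)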

  17. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

    The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to the module-hierarchical principle. The main operating modes of the system are as follows: preprocessing, installation analysis, interpretation, accuracy analysis and parameter control. The definition of the quasireal experiment, which permits planning of the real experiment, is given. It is pointed out that realization of the quasireal experiment by means of the computerized installation model, with subsequent automated processing, permits scanning the quantitative aspect of the system as a whole, as well as providing optimal design of installation parameters for obtaining maximum resolution [ru]

  18. Secure Software Configuration Management Processes for nuclear safety software development environment

    International Nuclear Information System (INIS)

    Chou, I.-Hsin

    2011-01-01

    Highlights: → The proposed method emphasizes platform-independent security processes. → A hybrid process based on the nuclear SCM and security regulations is proposed. → Detailed descriptions and a Process Flow Diagram are useful for software developers. - Abstract: The main difference between nuclear and generic software is that the risk factor is infinitely greater in nuclear software - if there is a malfunction in the safety system, it can result in significant economic loss, physical damage or threat to human life. However, secure software development environments have often been ignored in the nuclear industry. In response to the terrorist attacks on September 11, 2001, the US Nuclear Regulatory Commission (USNRC) revised the Regulatory Guide (RG 1.152-2006) 'Criteria for use of computers in safety systems of nuclear power plants' to provide specific security guidance throughout the software development life cycle. Software Configuration Management (SCM) is an essential discipline in the software development environment. SCM involves identifying configuration items, controlling changes to those items, and maintaining their integrity and traceability. For securing the nuclear safety software, this paper proposes Secure SCM Processes (S2CMP), which infuse regulatory security requirements into the proposed SCM processes. Furthermore, a Process Flow Diagram (PFD) is adopted to describe S2CMP, which is intended to enhance the communication between regulators and developers.

  19. Automation of Survey Data Processing, Documentation and Dissemination: An Application to Large-Scale Self-Reported Educational Survey.

    Science.gov (United States)

    Shim, Eunjae; Shim, Minsuk K.; Felner, Robert D.

    Automation of the survey process has proved successful in many industries, yet it is still underused in educational research. This is largely due to the facts (1) that number crunching is usually carried out using software that was developed before information technology existed, and (2) that the educational research is to a great extent trapped…

  20. Automated Construction of Node Software Using Attributes in a Ubiquitous Sensor Network Environment

    Science.gov (United States)

    Lee, Woojin; Kim, Juil; Kang, JangMook

    2010-01-01

    In sensor networks, nodes must often operate in a demanding environment facing restrictions such as restricted computing resources, unreliable wireless communication and power shortages. Such factors make the development of ubiquitous sensor network (USN) applications challenging. To help developers construct a large amount of node software for sensor network applications easily and rapidly, this paper proposes an approach to the automated construction of node software for USN applications using attributes. In the proposed technique, application construction proceeds by first developing a model for the sensor network and then designing node software by setting the values of the predefined attributes. After that, the sensor network model and the design of node software are verified. The final source codes of the node software are automatically generated from the sensor network model. We illustrate the efficiency of the proposed technique by using a gas/light monitoring application through a case study of a Gas and Light Monitoring System based on the Nano-Qplus operating system. We evaluate the technique using a quantitative metric—the memory size of execution code for node software. Using the proposed approach, developers are able to easily construct sensor network applications and rapidly generate a large number of node softwares at a time in a ubiquitous sensor network environment. PMID:22163678
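
    The heart of the technique is emitting node source code from a model of attribute values. The fragment below is a hedged, much-simplified illustration of such attribute-driven generation; the attribute names, the C skeleton and the functions it calls are invented for the example and are not taken from Nano-Qplus.

        # Toy attribute-driven code generator: node source is rendered from
        # a template using the designer-supplied attribute values.
        NODE_TEMPLATE = """\
        /* auto-generated node software */
        #define SENSE_PERIOD_MS {period_ms}
        #define RADIO_CHANNEL   {channel}
        void node_main(void) {{
            init_sensor("{sensor}");
            while (1) {{ sample_and_send(); sleep_ms(SENSE_PERIOD_MS); }}
        }}
        """

        def generate_node_source(attrs):
            return NODE_TEMPLATE.format(**attrs)

        print(generate_node_source({"period_ms": 1000, "channel": 26, "sensor": "gas"}))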

  3. The laws of software process a new model for the production and management of software

    CERN Document Server

    Armour, Phillip G

    2003-01-01

    Contents: The Nature of Software and The Laws of Software Process; A Brief History of Knowledge; The Characteristics of Knowledge Storage Media; The Nature of Software Development; The Laws of Software Process and the Five Orders of Ignorance; The Laws of Software Process; The First Law of Software Process; The Corollary to the First Law of Software Process; The Reflexive Creation of Systems and Processes; The Lemma of Eternal Lateness; The Second Law of Software Process; The Rule of Process Bifurcation; The Dual Hypotheses of Knowledge Discovery; Armour's Observation on Software Process; The Third Law of Software Process (also kn

  4. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a Decision Tree approach (DT), for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ascii file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC's usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built up using open source code components and are implemented as a plug-in for the Quantum GIS software for easy handling of the classification process from the user's perspective.
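
    TWOPAC's classifier stage reduces to a familiar pattern: train a decision tree on labelled samples, then validate on held-out samples. The sketch below illustrates that pattern with scikit-learn's CART trees standing in for the C5.0 code TWOPAC actually wraps; the feature layout and split ratio are invented.

        # Pixel/object classification sketch: entropy-based tree plus a
        # sample-based accuracy assessment (CART here, not C5.0).
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import accuracy_score

        def train_and_validate(X, y):
            # X: one row per training pixel/object (band values or object
            # statistics); y: land-cover class labels.
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                      random_state=0)
            clf = DecisionTreeClassifier(criterion="entropy").fit(X_tr, y_tr)
            return clf, accuracy_score(y_te, clf.predict(X_te))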

  5. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  6. Cassini-Huygens maneuver automation for navigation

    Science.gov (United States)

    Goodson, Troy; Attiyah, Amy; Buffington, Brent; Hahn, Yungsun; Pojman, Joan; Stavert, Bob; Strange, Nathan; Stumpf, Paul; Wagner, Sean; Wolff, Peter

    2006-01-01

    Many times during the Cassini-Huygens mission to Saturn, propulsive maneuvers must be spaced so closely together that there isn't enough time or workforce to execute the maneuver-related software manually, one subsystem at a time. Automation is required. Automating the maneuver design process has involved close cooperation between teams. We present the contribution from the Navigation system. In scope, this includes trajectory propagation and search, generation of ephemerides, general tasks such as email notification and file transfer, and presentation materials. The software has been used to help understand maneuver optimization results, Huygens probe delivery statistics, and Saturn ring-plane crossing geometry. The Maneuver Automation Software (MAS), developed for the Cassini-Huygens program, enables frequent maneuvers by handling mundane tasks such as creation of deliverable files, file delivery, generation and transmission of email announcements, generation of presentation material and other supporting documentation. By hand, these tasks took up hours, if not days, of work for each maneuver. Automated, these tasks may be completed in under an hour. During the cruise trajectory the spacing of maneuvers was such that development of a maneuver design could span about a month, involving several other processes in addition to that described above. Often, about the last five days of this process covered the generation of a final design using an updated orbit-determination estimate. To support the tour trajectory, the orbit determination data cut-off of five days before the maneuver needed to be reduced to approximately one day, and the whole maneuver development process needed to be reduced to less than a week.

  7. FRAME (Force Review Automation Environment): MATLAB-based AFM data processor.

    Science.gov (United States)

    Partola, Kostyantyn R; Lykotrafitis, George

    2016-05-03

    Data processing of force-displacement curves generated by atomic force microscopes (AFMs) for elastic moduli and unbinding event measurements is very time consuming and susceptible to user error or bias. There is an evident need for consistent, dependable, and easy-to-use AFM data processing software. We have developed an open-source software application, the force review automation environment (or FRAME), that provides users with an intuitive graphical user interface, automating data processing, and tools for expediting manual processing. We did not observe a significant difference between manually processed and automatically processed results from the same data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.
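
    The curve-fitting work FRAME automates typically means fitting a contact model to each force-indentation curve. As a generic illustration (not FRAME's own code), the sketch below fits a Hertz spherical-tip model to extract a Young's modulus; the tip radius, Poisson ratio and initial guess are assumed values.

        # Hertz fit: F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2)
        import numpy as np
        from scipy.optimize import curve_fit

        def hertz_force(delta, E, R=1e-6, nu=0.5):
            return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

        def fit_modulus(indentation, force):
            # indentation (m) and force (N) arrays from one approach curve.
            (E,), _ = curve_fit(lambda d, E: hertz_force(d, E),
                                indentation, force, p0=[1e3])
            return E   # Young's modulus estimate, Pa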

  8. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  9. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined process assets. We show how to construct flexible SPrLs and show their practical application in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], which was published as an original research article in the Journal of Systems and Software.

  10. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    Science.gov (United States)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Assimilating COSTMODL to any organization's particular environment can yield significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
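
    Of the five models listed, Basic COCOMO is the easiest to show in miniature. The sketch below uses the published organic-mode constants; COSTMODL's value lies precisely in letting an organization recalibrate such constants, so treat these numbers as textbook defaults, not the tool's own.

        # Basic COCOMO, organic mode: effort and schedule from size in KLOC.
        def basic_cocomo(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
            effort = a * kloc ** b        # person-months
            schedule = c * effort ** d    # calendar months
            return effort, schedule

        effort, months = basic_cocomo(32.0)   # e.g. a 32 KLOC product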

  11. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  12. Automated Processing and Evaluation of Anti-Nuclear Antibody Indirect Immunofluorescence Testing.

    Science.gov (United States)

    Ricchiuti, Vincent; Adams, Joseph; Hardy, Donna J; Katayev, Alexander; Fleming, James K

    2018-01-01

    Indirect immunofluorescence (IIF) is considered by the American College of Rheumatology (ACR) and the international consensus on ANA patterns (ICAP) the gold standard for the screening of anti-nuclear antibodies (ANA). As conventional IIF is labor intensive, time-consuming, subjective, and poorly standardized, there have been ongoing efforts to improve the standardization of reagents and to develop automated platforms for assay incubation, microscopy, and evaluation. In this study, the workflow and performance characteristics of a fully automated ANA IIF system (Sprinter XL, EUROPattern Suite, IFA 40: HEp-20-10 cells) were compared to a manual approach using visual microscopy with a filter device for single-well titration and to technologist reading. The Sprinter/EUROPattern system enabled the processing of large daily workload cohorts in less than 8 h and the reduction of labor hands-on time by more than 4 h. Regarding the discrimination of positive from negative samples, the overall agreement of the EUROPattern software with technologist reading was higher (95.6%) than when compared to the current method (89.4%). Moreover, the software was consistent with technologist reading in 80.6-97.5% of patterns and 71.0-93.8% of titers. In conclusion, the Sprinter/EUROPattern system provides substantial labor savings and good concordance with technologist ANA IIF microscopy, thus increasing standardization, laboratory efficiency, and removing subjectivity.

  14. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, already applied in manufacturing and in some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (approx. 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt]

  15. A Collaborative System Software Solution for Modeling Business Flows Based on Automated Semantic Web Service Composition

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2009-01-01

    Full Text Available Nowadays, business interoperability is one of the key factors for assuring competitive advantage for the participating business partners. In order to implement business cooperation, scalable, distributed and portable collaborative systems have to be implemented. This article presents some of the most widely used technologies in this field. Furthermore, it presents a software application architecture based on the Business Process Modeling Notation standard and automated semantic web service coupling for modeling business flows in a collaborative manner. The main business processes will be represented in a single, hierarchic flow diagram. Each element of the diagram will represent calls to semantic web services. The business logic (the business rules and constraints) will be structured with the help of OWL (Ontology Web Language). Moreover, OWL will also be used to create the semantic web service specifications.

  16. First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    Science.gov (United States)

    Griffin, Sandy (Editor)

    1987-01-01

    Several topics relative to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

  17. SOFTWARE FOR DESIGNING PARALLEL APPLICATIONS

    Directory of Open Access Journals (Sweden)

    M. K. Bouza

    2017-01-01

    Full Text Available The object of research is tools to support the development of parallel programs in C/C++. Methods and software which automate the process of designing parallel applications are proposed.

  18. A simple rapid process for semi-automated brain extraction from magnetic resonance images of the whole mouse head.

    Science.gov (United States)

    Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L

    2016-01-15

    Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance, and the paucity of automation tools for the critical early step in processing, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel timesaving automation of template-based brain extraction ("skull-stripping") is capable of quickly and reliably extracting the brain from large numbers of whole head images in a single step. The method is simple to install and requires minimal user interaction. This method is equally applicable to different types of MR images. Results were evaluated with Dice and Jacquard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation of brain volumes are preserved. A downloadable software package not otherwise available for extraction of brains from whole head images is included here. This software tool increases speed, can be used with an atlas or a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole head images, rendering them useable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
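
    Template-based skull-stripping of this kind ends with a simple operation: a brain mask, registered to the subject's head image, zeroes out everything outside the brain. The sketch below shows only that final masking step with nibabel; it assumes registration has already been done by other tools, and all file names are placeholders.

        # Apply an already-registered template brain mask to a head image.
        import nibabel as nib

        def apply_brain_mask(head_path, mask_path, out_path):
            head = nib.load(head_path)
            mask = nib.load(mask_path).get_fdata() > 0.5      # binarize mask
            brain = head.get_fdata() * mask                   # zero non-brain voxels
            nib.save(nib.Nifti1Image(brain, head.affine), out_path)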

  19. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts and the Software Process. Empirical findings are presented which suggest a new understanding of the role of artifacts and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment.

  20. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on compliance with regulations. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered

  1. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    Directory of Open Access Journals (Sweden)

    E. V. Lukyanchuk

    2016-08-01

    Full Text Available The wine industry has now successfully solved the problems of automating grape receiving points, crushing and pressing departments, continuously operating fermentation installations, blending tanks, production lines for ordinary Madeira, continuously operating plants for ethyl alcohol, installations for champagne wine in continuous flow, etc. With the development of automation, the productivity of the winemaking process develops in the following directions: organization of complex automation of grape processing sites with bulk transportation of the grapes; improving the quality and durability of wines by widely applying cold and heat treatment of the wine, as well as technical and microbiological control with the most powerful automation equipment; introduction of automated continuous production processes for champagne, sherry wine, cognac alcohol and Madeira; use of complex automation at auxiliary production sites (boilers, air conditioners, refrigeration units) and others; creation of complex automation of enterprises and of wine bottling sites. In the wine industry, more sophisticated automation schemes and devices are being developed that enable the transition to integrated production automation and will create model automated enterprises, with laboratories serving to study the main problems of automation of winemaking production processes.

  2. Automation software for a materials testing laboratory

    Science.gov (United States)

    Mcgaw, Michael A.; Bonacuse, Peter J.

    1990-01-01

    The software environment in use at the NASA Lewis Research Center's High Temperature Fatigue and Structures Laboratory is reviewed. This software environment is aimed at supporting the tasks involved in performing materials behavior research. The features and capabilities of the approach to specifying a materials test include static and dynamic control mode switching, enabling multimode test control; dynamic alteration of the control waveform based upon events occurring in the response variables; precise control over the nature of both command waveform generation and data acquisition; and the nesting of waveform/data acquisition strategies so that material history dependencies may be explored. To eliminate repetitive tasks in the conventional research process, a communications network software system is established which provides file interchange and remote console capabilities.

  3. Automated Detection of Binucleated Cell and Micronuclei using CellProfiler 2.0 Software

    Directory of Open Access Journals (Sweden)

    DWI RAMADHANI

    2013-12-01

    Full Text Available Micronucleus assay in human peripheral lymphocytes usually used to assess chromosomal damage. Manual scoring of micronuclei can be time consuming and large numbers of binucleated cells have to be analyzed to obtain statistically relevant data. Automation of the micronuclei analysis using image processing analysis software can provide a faster and more reliable analysis of micronucleus assay. Here the used of CellProfiler an open access cell image analysis software for automatic detection of binucleated cells and micronuclei were reported. We aimed to know whether there was a significant difference in the number of binucleated cells and micronuclei that obtained by manual and CellProfiler counting. Wilcoxon Rank test was used for statistical analysis to test H0 hypothesis that there was no significant difference in the number of binucleated cells and micronuclei that obtained by manual and CellProfiler counting. We analyzed 135 images for both manual and CellProfiler counting. Our results showed that there was no significant difference between manual and CellProfiler counting for binucleated cells (P = 0.851 and for micronuclei (P = 0.917. In conclusion, the binucleated cells and micronuclei counting using CellProfiler were comparable but not better than manual counting.

  4. Hydra: software for tailored processing of H/D exchange data from MS or tandem MS analyses

    Directory of Open Access Journals (Sweden)

    Bennett Melissa

    2009-05-01

    Full Text Available Abstract. Background: Hydrogen/deuterium exchange mass spectrometry (H/DX-MS) experiments implemented to characterize protein interaction and protein folding generate large quantities of data. Organizing, processing and visualizing data requires an automated solution, particularly when accommodating new tandem mass spectrometry modes for H/DX measurement. We sought to develop software that offers flexibility in defining workflows so as to support exploratory treatments of H/DX-MS data, with a particular focus on the analysis of very large protein systems and the mining of tandem mass spectrometry data. Results: We present a software package ("Hydra") that supports both traditional and exploratory treatments of H/DX-MS data. Hydra's software architecture tolerates flexible data analysis procedures by allowing the addition of new algorithms without significant change to the underlying code base. Convenient user interfaces ease the organization of raw data files and input of peptide data. After executing a user-defined workflow, extracted deuterium incorporation values can be visualized in tabular and graphical formats. Hydra also automates the extraction and visualization of deuterium distribution values. Manual validation and assessment of results is aided by an interface that aligns extracted ion chromatograms and mass spectra, while providing a means of rapidly reprocessing the data following manual adjustment. A unique feature of Hydra is the automated processing of tandem mass spectrometry data, demonstrated on a large test data set in which 40,000 deuterium incorporation values were extracted from replicate analysis of approximately 1000 fragment ions in one hour using a typical PC. Conclusion: The customizable workflows and user-friendly interfaces of Hydra remove a significant bottleneck in processing and visualizing H/DX-MS data and help the researcher spend more time executing new experiments and interpreting results. This increased
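
    The central quantity such a pipeline extracts is deuterium uptake as the centroid-mass shift of a peptide's isotopic envelope relative to an undeuterated control. A generic sketch of that calculation (not Hydra's code; the array names are placeholders):

        # Deuterium uptake from centroid shift of the isotopic envelope.
        import numpy as np

        def centroid(mz, intensity):
            return np.average(mz, weights=intensity)

        def deuterium_uptake(mz_t, int_t, mz_0, int_0, charge=1):
            # m/z centroid shift times charge = mass shift in daltons
            return (centroid(mz_t, int_t) - centroid(mz_0, int_0)) * charge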

  5. [Algorithm for the automated processing of rheosignals].

    Science.gov (United States)

    Odinets, G S

    1988-01-01

    An algorithm of rheosignal recognition for a microprocessor device with a display and with automated and manual cursor control was examined. The algorithm makes it possible to automate rheosignal recording and processing while taking their variability into account.

  6. Monitoring system for automation of experimental researches in cutting

    International Nuclear Information System (INIS)

    Kuzinovski, Mikolaj; Trajchevski, Neven; Filipovski, Velimir; Tomov, Mite; Cichosz, Piotr

    2009-01-01

    This study presents the procedures performed when designing and realizing experimental scientific research by application of an automated measurement system with computer support in all experiment stages. Special emphasis is placed on measurement system integration and on mathematical processing of data from experiments. Automation processes are described through our own automated monitoring system for research of physical phenomena in the cutting process with computer-aided data acquisition. The monitoring system is intended for determining the tangential, axial and radial components of the cutting force, as well as the average temperature in the cutting process. The hardware acquisition part consists of amplifiers and A/D converters, while the software for analysis and visualization on a PC was developed using MS Visual C++. For the mathematical description of the researched physical phenomena, the CADEX software was made, which, in connection with MATLAB, is intended for the design, processing and analysis of experimental scientific research according to the theory of planning multi-factorial experiments. The design and construction of the interface and the computerized measurement system were done by the Faculty of Mechanical Engineering in Skopje in collaboration with the Faculty of Electrical Engineering and Information Technologies in Skopje and the Institute of Production Engineering and Automation, Wroclaw University of Technology, Poland. Having its own scientific research measurement system with free access to hardware and software parts provides conditions for complete control of the research process and reduction of the interval of measurement uncertainty of the results obtained from the performed research.
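
    On the acquisition side, a system like this reduces to converting raw A/D counts from the amplifier channels into physical quantities via a per-channel calibration. The sketch below is illustrative only; the converter resolution, voltage range and gain are assumed values, not those of the described system.

        # Hypothetical count-to-force conversion for one dynamometer channel.
        ADC_FULL_SCALE = 32768        # 16-bit signed converter (assumed)
        VOLT_RANGE = 10.0             # +/-10 V input range (assumed)

        def counts_to_force(counts, newtons_per_volt):
            volts = counts / ADC_FULL_SCALE * VOLT_RANGE
            return volts * newtons_per_volt

        Ft = counts_to_force(1234, 50.0)   # tangential force component, N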

  7. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    International Nuclear Information System (INIS)

    Pascoal, A; Lawinski, C P; Honey, I; Blake, P

    2005-01-01

Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold pose limits to the minimum image quality variations detectable with reliability. Objective methods of assessment of image quality, such as automated scoring, have the potential to overcome the above limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations, while the observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and detector KERMA, which indicates the potential to use the CDRAD analyser software for assessment of relative IQ.

  8. Problem Diagnosis in Software Process Improvement

    DEFF Research Database (Denmark)

    Iversen, Jakob; Nielsen, Peter Axel; Nørbjerg, Jacob

    1998-01-01

This paper addresses software process improvement. In particular it reports on action research undertaken to understand the problems with software processes of a large Danish company. It is argued that in order to understand what the specific problems are we may, on the one hand, rely on process models like CMM or Bootstrap. On the other hand, we may also see the specific and unique features of software processes in this company through what we call problem diagnosis. Problem diagnosis deals with eliciting problems perceived by software project managers and with forming commitment structures to enable process improvement to effectively take place. It is argued that problem diagnosis is a useful approach and that it has advantages over model-based assessment.

  9. Towards an Evaluation Framework for Software Process Improvement

    OpenAIRE

    Cheng, Chow Kian; Permadi, Rahadian Bayu

    2009-01-01

Software has gained an essential role in our daily life in the last decades. This condition demands high-quality software. To produce high-quality software, many practitioners and researchers put more attention on the software development process. Large investments are made to improve the software development process. Software Process Improvement (SPI) is a research area aimed at addressing the assessment and improvement issues in the software development process. One of the most impor...

  10. Library Automation at a Multi-Campus Community College.

    Science.gov (United States)

    Farris, Deirdre A.

    1987-01-01

    Describes the planning and implementation of a library automation system which encompasses four campus locations of a community college, and includes automation of technical processes and full access to holdings and circulation records of all the libraries involved. Software and hardware considerations are discussed, and guidelines to automation…

  11. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h of symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow, cerebral blood volume, and mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms, including a piecewise continuous regression method to determine the bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area on motion-corrected MTT maps and compared these with time-to-peak (TTP) maps generated using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map.
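
One of the steps the abstract names is gamma-variate curve fitting of the arterial input function (AIF). The following Python sketch illustrates that single step on synthetic data; it uses the standard gamma-variate model and is not the authors' implementation.

```python
# Simplified sketch of gamma-variate fitting of an arterial input function,
# one step named in the abstract. Illustrative only, on synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, t0, A, alpha, beta):
    """C(t) = A*(t-t0)^alpha * exp(-(t-t0)/beta) for t > t0, else 0."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / beta)

t = np.arange(0, 60, 1.0)                        # seconds
true_aif = gamma_variate(t, 8.0, 1.2, 3.0, 1.5)  # synthetic AIF
noisy = true_aif + np.random.normal(0, 0.02, t.size)

popt, _ = curve_fit(gamma_variate, t, noisy, p0=[5.0, 1.0, 2.0, 2.0], maxfev=5000)
t0, A, alpha, beta = popt
print(f"bolus arrival ~{t0:.1f} s, shape alpha={alpha:.2f}, scale beta={beta:.2f}")
```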

12. Use of automated test equipment and "paperless" process control to implement efficient production of SSC dipole magnets

    International Nuclear Information System (INIS)

    Tobin, T.; Fagan, R.; Mitchell, D.

    1994-01-01

In an effort to minimize human error and maximize process control and test capabilities during Collider Dipole Magnet (CDM) production, General Dynamics is developing automated test and process control equipment, known as Test & Process Control Modules (TPCMs). When used along with software designed to create "paperless" process control documentation, the system becomes the Test & Process Control System (TPCS). This system simplifies business decisions and eliminates some problems normally associated with process control documentation, while reducing human errors during CDM production. It is also designed to reduce test operator errors normally incurred during test setup and data analysis. The authors present an overview of the TPCS hardware and software being developed at General Dynamics, along with the process control techniques included in TPCS

13. Large-scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive carriage and an automatic data acquisition system. The mechanical design of the device covers the optical axis, the drive, the fixture and the wheels. The control system comprises hardware and software: the hardware is based on a single-chip microcontroller, and the software handles the photoelectric autocollimator readout and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison. The results meet the requirements of the right-angle verification procedure.

  14. Technical study for the automation and control of processes of the chemical processing plant for liquid radioactive waste at Racso Nuclear Center

    International Nuclear Information System (INIS)

    Quevedo D, M.; Ayala S, A.

    1997-01-01

The purpose of this study is to introduce the development of an automation and control system for a chemical processing plant for liquid radioactive waste of low and medium activity. The control system established for the chemical processing plant at the RACSO Nuclear Center is described. It is an on-off sequential system with feedback; this type of control was chosen according to the volumes to be treated at the plant, as processing is carried out in batches. The system is governed by a modular programmable logic controller (PLC) with a minimum of 24 digital inputs, 1 analog input, 16 digital outputs and 1 analog output. The digital inputs and outputs serve the tank level sensors and the solenoid-type electrovalves, while the analog input and output serve the pH control. The overall system has been divided into three control loops; the loops considered for the operation of the plant, which comprises storage, conditioning, processing and clarifying tanks, are described. National Instruments' Lookout software has been used for simulation, constituting an important tool not only in the design phase but also in practice, since this software will be used as the SCADA system. Finally, the advantages and benefits of this automation system are analyzed: radiation doses received by occupationally exposed workers are reduced and the operational reliability of the system is increased. (authors)
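
The on-off sequential control with feedback described above can be illustrated with a minimal bang-bang level-control loop. In the sketch below the thresholds, the plant model and all names are hypothetical and do not come from the plant's actual PLC program.

```python
# Hedged sketch of on-off (bang-bang) level control with feedback for a batch
# treatment tank; sensor thresholds and the plant model are invented.
from dataclasses import dataclass

@dataclass
class Tank:
    level: float        # current level, 0..1
    low: float = 0.2    # low level sensor threshold
    high: float = 0.9   # high level sensor threshold
    valve_open: bool = False

def control_step(tank: Tank) -> Tank:
    """One PLC scan: open the inlet valve below LOW, close it above HIGH."""
    if tank.level <= tank.low:
        tank.valve_open = True
    elif tank.level >= tank.high:
        tank.valve_open = False
    # Between thresholds the valve keeps its last state (hysteresis).
    return tank

tank = Tank(level=0.1)
for _ in range(5):
    tank = control_step(tank)
    tank.level += 0.25 if tank.valve_open else -0.05  # crude plant model
    print(f"level={tank.level:.2f} valve={'open' if tank.valve_open else 'closed'}")
```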

  15. Will the future of knowledge work automation transform personalized medicine?

    Science.gov (United States)

    Naik, Gauri; Bhide, Sanika S

    2014-09-01

Today, we live in a world of 'information overload' which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate 'routine cognitive tasks' for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments and have the ability to learn 'on the fly' is a significant step towards automation of knowledge work. The applications of this technology for high-throughput genomic analysis, database updating, reporting clinically significant variants, and diagnostic imaging purposes are explored using case studies.

  16. Perspectives on bioanalytical mass spectrometry and automation in drug discovery.

    Science.gov (United States)

    Janiszewski, John S; Liston, Theodore E; Cole, Mark J

    2008-11-01

    The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.

  17. A realization of an automated data flow for data collecting, processing, storing and retrieving

    International Nuclear Information System (INIS)

    Friedsam, H.; Pushor, R.; Ruland, R.

    1986-11-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's. 14 refs., 4 figs

  18. Automated video surveillance: teaching an old dog new tricks

    Science.gov (United States)

    McLeod, Alastair

    1993-12-01

The automated video surveillance market is booming with new players, new systems, new hardware and software, and an extended range of applications. This paper reviews available technology and describes the features required for a good automated surveillance system. Both hardware and software are discussed. An overview of typical applications is also given. A shift towards PC-based hybrid systems, the use of parallel processing, neural networks, and the exploitation of modern telecoms are introduced, highlighting the evolution of modern video surveillance systems.

  19. Automated Student Aid Processing: The Challenge and Opportunity.

    Science.gov (United States)

    St. John, Edward P.

    1985-01-01

    To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)

  20. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2013-05-01

Full Text Available In order to maintain a competitive edge in a very active banking market, a company requires the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales activities and the generation of new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for security and confidentiality of stored data and also on presenting the identified techniques and procedures to implement these requirements.

  1. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

Software process improvement in small, agile organizations is often problematic. Model-based approaches seem to overlook problems. We have been seeking an alternative approach to overcome this through action research. Here we report on a piece of action research from which we developed an approach to map social networks, and we suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers software process improvement.

  2. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    Science.gov (United States)

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
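
The modular organization described above, in which each instrument sits behind a common interface and a scan simply iterates over modules, can be sketched as follows. Class and method names are hypothetical and do not reproduce PLACE's documented plug-in API.

```python
# Sketch of the modular idea behind a PLACE-style automation package: each
# instrument is a self-contained module with a common interface, and a scan
# just drives the modules step by step. Names are invented for illustration.
class Instrument:
    def config(self, settings: dict) -> None: ...
    def update(self, step: int) -> dict: ...
    def cleanup(self) -> None: ...

class DummyStage(Instrument):
    def config(self, settings):
        self.step_mm = settings.get("step_mm", 1.0)
    def update(self, step):
        return {"position_mm": step * self.step_mm}
    def cleanup(self):
        pass

def run_scan(instruments, settings, n_steps):
    records = []
    for inst in instruments:
        inst.config(settings)
    for step in range(n_steps):
        row = {}
        for inst in instruments:
            row.update(inst.update(step))
        records.append(row)
    for inst in instruments:
        inst.cleanup()
    return records

print(run_scan([DummyStage()], {"step_mm": 0.5}, n_steps=3))
```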

  3. Designing a Software Test Automation Framework

    Directory of Open Access Journals (Sweden)

    Sabina AMARICAI

    2014-01-01

    Full Text Available Testing is an art and science that should ultimately lead to lower cost businesses through increasing control and reducing risk. Testing specialists should thoroughly understand the system or application from both the technical and the business perspective, and then design, build and implement the minimum-cost, maximum-coverage validation framework. Test Automation is an important ingredient for testing large scale applications. In this paper we discuss several test automation frameworks, their advantages and disadvantages. We also propose a custom automation framework model that is suited for applications with very complex business requirements and numerous interfaces.
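
One classic style in this space is the keyword-driven framework, in which test cases are data (keywords plus arguments) decoupled from the code that executes them. The sketch below is a generic illustration with invented keywords, not the custom framework model the article proposes.

```python
# Minimal keyword-driven test automation framework: test cases are plain data,
# and keywords map to executable actions. Keywords here are invented.
KEYWORDS = {}

def keyword(name):
    def wrap(fn):
        KEYWORDS[name] = fn
        return fn
    return wrap

@keyword("add")
def add(state, a, b):
    state["result"] = a + b

@keyword("assert_result")
def assert_result(state, expected):
    assert state["result"] == expected, f"expected {expected}, got {state['result']}"

test_case = [
    ("add", (2, 3)),
    ("assert_result", (5,)),
]

def run_test(steps):
    state = {}
    for name, args in steps:
        KEYWORDS[name](state, *args)
    print("PASS")

run_test(test_case)
```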

  4. NASA Goddard Space Flight Center Robotic Processing System Program Automation Systems, volume 2

    Science.gov (United States)

    Dobbs, M. E.

    1991-01-01

Topics related to robot operated materials processing in space (RoMPS) are presented in view graph form. Some of the areas covered include: (1) mission requirements; (2) automation management system; (3) Space Transportation System (STS) Hitchhiker Payload; (4) Spacecraft Command Language (SCL) scripts; (5) SCL software components; (6) RoMPS EasyLab Command & Variable summary for rack stations and annealer module; (7) support electronics assembly; (8) SCL uplink packet definition; (9) SC-4 EasyLab System Memory Map; (10) Servo Axis Control Logic Suppliers; and (11) annealing oven control subsystem.

  5. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of the modified wat...

  6. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    peer-reviewed The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development

  7. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and has had a profound impact on the software engineering process itse...

  8. Automated Facial Coding Software Outperforms People in Recognizing Neutral Faces as Neutral from Standardized Datasets

    Directory of Open Access Journals (Sweden)

    Peter eLewinski

    2015-09-01

Full Text Available Little is known about people's accuracy of recognizing neutral faces as neutral. In this paper, I demonstrate the importance of knowing how well people recognize neutral faces. I contrasted human recognition scores of 100 typical, neutral front-up facial images with scores of an arguably objective judge, automated facial coding (AFC) software. I hypothesized that the software would outperform humans in recognizing neutral faces because of the inherently objective nature of computer algorithms. Results confirmed this hypothesis. I provided the first-ever evidence that computer software (90%) was more accurate in recognizing neutral faces than people were (59%). I posited two theoretical mechanisms, i.e. smile-as-a-baseline and false recognition of emotion, as possible explanations for my findings.

  9. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  10. More steps towards process automation for optical fabrication

    Science.gov (United States)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  11. Imperfect Information in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.

    2007-01-01

    The process of designing high-quality software systems is one of the major issues in software engineering research. Over the years, this has resulted in numerous design methods, each with specific qualities and drawbacks. For example, the Rational Unified Process is a comprehensive design process,

  12. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
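
The geo-containment algorithms mentioned above rest on a containment predicate: is the aircraft position inside the polygon region? A textbook ray-casting version of that predicate, not the formally verified NASA code, looks like this in Python:

```python
# Textbook ray-casting point-in-polygon test, illustrating the core predicate
# of a geo-containment algorithm. Not the formally verified MINERVA code.
from typing import List, Tuple

def inside(point: Tuple[float, float], poly: List[Tuple[float, float]]) -> bool:
    x, y = point
    n = len(poly)
    result = False
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Count crossings of a horizontal ray going right from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                result = not result
    return result

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(inside((5, 5), square), inside((12, 5), square))  # True False
```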

  13. Software Process Automation: Experiences from the Trenches.

    Science.gov (United States)

    1996-07-01

Integration of a problem database (organization J, using ProcessWeaver with WordPerfect, All-in-One, Oracle and CM tools) and integration of tools (organization K, using ProcessWeaver with FrameMaker and CM tools) to handle change requests and problem reports. Tools cited include Autoplan, a project management tool; FrameMaker, a document processing system; Worldview, a document viewer; Cadre Teamwork; something for requirements traceability; a homegrown scheduling tool; and a homegrown tool integrator.

  14. - GEONET - A Realization of an Automated Data Flow for Data Collecting, Processing, Storing, and Retrieving

    International Nuclear Information System (INIS)

    Friedsam, Horst; Pushor, Robert; Ruland, Robert; SLAC

    2005-01-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's

  15. Accuracy of liver lesion assessment using automated measurement and segmentation software in biphasic multislice CT (MSCT)

    International Nuclear Information System (INIS)

    Puesken, M.; Juergens, K.U.; Edenfeld, A.; Buerke, B.; Seifarth, H.; Beyer, F.; Heindel, W.; Wessling, J.; Suehling, M.; Osada, N.

    2009-01-01

Purpose: To assess the accuracy of liver lesion measurement using automated measurement and segmentation software depending on the vascularization level. Materials and Methods: Arterial and portal venous phase multislice CT (MSCT) was performed for 58 patients. 94 liver lesions were evaluated and classified according to vascularity (hypervascular: 13 hepatocellular carcinomas, 20 hemangiomas; hypovascular: 31 metastases, 3 lymphomas, 4 abscesses; liquid: 23 cysts). The RECIST diameter and volume were obtained using automated measurement and segmentation software and compared to corresponding measurements derived visually by two experienced radiologists as a reference standard. Statistical analysis was performed using the Wilcoxon test and concordance correlation coefficients. Results: Automated measurements revealed no significant difference between the arterial and portal venous phase in hypovascular (mean RECIST diameter: 31.4 vs. 30.2 mm; p = 0.65; κ = 0.875) and liquid lesions (20.4 vs. 20.1 mm; p = 0.1; κ = 0.996). The RECIST diameter and volume of hypervascular lesions were significantly underestimated in the portal venous phase as compared to the arterial phase (30.3 vs. 26.9 mm, p = 0.007, κ = 0.834; 10.7 vs. 7.9 ml, p = 0.0045, κ = 0.752). Automated measurements for hypovascular and liquid lesions in the arterial and portal venous phase were concordant to the reference standard. Hypervascular lesion measurements were in line with the reference standard for the arterial phase (30.3 vs. 32.2 mm, p = 0.66, κ = 0.754), but revealed a significant difference for the portal venous phase (26.9 vs. 32.1 mm; p = 0.041; κ = 0.606). (orig.)

  16. A process algebra software engineering environment

    NARCIS (Netherlands)

    Diertens, B.

    2008-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. In this article we summarize that work and describe the software development process

  17. Automation of experimental research of waveguide paths induction soldering

    Science.gov (United States)

    Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.

    2018-05-01

The article presents an automated system for experimental studies of the waveguide path induction soldering process. The system is an add-on to the software complex for automated control of the technological process of induction soldering of thin-walled aluminum-alloy waveguide paths, expanding its capabilities. The structure of the software product, the general appearance of the controls and the potential application possibilities are presented. The utility of the developed application was demonstrated in a series of field experiments. The experimental research system makes it possible to improve the process under consideration by enabling fine-tuning of the control regulators and by keeping soldering-process statistics in a form convenient for analysis.

  18. PIV Data Validation Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV data validation and post-processing software package was developed to provide semi-automated data validation and data reduction capabilities for Particle Image Velocimetry data sets. The software provides three primary capabilities including (1) removal of spurious vector data, (2) filtering, smoothing, and interpolating of PIV data, and (3) calculations of out-of-plane vorticity, ensemble statistics, and turbulence statistics information. The software runs on an IBM PC/AT host computer working either under Microsoft Windows 3.1 or Windows 95 operating systems.
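
Capability (1), removal of spurious vectors, is commonly implemented with neighborhood tests such as the normalized median test. The abstract does not state which criteria the package uses, so the Python sketch below stands in for the general idea.

```python
# Hedged sketch of one common spurious-vector test for PIV fields (the
# normalized median test); the NASA package's actual validation criteria are
# not specified in the abstract, so this illustrates the general idea only.
import numpy as np

def normalized_median_outliers(u, eps=0.1, thresh=2.0):
    """Flag vectors deviating strongly from their 3x3 neighborhood median."""
    flagged = np.zeros_like(u, dtype=bool)
    rows, cols = u.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            nb = u[i-1:i+2, j-1:j+2].ravel()
            nb = np.delete(nb, 4)              # exclude the center vector
            med = np.median(nb)
            resid = np.median(np.abs(nb - med))
            if abs(u[i, j] - med) / (resid + eps) > thresh:
                flagged[i, j] = True
    return flagged

u = np.ones((5, 5)); u[2, 2] = 10.0            # one spurious vector
print(np.argwhere(normalized_median_outliers(u)))  # [[2 2]]
```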

  19. Containerless automated processing of intermetallic compounds and composites

    Science.gov (United States)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  20. SCT: Spinal Cord Toolbox, an open-source software for processing spinal cord MRI data.

    Science.gov (United States)

    De Leener, Benjamin; Lévy, Simon; Dupont, Sara M; Fonov, Vladimir S; Stikov, Nikola; Louis Collins, D; Callot, Virginie; Cohen-Adad, Julien

    2017-01-15

    For the past 25 years, the field of neuroimaging has witnessed the development of several software packages for processing multi-parametric magnetic resonance imaging (mpMRI) to study the brain. These software packages are now routinely used by researchers and clinicians, and have contributed to important breakthroughs for the understanding of brain anatomy and function. However, no software package exists to process mpMRI data of the spinal cord. Despite the numerous clinical needs for such advanced mpMRI protocols (multiple sclerosis, spinal cord injury, cervical spondylotic myelopathy, etc.), researchers have been developing specific tools that, while necessary, do not provide an integrative framework that is compatible with most usages and that is capable of reaching the community at large. This hinders cross-validation and the possibility to perform multi-center studies. In this study we introduce the Spinal Cord Toolbox (SCT), a comprehensive software dedicated to the processing of spinal cord MRI data. SCT builds on previously-validated methods and includes state-of-the-art MRI templates and atlases of the spinal cord, algorithms to segment and register new data to the templates, and motion correction methods for diffusion and functional time series. SCT is tailored towards standardization and automation of the processing pipeline, versatility, modularity, and it follows guidelines of software development and distribution. Preliminary applications of SCT cover a variety of studies, from cross-sectional area measures in large databases of patients, to the precise quantification of mpMRI metrics in specific spinal pathways. We anticipate that SCT will bring together the spinal cord neuroimaging community by establishing standard templates and analysis procedures. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  2. Automated tools and techniques for distributed Grid Software Development of the testbed infrastructure

    CERN Document Server

    Aguado Sanchez, C

    2007-01-01

Grid technology is becoming more and more important as the new paradigm for sharing computational resources across different organizations in a secure way. The great power of this solution requires the definition of a generic stack of services and protocols, and this is the scope of the different Grid initiatives. As a result of international collaborations for its development, the Open Grid Forum created the Open Grid Services Architecture (OGSA), which aims to define the common set of services that will enable interoperability across the different implementations. This master thesis has been developed in this framework, as part of the two European-funded projects ETICS and OMII-Europe. The main objective is to contribute to the design and maintenance of large distributed development projects with an automated tool that enables the implementation of software engineering techniques oriented towards achieving an acceptable level of quality in the release process. Specifically, this thesis develops the testbed concept a...

  3. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  4. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  5. Complex Automated Negotiations Theories, Models, and Software Competitions

    CERN Document Server

    Zhang, Minjie; Robu, Valentin; Matsuo, Tokuro

    2013-01-01

Complex Automated Negotiations are a widely studied, emerging area in the field of Autonomous Agents and Multi-Agent Systems. In general, automated negotiations can be complex, since there are many factors that characterize such negotiations. For this book, we solicited papers on all aspects of such complex automated negotiations, which are studied in the field of Autonomous Agents and Multi-Agent Systems. This book includes two parts: Part I: Agent-based Complex Automated Negotiations, and Part II: Automated Negotiation Agents Competition. Each chapter in Part I is an extended version of an ACAN 2011 paper, peer-reviewed by three PC members. Part II covers ANAC 2011 (the Second Automated Negotiating Agents Competition), in which automated agents with different negotiation strategies, implemented by different developers, negotiate automatically in several negotiation domains. ANAC is an international competition in which automated negotiation strategies, submitted by a number of...

  6. Will the future of knowledge work automation transform personalized medicine?

    Directory of Open Access Journals (Sweden)

    Gauri Naik

    2014-09-01

Full Text Available Today, we live in a world of ‘information overload’ which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate ‘routine cognitive tasks’ for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments and have the ability to learn ‘on the fly’ is a significant step towards automation of knowledge work. The applications of this technology for high-throughput genomic analysis, database updating, reporting clinically significant variants, and diagnostic imaging purposes are explored using case studies.

  7. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that does attempt to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself is presented along with a description of how its automated functionality was implemented to facilitate model testing.

  8. Automation of data processing and calculation of retention parameters and thermodynamic data for gas chromatography

    Science.gov (United States)

    Makarycheva, A. I.; Faerman, V. A.

    2017-02-01

The analysis of automation patterns is performed and a programming solution is developed for automating the processing of chromatographic data and their subsequent storage, using Mathcad and MS Excel spreadsheets. The offered approach permits modification of the data-processing algorithm and does not require the participation of programming experts. The approach provides calculation of retention times and retention volumes, specific retention volumes, the differential molar free energy of adsorption, and partial molar solution enthalpies and isosteric heats of adsorption. The developed solution is aimed at use in a small research group and was tested on a series of new gas chromatography sorbents. More than 20 analytes were processed to calculate retention parameters and thermodynamic sorption quantities. The obtained data are provided in a form suitable for comparative analysis and make it possible to find sorbents with the most favorable properties for solving specific analytical problems.
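
The retention parameters and thermodynamic quantities listed above follow standard gas chromatography arithmetic. The Python sketch below stands in for the Mathcad/Excel workflow with example numbers; the exact flow corrections and standard states used by the authors are not given in the abstract.

```python
# Worked sketch of common GC retention-parameter arithmetic; conventions for
# flow correction and standard states vary, so this is illustrative only.
import math

R = 8.314          # J/(mol*K)

def specific_retention_volume(t_r, t_m, flow_ml_min, j, t_col_k, mass_g):
    """Vg = j * F * (tR - tM) * 273.15 / (T_col * m), mL per gram of sorbent."""
    v_net = j * flow_ml_min * (t_r - t_m)       # net retention volume, mL
    return v_net * 273.15 / (t_col_k * mass_g)

vg = specific_retention_volume(t_r=4.80, t_m=0.65, flow_ml_min=30.0,
                               j=0.92, t_col_k=393.15, mass_g=1.5)
# Differential molar free energy of adsorption; the chosen reference state
# sets an additive constant, so only differences between analytes compare.
dG = -R * 393.15 * math.log(vg)
print(f"Vg = {vg:.1f} mL/g, dG = {dG/1000:.2f} kJ/mol")
```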

  9. The R package EchoviewR for automated processing of active acoustic data using Echoview

    Directory of Open Access Journals (Sweden)

    Lisa-Marie Katarina Harrison

    2015-02-01

Full Text Available Acoustic data are time-consuming to process due to the large data size and the requirement to undertake some data processing steps manually. Manual processing may introduce subjective, irreproducible decisions into the data processing workflow, reducing consistency in processing between surveys. We introduce the R package EchoviewR as an interface between R and Echoview, a commercially available acoustic processing software package. EchoviewR allows for automation of Echoview using scripting, which can drastically reduce the manual work required when processing acoustic surveys. This package plays an important role in reducing subjectivity in acoustic data processing by allowing exactly the same process to be applied automatically to multiple surveys and documenting where subjective decisions have been made. Using data from a survey of Antarctic krill, we provide two examples of using EchoviewR: krill estimation and swarm detection.

  10. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  11. Welding process automation in power machine building

    International Nuclear Information System (INIS)

    Mel'bard, S.N.; Shakhnov, A.F.; Shergov, I.V.

    1977-01-01

The level of automation of welding operations in power machine building and ways of enhancing it are highlighted. Examples of complex automation include an apparatus for the horizontal welding of turbine rotors, a remotely controlled automatic machine for welding the ring joints of large-sized vessels, and equipment for the electron-beam welding of alloyed-steel steam turbine assemblies. The prospects of industrial robots are noted. The importance of the complex automation of the technological process, including stocking, assembling, transportation and auxiliary operations, is emphasized

  12. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    Science.gov (United States)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
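
The core idea, compiling a high-level boolean specification into PLC logic, can be sketched compactly. Real ladder logic is graphical, so the toy translator below emits IEC 61131-3 instruction-list-style text instead; the one-line specification grammar is invented for illustration and this is not the KSC tool.

```python
# Toy translator from a one-line boolean spec to instruction-list-style PLC
# code, illustrating the spec-to-PLC idea. The mini-grammar is invented.
def translate(rung_spec: str) -> list:
    """'PUMP = START AND NOT STOP' -> ['LD START', 'ANDN STOP', 'ST PUMP']"""
    lhs, rhs = [s.strip() for s in rung_spec.split("=")]
    code, pending_op, negate = [], "LD", False
    for tok in rhs.split():
        if tok == "NOT":
            negate = True
        elif tok in ("AND", "OR"):
            pending_op = tok
        else:  # an input contact
            op = pending_op + ("N" if negate else "")
            code.append(f"{op} {tok}")
            negate = False
    code.append(f"ST {lhs}")
    return code

for line in translate("PUMP = START AND NOT STOP"):
    print(line)
```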

  13. Evaluation of automated image analysis software for the detection of diabetic retinopathy to reduce the ophthalmologists' workload.

    Science.gov (United States)

    Soto-Pedre, Enrique; Navea, Amparo; Millan, Saray; Hernaez-Ortega, Maria C; Morales, Jesús; Desco, Maria C; Pérez, Pablo

    2015-02-01

To assess the safety and workload reduction of an automated 'disease/no disease' grading system for diabetic retinopathy (DR) within a systematic screening programme. A single 45° macular field image per eye was obtained from consecutive patients attending a regional primary-care-based DR screening programme in Valencia (Spain). The sensitivity and specificity of the automated system, operating as a grader that flags disease presence on detection of one or more microaneurysms, were determined relative to manual grading as the gold standard. Data on age, gender and diabetes mellitus were also recorded. A total of 5278 patients with diabetes were screened. The median age and duration of diabetes were 69 years and 6.9 years, respectively. The estimated prevalence of DR was 15.6%. The software classified 43.9% of the patients as having no DR and 26.1% as having ungradable images. Detection of DR was achieved with 94.5% sensitivity (95% CI 92.6–96.5) and 68.8% specificity (95% CI 67.2–70.4). The overall accuracy of the automated system was 72.5% (95% CI 71.1–73.9). The present retinal image processing algorithm, which can act as a prefilter to flag images with pathological lesions, can be implemented in practice. Our results suggest that it could be considered when implementing DR screening programmes. © 2014 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  14. MOST-visualization: software for producing automated textbook-style maps of genome-scale metabolic networks.

    Science.gov (United States)

    Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S

    2017-08-15

Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that provide ease in identification of pathways, reactions and metabolites; and (iii) visualization of the entire network to show how pathways are interconnected. No software currently exists for visualizing GEMs that satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. dslun@rutgers.edu. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  15. A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data

    Science.gov (United States)

    Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.

    2011-01-01

A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction, in addition to the volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and, under proper conditions, is fully automated and requires no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author if further interested. This software differentiates itself from other possible re-slicing solutions through its complete automation and advanced processing and analysis capabilities.
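
The unwrap-and-reslice method described above amounts to resampling each slice from Cartesian (x, y) onto polar (radius, angle) coordinates so that a cylinder wall becomes a flat sheet. A simplified nearest-neighbor sketch of that resampling, not the NASA software, is shown below.

```python
# Simplified sketch of unwrapping a CT slice from (x, y) to (radius, angle)
# so a cylinder wall becomes a flat sheet. Nearest-neighbor, illustrative only.
import numpy as np

def unwrap_slice(slice2d, center, r_min, r_max, n_r=64, n_theta=360):
    cy, cx = center
    rs = np.linspace(r_min, r_max, n_r)
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    r_grid, t_grid = np.meshgrid(rs, thetas, indexing="ij")
    ys = np.round(cy + r_grid * np.sin(t_grid)).astype(int)
    xs = np.round(cx + r_grid * np.cos(t_grid)).astype(int)
    ys = np.clip(ys, 0, slice2d.shape[0] - 1)
    xs = np.clip(xs, 0, slice2d.shape[1] - 1)
    return slice2d[ys, xs]          # shape (n_r, n_theta): radius x angle

# Synthetic slice with a bright ring (a cylinder wall in cross-section).
yy, xx = np.mgrid[0:128, 0:128]
radius = np.hypot(xx - 64, yy - 64)
ring = ((radius > 40) & (radius < 44)).astype(float)
sheet = unwrap_slice(ring, center=(64, 64), r_min=30, r_max=50)
print(sheet.shape)                  # (64, 360); the ring is now flat rows
```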

  16. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project, designed and implemented to automate a house's electrical appliances and to provide a security system that detects unexpected behavior.

  17. Suitability of semi-automated tumor response assessment of liver metastases using a dedicated software package

    International Nuclear Information System (INIS)

    Kalkmann, Janine; Ladd, S.C.; Greiff, A. de; Forsting, M.; Stattaus, J.

    2010-01-01

Purpose: to evaluate the suitability of semi-automated compared to manual tumor response assessment (TRA) of liver metastases. Materials and methods: in total, 32 patients with colorectal cancer and liver metastases were followed by an average of 2.8 contrast-enhanced CT scans. Two observers (O1, O2) measured the longest diameter (LD) of 269 liver metastases manually and semi-automatically using software installed as thin-client on a PACS workstation (LMS-Liver, MEDIAN Technologies). LD and TRA ('progressive', 'stable', 'partial remission') were performed according to RECIST (Response Evaluation Criteria in Solid Tumors) and analyzed for between-method, interobserver and intraobserver variability. The time needed for evaluation was compared for both methods. Results: all measurements correlated excellently (r ≥ 0.96). Intraobserver (semi-automated), interobserver (manual) and between-method differences (by O1) in LD of 1.4 ± 2.6 mm, 1.9 ± 1.9 mm and 2.1 ± 2.0 mm, respectively, were not significant. Interobserver (semi-automated) and between-method (by O2) differences in LD of 3.0 ± 3.0 mm and 2.6 ± 2.0 mm, respectively, reflected a significant variability (p < 0.01). The interobserver agreement in manual and semi-automated TRA was 91.4%. The intraobserver agreement in semi-automated TRA was 84.5%. Between both methods a TRA agreement of 86.2% was obtained. Semi-automated evaluation (2.7 min) took slightly more time than manual evaluation (2.3 min). Conclusion: semi-automated and manual evaluation of liver metastases yield comparable results in response assessments and require comparable effort. (orig.)

  18. Rules-based analysis with JBoss Drools: adding intelligence to automation

    International Nuclear Information System (INIS)

    Ley, E. de; Jacobs, D.

    2012-01-01

Rule engines are specialized software systems for applying conditional actions (if/then rules) to data. They are also known as 'production rule systems'. Rule engines are less well known as a software technology than traditional procedural, object-oriented, scripting or dynamic development languages. This is a pity, as their usage may offer an important enrichment to a development toolbox. JBoss Drools is an open-source rules engine that can easily be embedded in any Java application. Through an integration in our Passerelle process automation suite, we have been able to provide advanced solutions for intelligent process automation, complex event processing, system monitoring and alarming, automated repair, etc. This platform has been proven over many years as an automated diagnosis and repair engine for Belgium's largest telecom provider, and it is being piloted at Synchrotron Soleil for device monitoring and alarming. After an introduction to rule engines in general and JBoss Drools in particular, we will present its integration in a solution platform, some important principles and a practical use case. (authors)
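
Drools itself is a Java library, but the production-rule idea, conditional actions repeatedly applied to a set of facts until nothing more fires, fits in a few lines of any language. The toy forward-chaining sketch below is purely conceptual and has no relation to Drools' actual API.

```python
# Toy production-rule engine illustrating the if/then style of systems like
# Drools. Concept sketch only; rules and facts here are invented.
rules = [
    # (name, condition on facts, action on facts)
    ("high_temp_alarm",
     lambda f: f.get("temp_c", 0) > 80 and not f.get("alarm"),
     lambda f: f.update(alarm=True)),
    ("alarm_notifies",
     lambda f: f.get("alarm") and not f.get("notified"),
     lambda f: f.update(notified=True)),
]

def run(facts: dict, max_cycles=10) -> dict:
    """Fire rules until no condition matches (naive forward chaining)."""
    for _ in range(max_cycles):
        fired = False
        for name, cond, action in rules:
            if cond(facts):
                action(facts)
                fired = True
        if not fired:
            break
    return facts

print(run({"temp_c": 95}))   # {'temp_c': 95, 'alarm': True, 'notified': True}
```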

  19. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  20. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  1. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    Science.gov (United States)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network

  2. Application of neural network and pattern recognition software to the automated analysis of continuous nuclear monitoring of on-load reactors

    Energy Technology Data Exchange (ETDEWEB)

    Howell, J.A.; Eccleston, G.W.; Halbig, J.K.; Klosterbuer, S.F. [Los Alamos National Lab., NM (United States); Larson, T.W. [California Polytechnic State Univ., San Luis Obispo, CA (US)

    1993-08-01

    Automated analysis using pattern recognition and neural network software can help interpret data, call attention to potential anomalies, and improve safeguards effectiveness. Automated software analysis, based on pattern recognition and neural networks, was applied to data collected from a radiation core discharge monitor system located adjacent to an on-load reactor core. Unattended radiation sensors continuously collect data to monitor on-line refueling operations in the reactor. The huge volume of data collected from a number of radiation channels makes it difficult for a safeguards inspector to review it all, check for consistency among the measurement channels, and find anomalies. Pattern recognition and neural network software can analyze large volumes of data from continuous, unattended measurements, thereby improving and automating the detection of anomalies. The authors developed a prototype pattern recognition program that determines the reactor power level and identifies the times when fuel bundles are pushed through the core during on-line refueling. Neural network models were also developed to predict fuel bundle burnup to calculate the region on the on-load reactor face from which fuel bundles were discharged based on the radiation signals. In the preliminary data set, which was limited and consisted of four distinct burnup regions, the neural network model correctly predicted the burnup region with an accuracy of 92%.
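
    A minimal sketch of the kind of neural-network classifier described, mapping multichannel radiation signals to one of four burnup regions; the data here are synthetic and scikit-learn stands in for the original models:

        # Illustrative sketch only: a small neural-network classifier mapping
        # radiation signals to one of four burnup regions, trained on synthetic
        # data. The original work used its own models and real monitor data.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        n_channels = 8                    # radiation measurement channels (assumed)
        X = rng.normal(size=(400, n_channels))
        y = rng.integers(0, 4, size=400)  # four distinct burnup regions

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        clf.fit(X_tr, y_tr)
        print(f"burnup-region accuracy: {clf.score(X_te, y_te):.2f}")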

  3. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  4. Automated Subsystem Control for Life Support System (ASCLSS)

    Science.gov (United States)

    Block, Roger F.

    1987-01-01

    The Automated Subsystem Control for Life Support Systems (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of space station subsystems. The automation system features a hierarchical and distributed real-time control architecture which places maximum control authority at the lowest or process control level, enhancing system autonomy. The ASCLSS demonstration system pioneered many automation and control concepts currently being considered in the space station data management system (DMS). Heavy emphasis is placed on controls hardware and software commonality implemented in accepted standards. The approach successfully demonstrates the application of real-time process control and accountability with the subsystem or process developer. The ASCLSS system completely automates a space station subsystem (the air revitalization group of the ASCLSS), moving the crew/operator into a role of supervisory control authority. The ASCLSS program developed over 50 lessons learned which will aid future space station developers in the area of automation and controls.

  5. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    International Nuclear Information System (INIS)

    Lee, Myung Eun; Kim, Jong Hyo; Woo, Bo Yeong; Ko, Micheal D.; Jamshidi, Neema

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.
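
    The stability screen reduces to computing an intra-class correlation coefficient per feature and thresholding it. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single measures), a common choice for two-rater studies; the paper does not state which ICC variant was used:

        import numpy as np

        def icc_2_1(ratings: np.ndarray) -> float:
            """ICC(2,1) for a (n_targets, k_raters) array of measurements."""
            n, k = ratings.shape
            grand = ratings.mean()
            row_means = ratings.mean(axis=1)
            col_means = ratings.mean(axis=0)
            msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between targets
            msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between raters
            sse = ((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
            mse = sse / ((n - 1) * (k - 1))                        # residual
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        # Screen one feature: 'good' stability at ICC >= 0.8, 'poor' below 0.5.
        rng = np.random.default_rng(1)
        feature_values = rng.normal(size=(45, 2))   # 45 tumors x 2 raters (synthetic)
        icc = icc_2_1(feature_values)
        label = "good" if icc >= 0.8 else ("poor" if icc < 0.5 else "moderate")
        print(f"ICC = {icc:.2f} -> {label} stability")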

  6. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of nuclear Instrumentation and Control (I and C) systems requires high reliability of not only hardware but also software. A Verification and Validation (V and V) process is recommended for software reliability, but a more quantitative method such as software testing is also necessary. Most software in nuclear I and C systems is safety-critical embedded software. Safety-critical embedded software is specified, verified and developed according to the V and V process. Hence two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code. Second, after code-based software testing, testing of the software as affected by hardware is required to reveal interaction faults that may cause unexpected results. We call this testing of hardware's influence on software 'interaction testing'. In the case of safety-critical embedded software, it is important to consider the interaction between hardware and software: even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study, and show its effectiveness. (author)

  7. Software Engineering for Human Spaceflight

    Science.gov (United States)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  8. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae [KEPCO E and C, Daejeon (Korea, Republic of)

    2011-08-15

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes, composed of a development process, V/V process, requirements traceability process, automated document generation process, and target importing process to the Programmable Logic Controller (PLC) platform. This provides safety-critical software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort to develop software, such as debugging, simulation, code generation and document generation. It also provides safety-critical software verifiers with integrated V/V features for each phase of the software life cycle using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source code from ISODE is imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  9. Development of Safety-Critical Software for Nuclear Power Plant using a CASE Tool

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Koh Eun; Choi, Woong Seock; Sohn, Se Do; Kim, Jae Hack; Kim, Hang Bae

    2011-01-01

    The Integrated SOftware Development Environment (ISODE) is developed to provide the major S/W life cycle processes, composed of a development process, V/V process, requirements traceability process, automated document generation process, and target importing process to the Programmable Logic Controller (PLC) platform. This provides safety-critical software developers with a certified, domain-optimized, model-based development environment, and the associated services to reduce the time and effort to develop software, such as debugging, simulation, code generation and document generation. It also provides safety-critical software verifiers with integrated V/V features for each phase of the software life cycle using appropriate tools such as model test coverage, formal verification, and automated report generation. In addition to development and verification, ISODE gives a complete traceability solution from the SW design phase to the testing phase. Using this information, coverage and impact analysis can be done easily whenever software modification is necessary. The final source code from ISODE is imported into the newly developed PLC environment on a module basis, after being automatically converted into the format required by the PLC. Additional tests at the module and unit level are performed on the target platform.

  10. Software quality: Process or people

    Science.gov (United States)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  11. FUZZY LOGIC BASED SOFTWARE PROCESS IMPROVIZATION FRAMEWORK FOR INDIAN SMALL SCALE SOFTWARE ORGANIZATIONS

    OpenAIRE

    A.M.Kalpana; Dr.A.Ebenezer Jeyakumar

    2010-01-01

    In this paper, the authors elaborate the results obtained after analyzing and assessing the software process activities in five small to medium sized Indian software companies. This work demonstrates a cost effective framework for software process appraisal, specificallytargeted at Indian software Small-to-Medium-sized Enterprises (SMEs). Improvisation deals with the unforeseen. It involves continual experimentation with new possibilities to create innovative and improved solutions outside cu...

  12. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings that correctly segmented 67% of the total possible diatom valves and fragments from broad fields of view (183 light microscope images were examined containing 255 diatom particles; of these, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software). Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, highlighting an approximate five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has the potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.
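
    A minimal sketch of threshold-based particle segmentation and counting, standing in for the VisualSpreadsheet workflow; the threshold, minimum area, and synthetic image are illustrative, not the study's settings:

        # Threshold-and-label particle counting sketch using scipy.ndimage;
        # the image and all numeric settings below are invented illustrations.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        image = rng.normal(loc=200, scale=5, size=(256, 256))  # bright background
        image[40:60, 50:120] = 90                              # dark "valve"
        image[150:170, 30:80] = 100                            # dark "fragment"

        mask = image < 150                 # diatoms darker than background (assumed)
        labels, n_candidates = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, index=range(1, n_candidates + 1))
        keep = sizes > 50                  # drop specks below a minimum area
        print(f"particles detected: {int(keep.sum())} of {n_candidates} candidates")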

  13. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    Science.gov (United States)

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
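
    A sketch of the rule-based checking such a QC module performs, writing findings to a text summary; the field names, ranges, and messages are hypothetical illustrations, not the module's actual rules:

        # Hypothetical QC rules in the spirit of the module described above:
        # verify user-entered values, acquisition completeness and ROI size,
        # then summarize any findings in a text file.
        def qc_check(study: dict) -> list[str]:
            findings = []
            if not 1.0 <= study.get("injected_dose_mCi", 0) <= 15.0:
                findings.append("possible error in user-entered dose")
            if study.get("frames_expected", 0) != study.get("frames_acquired", 0):
                findings.append("missing frames in the dynamic image set")
            if study.get("kidney_roi_area_px", 0) < 100:
                findings.append("unusually small kidney region of interest")
            if study.get("relative_uptake_pct", 50) < 0:
                findings.append("unreliable calculated value: negative relative uptake")
            return findings

        study = {"injected_dose_mCi": 0.2, "frames_expected": 59,
                 "frames_acquired": 57, "kidney_roi_area_px": 4200}
        with open("qc_summary.txt", "w") as fh:   # text file summarizing QC findings
            fh.write("\n".join(qc_check(study)) or "no QC events")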

  14. Transformation as a Design Process and Runtime Architecture for High Integrity Software

    Energy Technology Data Exchange (ETDEWEB)

    Bespalko, S.J.; Winter, V.L.

    1999-04-05

    We have discussed two aspects of creating high integrity software that greatly benefit from the availability of transformation technology, which in this case is manifested by the requirement for a sophisticated backtracking parser. First, because of the potential for correctly manipulating programs via small changes, an automated non-procedural transformation system can be a valuable tool for constructing high assurance software. Second, modeling the process of translating data into information as a (perhaps context-dependent) grammar leads to an efficient, compact implementation. From a practical perspective, the transformation process should begin in the domain language in which a problem is initially expressed. Thus, in order for a transformation system to be practical, it must be flexible with respect to domain-specific languages. We have argued that transformation applied to specification results in a highly reliable system. We also attempted to briefly demonstrate that transformation technology applied to the runtime environment will result in a safe and secure system. We thus believe that sophisticated multi-lookahead backtracking parsing technology is central to the task of demonstrating the existence of high integrity software (HIS).

  15. Software engineering processes for Class D missions

    Science.gov (United States)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  16. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  17. Automating the Human Factors Engineering and Evaluation Processes

    International Nuclear Information System (INIS)

    Mastromonico, C.

    2002-01-01

    The Westinghouse Savannah River Company (WSRC) has developed a software tool for automating the Human Factors Engineering (HFE) design review, analysis, and evaluation processes. The tool provides a consistent, cost effective, graded, user-friendly approach for evaluating process control system Human System Interface (HSI) specifications, designs, and existing implementations. The initial set of HFE design guidelines, used in the tool, was obtained from NUREG-0700. Each guideline was analyzed and classified according to its significance (general concept vs. supporting detail), the HSI technology (computer based vs. non-computer based), and the HSI safety function (safety vs. non-safety). Approximately 10 percent of the guidelines were determined to be redundant or obsolete and were discarded. The remaining guidelines were arranged in a Microsoft Access relational database, and a Microsoft Visual Basic user interface was provided to facilitate the HFE design review. The tool also provides the capability to add new criteria to accommodate advances in HSI technology and incorporate lessons learned. Summary reports produced by the tool can be easily ported to Microsoft Word and other popular PC office applications. An IBM compatible PC with Microsoft Windows 95 or higher is required to run the application.
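
    A sketch of the three-axis guideline classification as simple records with a graded filter; the classification axes come from the abstract, while the example guidelines and query are invented:

        # Hypothetical rendering of the guideline classification scheme; the
        # three axes are from the abstract, the rows and query are illustrative.
        from dataclasses import dataclass

        @dataclass
        class Guideline:
            text: str
            significance: str   # "general concept" or "supporting detail"
            technology: str     # "computer based" or "non-computer based"
            function: str       # "safety" or "non-safety"

        catalog = [
            Guideline("Alarms shall be distinguishable at a glance.",
                      "general concept", "computer based", "safety"),
            Guideline("Use consistent abbreviations on labels.",
                      "supporting detail", "non-computer based", "non-safety"),
        ]

        # Graded review: pull only safety-related, computer-based general concepts.
        selected = [g for g in catalog
                    if g.function == "safety" and g.technology == "computer based"
                    and g.significance == "general concept"]
        print(len(selected), "guidelines selected for this review")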

  18. Software Tool for Automated Failure Modes and Effects Analysis (FMEA) of Hydraulic Systems

    DEFF Research Database (Denmark)

    Stecki, J. S.; Conrad, Finn; Oh, B.

    2002-01-01

    Offshore, marine, aircraft and other complex engineering systems operate in harsh environmental and operational conditions and must meet stringent requirements of reliability, safety and maintainability. To reduce the high costs of development of new systems in these fields, improved design… management techniques and a vast array of computer-aided techniques are applied during design and testing stages. The paper presents and discusses the research and development of a software tool for automated failure mode and effects analysis - FMEA - of hydraulic systems. The paper explains the underlying…
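
    As generic background, FMEA commonly ranks failure modes by a risk priority number (RPN = severity x occurrence x detection); the sketch below shows this textbook calculation, which the paper's tool may or may not use:

        # Textbook FMEA ranking by risk priority number; the components, modes
        # and 1-10 ratings below are invented for illustration.
        failure_modes = [
            # (component, failure mode, severity, occurrence, detection)
            ("pump", "loss of flow", 9, 3, 4),
            ("relief valve", "stuck closed", 10, 2, 6),
            ("hose", "external leakage", 5, 6, 2),
        ]

        ranked = sorted(failure_modes, key=lambda fm: fm[2] * fm[3] * fm[4],
                        reverse=True)
        for component, mode, s, o, d in ranked:
            print(f"{component:12s} {mode:18s} RPN = {s * o * d}")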

  19. Flexible Software Process Lines in Practice

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2016-01-01

    Process flexibility and adaptability is a frequently discussed topic in literature, and several approaches propose techniques to improve and optimize software processes for a given organization or project context. A software process line (SPrL) is an instrument to systematically construct … that can be adapted to the respective context. In this article, we present an approach to construct flexible software process lines and show its practical application in the German V-Modell XT. The presented approach emerges from a 10-year research endeavor and was used to enhance the metamodel of the V-Modell XT and to allow for improved process variability and lifecycle management. Practical dissemination and complementing empirical research show the suitability of the concept. We therefore contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further…

  20. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered as relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users’ opinions regarding the lack of usability in their software systems. The ‘cost obstacle’ refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some…

  1. Multi-platform Automated Software Building and Packaging

    International Nuclear Information System (INIS)

    Rodriguez, A Abad; Gomes Gouveia, V E; Meneses, D; Capannini, F; Aimar, A; Di Meglio, A

    2012-01-01

    One of the major goals of the EMI (European Middleware Initiative) project is the integration of several components of the pre-existing middleware (ARC, gLite, UNICORE and dCache) into a single consistent set of packages with uniform distributions and repositories. Those individual middleware projects have been developed over the last decade by tens of development teams and, before EMI, were all built and tested using different tools and dedicated services. The software, millions of lines of code, is written in several programming languages and supports multiple platforms. Therefore a viable solution ought to be able to build and test applications in multiple programming languages using common dependencies on all selected platforms. It should, in addition, package the resulting software in formats compatible with the popular Linux distributions, such as Fedora and Debian, and store them in repositories from which all EMI software can be accessed and installed in a uniform way. Despite this highly heterogeneous initial situation, a single common solution, with the aim of quickly automating the integration of the middleware products, had to be selected and implemented within a few months of the beginning of the EMI project. Because of the previous knowledge and the short time available in which to provide this common solution, the ETICS service, where the gLite middleware had already been built for years, was selected. This contribution describes how the team in charge of providing a common EMI build and packaging infrastructure to the whole project has developed a homogeneous solution for releasing and packaging the EMI components, starting from the initial set of tools used by the earlier middleware projects. An important element of the presentation is the developers' experience and feedback on converging on ETICS, and the ongoing work to add more widely used and supported build and packaging solutions for the Linux platforms.

  2. Automation from pictures

    International Nuclear Information System (INIS)

    Kozubal, A.J.

    1992-01-01

    The state transition diagram (STD) model has been helpful in the design of real time software, especially with the emergence of graphical computer aided software engineering (CASE) tools. Nevertheless, the translation of the STD to real time code has in the past been primarily a manual task. At Los Alamos we have automated this process. The designer constructs the STD using a CASE tool (Cadre Teamwork) using a special notation for events and actions. A translator converts the STD into an intermediate state notation language (SNL), and this SNL is compiled directly into C code (a state program). Execution of the state program is driven by external events, allowing multiple state programs to effectively share the resources of the host processor. Since the design and the code are tightly integrated through the CASE tool, the design and code never diverge, and we avoid design obsolescence. Furthermore, the CASE tool automates the production of formal technical documents from the graphic description encapsulated by the CASE tool. (author)
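
    A sketch of what such a generated state program does at run time, rendered in Python for illustration (the actual translator emits C); the states, events, and actions are invented:

        # Event-driven state program sketch: a transition table mapping
        # (state, event) -> (next state, action), executed as events arrive.
        TRANSITIONS = {
            ("idle",    "start_cmd"):   ("ramping", "enable_supply"),
            ("ramping", "at_setpoint"): ("running", "close_loop"),
            ("running", "fault"):       ("idle",    "disable_supply"),
        }

        def run(events):
            state = "idle"
            for event in events:   # execution is driven by external events
                state, action = TRANSITIONS.get((state, event), (state, None))
                if action:
                    print(f"{event!r}: -> {state} (action: {action})")

        run(["start_cmd", "at_setpoint", "fault"])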

  3. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.

  4. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1988-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3×10⁹ bits per 14x17 inch film. This is equivalent to 2200 computer floppy disks. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14x17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the majority of image content needs. (Author). 4 refs.; 21 figs.
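
    A back-of-envelope check of the image-content figure; the 25 micron pixel size and 12-bit depth below are assumptions chosen because they roughly reproduce the quoted 3×10⁹ bits:

        # Film digitization data volume; pixel size and bit depth are assumed
        # values, picked to match the abstract's ~3e9-bit estimate.
        MM_PER_INCH = 25.4

        def film_bits(width_in, height_in, pixel_um, bits_per_px):
            px_w = width_in * MM_PER_INCH * 1000 / pixel_um
            px_h = height_in * MM_PER_INCH * 1000 / pixel_um
            return px_w * px_h * bits_per_px

        print(f"{film_bits(14, 17, 25, 12):.2e} bits")  # ~2.9e9, close to the ~3e9 quoted
        print(f"{film_bits(14, 17, 50, 12):.2e} bits")  # ~7.4e8 at the 50-micron FDRS target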

  5. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel; Tekinerdogan, B.; van den Broek, P.M.; Saeki, M.; Hruby, P.; Sunye, G.

    2001-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software

  6. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Frohner, A´ kos; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel

    2002-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software

  7. Process computers automate CERN power supply installations

    International Nuclear Information System (INIS)

    Ullrich, H.; Martin, A.

    1974-01-01

    Higher standards of performance and reliability in the power plants of large particle accelerators necessitate increasing use of automation. The CERN (European Nuclear Research Centre) in Geneva started to employ process computers for plant automation at an early stage in its history. The great complexity and extent of the plants for high-energy physics first led to the setting-up of decentralized automatic systems which are now being increasingly combined into one interconnected automation system. One of these automatic systems controls and monitors the extensive power supply installations for the main ring magnets in the experimental zones. (orig.) [de]

  8. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myung Eun; Kim, Jong Hyo [Center for Medical-IT Convergence Technology Research, Advanced Institutes of Convergence Technology, Seoul National University, Suwon (Korea, Republic of); Woo, Bo Yeong [Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon (Korea, Republic of); Ko, Micheal D.; Jamshidi, Neema [Dept. of Radiological Sciences, University of California, Los Angeles, Los Angeles (United States)

    2017-06-15

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability; thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  9. 4th International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Calvo-Manzano, Jose

    2016-01-01

    This book contains a selection of papers from The 2015 International Conference on Software Process Improvement (CIMPS’15), held between the 28th and 30th of October in Mazatlán, Sinaloa, México. The CIMPS’15 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the several perspectives of Software Engineering, with a clear relationship to, but not limited to, software processes, security in Information and Communication Technology, and the Big Data field. The main topics covered are: Organizational Models, Standards and Methodologies, Knowledge Management, Software Systems, Applications and Tools, Information and Communication Technologies and Processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a demonstrated relationship to software process challenges.

  10. Introduction to adoption of lean canvas in software test architecture design

    Directory of Open Access Journals (Sweden)

    Padmaraj Nidagundi

    2017-01-01

    Full Text Available The growth of software-dependent businesses, as well as the use of electronic devices in daily life, brings new challenges requiring software to work error-free all the time. To achieve this goal, software needs to be sufficiently and effectively tested during various development phases. Most software development companies make great efforts in testing, yet error-free software remains difficult to achieve. Different software development methodologies (e.g., traditional waterfall, agile) brought a new dimension to both development and testing, introducing new technologies and tools. In software test automation, the test architecture design plays a key role in managing written test cases and effectively executing them. A more effective software test automation architecture design saves resources and effort and reduces technical debt. This paper presents the possibilities of using lean canvas in the design of software test architecture.

  11. AUTOMATION DESIGN FOR MONORAIL - BASED SYSTEM PROCESSES

    Directory of Open Access Journals (Sweden)

    Bunda BESA

    2016-12-01

    Full Text Available Currently, conventional methods of decline development put enormous cost pressure on the profitability of mining operations. This is the case with narrow-vein ore bodies, where current methods and mine designs for decline development may be too expensive to support economic extraction of the ore. According to studies, the time it takes to drill, clean and blast an end in conventional decline development can be up to 224 minutes. This is because once an end is blasted, cleaning must first be completed before drilling can commence, resulting in low advance rates per shift. Improvements in advance rates during decline development can be achieved by application of the Electric Monorail Transport System (EMTS)-based drilling system. The system consists of drilling and loading components that use monorail technology to drill and clean the face during decline development. The two systems work simultaneously at the face in such a way that as the top part of the face is being drilled, the pneumatic loading system cleans the face. However, to improve the efficiency of the two systems, critical processes performed by the two systems during mining operations must be automated. Automation increases safety and productivity, reduces operator fatigue, and also reduces the labour costs of the system. The aim of this paper is, therefore, to describe automation designs for the two processes performed by the monorail drilling and loading systems during operations. During automation design, critical processes performed by the two systems and the control requirements necessary to allow the two systems to execute such processes automatically have also been identified.

  12. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
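
    For background, the classic back-exchange correction computes deuterium uptake from undeuterated and fully deuterated controls; the sketch below shows that textbook correction, not the platform's internal-standard scheme:

        # Classic back-exchange correction for HDX MS (Zhang & Smith style),
        # shown as generic background; the platform above uses its own
        # comprehensive HDX model with internal standards, not reproduced here.
        def deuterium_level(m_t, m_0pct, m_100pct, n_exchangeable):
            """Deuterons incorporated at time t, corrected for back exchange.

            m_t      -- measured centroid mass of the partially deuterated peptide
            m_0pct   -- centroid mass of the undeuterated control
            m_100pct -- centroid mass of the fully deuterated control
            """
            return (m_t - m_0pct) / (m_100pct - m_0pct) * n_exchangeable

        # Example with invented masses for a peptide with 10 exchangeable amides:
        d = deuterium_level(m_t=1254.3, m_0pct=1250.6, m_100pct=1259.8,
                            n_exchangeable=10)
        print(f"{d:.1f} deuterons incorporated")   # 4.0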

  13. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.

  14. Next Generation Software Process Improvement

    National Research Council Canada - National Science Library

    Turnas, Daniel

    2003-01-01

    .... The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Capability Maturity Model...

  15. Automated method of phasing difficult nuclear magnetic resonance spectra with application to the unsaturated carbon analysis of oils

    Energy Technology Data Exchange (ETDEWEB)

    Sterna, L.L.; Tong, V.P. (Shell Development Company, Houston, TX (USA). Westhollow Research Center)

    1991-08-01

    A new method for the automated phasing of n.m.r. spectra is described. The basis of the automation is that the software performs the phasing in the same fashion as a trained n.m.r. operator rather than using mathematical relationships between absorptive and dispersive spectra. The method is illustrated with processing of the ¹³C n.m.r. spectrum of a catalytic cracking feedstock. The software readily phased the spectrum even though the spectrum had very broad features and a significant baseline correction. The software performed well even when the time-domain data was left-shifted to introduce a large first-order phase error. The method was applied to measure the percentage of unsaturated carbon in hydrocarbons. Extensive tests were performed to compare automated processing with manual processing for this application; the automated method was found to give both better precision and accuracy. The method can be easily tailored to many other types of analyses. 9 refs., 4 figs., 3 tabs.
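
    A sketch of zero- and first-order phase correction with a simple grid search maximizing absorptive intensity; this scoring heuristic is an illustration only, whereas the paper's method mimics a trained operator rather than such a mathematical criterion:

        import numpy as np

        def phase(spectrum, phi0, phi1):
            """Apply zero-order (phi0) and first-order (phi1) phase, in radians."""
            x = np.linspace(-0.5, 0.5, spectrum.size)  # normalized frequency axis
            return spectrum * np.exp(1j * (phi0 + phi1 * x))

        def autophase(spectrum, n_grid=91):
            """Grid-search (phi0, phi1) maximizing total absorptive intensity."""
            best = (-np.inf, 0.0, 0.0)
            for phi0 in np.linspace(-np.pi, np.pi, n_grid):
                for phi1 in np.linspace(-np.pi, np.pi, n_grid):
                    score = phase(spectrum, phi0, phi1).real.sum()
                    if score > best[0]:
                        best = (score, phi0, phi1)
            return best[1], best[2]

        # Demo: a Lorentzian peak deliberately mis-phased by 1.2 rad.
        x = np.linspace(-0.5, 0.5, 512)
        spec = (1.0 / (1 + (200 * x) ** 2)) * np.exp(1j * 1.2)
        print(autophase(spec))   # approximately (-1.2, 0.0)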

  16. Automated processing of nuclear materials accounting data

    International Nuclear Information System (INIS)

    Straka, J.; Pacak, P.; Moravec, J.

    1980-01-01

    An automated system for nuclear materials accounting was developed in Czechoslovakia. The system automates data processing, including data storage. It comprises keeping records of inventories and material balance. In designing the system, the aim of the IAEA was taken into consideration, i.e., building a unified information system interconnected with state-run systems for accounting and checking nuclear materials in the signatory countries of the non-proliferation treaty. The nuclear materials accounting programs were written in PL-1 and were tested on an EC 1040 computer at UJV Rez, where the routine data processing also takes place. (B.S.)

  17. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    Title 10 (Energy), Unclassified Controlled Nuclear Information, Physical Protection Requirements, § 1017.28 Processing on Automated Information Systems (AIS): UCNI may be processed or produced on any AIS that complies with the guidance in OMB…

  18. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  19. Media Magic: Automating a K-12 Library Program in a Rural District.

    Science.gov (United States)

    Adams, Helen

    1994-01-01

    Describes the automation process in a library resources center in a small rural school district. Topics discussed include long-range planning; retrospective conversion for an online catalog; library automation software vendors; finances; training; time savings; CD-ROM products; telecomputing; computer literacy skills; professional development…

  20. Analysis and Automation of Calibration Process for Measurement Coils for Particle Accelerator Magnets

    CERN Document Server

    AUTHOR|(CDS)2226624; Azzam, Jamal; Chanthery, Elodie

    Various techniques are used to measure magnetic fields within the Magnetic Measurement (MM) section, all of which require the use of specific sensors. These sensors need to be calibrated in order to map the resulting signals to the characteristics of the magnetic field. Thus, the calibration procedure would benefit from being improved and made more stable, efficient, and technologically up-to-date. Consequently, to improve the efficiency of the calibration procedure and the work processes, it was first necessary to analyze them and identify potential issues and weaknesses to be further addressed. The calibration procedure mainly suffered from too many manual operations, which were potential sources of error, and from outdated readout and data acquisition software and hardware. Firstly, a new software program had to be developed using a C++ framework called Flexible Framework for Magnetic Measurements (FFMM), which would automate the data acquisition and analysis. Next, a feasibility study had to be done …

  1. A taxonomy and discussion of software attack technologies

    Science.gov (United States)

    Banks, Sheila B.; Stytz, Martin R.

    2005-03-01

    Software is a complex thing. It is not an engineering artifact that springs forth from a design by simply following software coding rules; creativity and the human element are at the heart of the process. Software development is part science, part art, and part craft. Design, architecture, and coding are equally important activities and in each of these activities, errors may be introduced that lead to security vulnerabilities. Therefore, inevitably, errors enter into the code. Some of these errors are discovered during testing; however, some are not. The best way to find security errors, whether they are introduced as part of the architecture development effort or coding effort, is to automate the security testing process to the maximum extent possible and add this class of tools to the tools available, which aids in the compilation process, testing, test analysis, and software distribution. Recent technological advances, improvements in computer-generated forces (CGFs), and results in research in information assurance and software protection indicate that we can build a semi-intelligent software security testing tool. However, before we can undertake the security testing automation effort, we must understand the scope of the required testing, the security failures that need to be uncovered during testing, and the characteristics of the failures. Therefore, we undertook the research reported in the paper, which is the development of a taxonomy and a discussion of software attacks generated from the point of view of the security tester with the goal of using the taxonomy to guide the development of the knowledge base for the automated security testing tool. The representation for attacks and threat cases yielded by this research captures the strategies, tactics, and other considerations that come into play during the planning and execution of attacks upon application software. The paper is organized as follows. Section one contains an introduction to our research

  2. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States.

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-02-01

    Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. A set of command line-based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion.
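
    A sketch of the wrapper idea: drive an external reconstruction program from Python via subprocess. The executable name, file names, and invocation below are hypothetical; WARACS's own command-line interface should be consulted for actual use:

        # Hypothetical wrapper in the spirit of WARACS: run an external
        # reconstruction tool with a prepared command script and capture output.
        import subprocess

        def run_bayestraits(tree_file: str, data_file: str, command_file: str) -> str:
            """Run BayesTraits on a tree/data pair, feeding commands via stdin."""
            with open(command_file) as fh:
                result = subprocess.run(
                    ["BayesTraitsV3", tree_file, data_file],  # assumed executable name
                    stdin=fh, capture_output=True, text=True, check=True,
                )
            return result.stdout

        # Example (hypothetical paths):
        # output = run_bayestraits("trees.nex", "states.txt", "commands.txt")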

  3. An automated methodology development. [software design for combat simulation

    Science.gov (United States)

    Hawley, L. R.

    1985-01-01

    The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.

  4. Automated screening for retinopathy

    Directory of Open Access Journals (Sweden)

    A. S. Rodin

    2014-07-01

    Full Text Available Retinal pathology is a common cause of irreversible loss of central vision commonly found among the senior population. Detection of the earliest signs of retinal disease can be facilitated by viewing retinal images available from telemedicine networks. To facilitate the processing of retinal images, screening software applications based on image recognition technology are currently at various stages of development. Purpose: To develop and implement computerized image recognition software that can be used as a decision support technology for retinal image screening for various types of retinopathies. Methods: The software application for retinal image recognition was developed using the C++ language. It was tested on a dataset of 70 images with various types of pathological features (age-related macular degeneration, chorioretinitis, central serous chorioretinopathy and diabetic retinopathy). Results: It was shown that the system can achieve a sensitivity of 73% and a specificity of 72%. Conclusion: Automated detection of macular lesions using the proposed software can significantly reduce the manual grading workload. In addition, automated detection of retinal lesions can be implemented as a clinical decision support system for telemedicine screening. It is anticipated that further development of this technology can become a part of a diagnostic image analysis system for electronic health records.
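
    Sensitivity and specificity follow directly from the confusion matrix; the counts below are illustrative values consistent with the reported 73%/72% on 70 images, not the study's actual tallies:

        # Screening performance from a confusion matrix; the counts are invented
        # but sum to 70 images and reproduce roughly 73% / 72%.
        def sens_spec(tp, fn, tn, fp):
            sensitivity = tp / (tp + fn)   # true-positive rate on pathological images
            specificity = tn / (tn + fp)   # true-negative rate on normal images
            return sensitivity, specificity

        sens, spec = sens_spec(tp=30, fn=11, tn=21, fp=8)
        print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")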

  5. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    Science.gov (United States)

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
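
    The version-control and automated-unit-test practices highlighted above translate directly into code. As a generic illustration (the tolerance rule and names are assumptions, not the paper's actual QC logic), a pytest-style test pins a quality-control calculation so that any later change to it is caught on the next automated run.

        def within_tolerance(measured: float, target: float, pct: float = 20.0) -> bool:
            """Generic QC rule: a control passes when the measured value falls
            within +/- pct percent of its target concentration."""
            return abs(measured - target) <= target * pct / 100.0

        # pytest collects and runs these automatically (e.g., on every commit).
        def test_control_inside_tolerance():
            assert within_tolerance(108.0, 100.0)

        def test_control_outside_tolerance():
            assert not within_tolerance(125.0, 100.0)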

  6. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ...secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In ... by ARL modelers. 2. Development Environment: The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected ... because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In ...

  7. A two-dimensionally coincident second difference cosmic ray spike removal method for the fully automated processing of Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Turner, Robin F B

    2014-01-01

    Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
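
    A minimal sketch of the core idea, assuming a 2-D array of spectra (rows: spatiotemporal frames; columns: spectral channels): a cosmic-ray spike produces a strongly negative second difference at the same position along both axes, so requiring that coincidence targets spikes while sparing genuine Raman bands, which are broad along at least one axis. The threshold scale factor here is an assumption, not the paper's value.

        import numpy as np

        def flag_spikes(spectra: np.ndarray, noise_sd: float, k: float = 8.0) -> np.ndarray:
            """Flag candidate cosmic-ray spikes in a 2-D Raman dataset
            (rows: spatiotemporal dimension, columns: spectral channels)."""
            # Second differences along each axis, padded so the result
            # aligns with the original array.
            d2_spec = np.zeros_like(spectra)
            d2_spat = np.zeros_like(spectra)
            d2_spec[:, 1:-1] = np.diff(spectra, n=2, axis=1)
            d2_spat[1:-1, :] = np.diff(spectra, n=2, axis=0)

            # A sharp positive spike gives a large negative second difference
            # at the same point along BOTH axes; the coincidence requirement
            # is what separates spikes from real bands.
            threshold = -k * noise_sd  # noise_sd estimated by a separate automated step
            return (d2_spec < threshold) & (d2_spat < threshold)

        # Flagged points would then be replaced, e.g., by interpolation
        # from unflagged neighbours.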

  8. Tool support for distributed software engineering

    NARCIS (Netherlands)

    Spanjers, H.; Ter Huurne, M.; Bendas, D.; Graaf, B.; Lormans, M.; Van Solingen, R.

    2006-01-01

    Developing a software system in collaboration with other partners, and on different geographical locations is a big challenge for organizations. In this article we first discuss a system that automates build and test processes: SoftFab. This system has been successfully applied in practice in the

  9. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality into software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing to properly understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability.
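
    As a toy illustration of the general idea (not the article's specific technique), a test suite can be written once against an abstract interface and then instantiated for every concrete implementation, so quality is measured at the interface rather than through a graphical user interface.

        import abc
        import unittest

        class Queue(abc.ABC):
            """The interface under test; implementations vary, the tests do not."""
            @abc.abstractmethod
            def push(self, item): ...
            @abc.abstractmethod
            def pop(self): ...

        class ListQueue(Queue):
            def __init__(self):
                self._items = []
            def push(self, item):
                self._items.append(item)
            def pop(self):
                return self._items.pop(0)

        def make_interface_suite(impl_factory):
            """Bind the interface-level tests to any Queue implementation."""
            class InterfaceTest(unittest.TestCase):
                def test_fifo_order(self):
                    q = impl_factory()
                    q.push(1)
                    q.push(2)
                    self.assertEqual(q.pop(), 1)
            return InterfaceTest

        # The same suite is reused for each implementation of the interface.
        ListQueueTest = make_interface_suite(ListQueue)

        if __name__ == "__main__":
            unittest.main()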

  10. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increase of automation in modern warehouses. This chapter describes Model-Driven Software

  11. Automation of radiation dosimetry using PTW dosemeter and LabVIEW™

    International Nuclear Information System (INIS)

    Weiss, C.; Al-Frouh, K.; Anjak, O.

    2011-01-01

    Automation of the UNIDOS dosemeter using a personal computer (PC) is discussed in this paper. In order to save time and eliminate human operation errors during radiation dosimetry, suitable software, written in the LabVIEW™ graphical programming language, was developed to automate and facilitate the processes of measurement, analysis and data storage. The software calculates the calibration factor of the ionization chamber in terms of air kerma or absorbed dose to water according to IAEA dosimetry protocols. It also has the ability to print a calibration certificate. The results obtained using this software are found to be more reliable and flexible than those obtained by the manual methods previously employed. Using LabVIEW™ as a development tool makes later software modifications and improvements considerably easier.
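
    The calibration factor mentioned above is, in essence, the reference dose quantity divided by the corrected instrument reading. A minimal sketch of that bookkeeping, assuming the usual IAEA-style temperature-pressure correction for a vented ionization chamber (the numbers are illustrative, not from the paper):

        def air_density_correction(t_c: float, p_kpa: float,
                                   t0: float = 20.0, p0: float = 101.325) -> float:
            """k_TP for a vented ionization chamber: corrects the reading
            to reference temperature and pressure (IAEA dosimetry formalism)."""
            return ((273.15 + t_c) / (273.15 + t0)) * (p0 / p_kpa)

        def calibration_factor(reference_dose: float, reading: float,
                               t_c: float, p_kpa: float) -> float:
            """N = D_ref / (M * k_TP): reference dose over corrected reading."""
            return reference_dose / (reading * air_density_correction(t_c, p_kpa))

        # Illustrative values: 1 Gy delivered, 18.5 nC read at 22 C, 100.8 kPa.
        print(calibration_factor(1.0, 18.5e-9, 22.0, 100.8), "Gy/C")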

  12. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. The implications of e-automated processes can extend

  13. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    Padgett, H.C.; Schmidt, D.G.; Bida, G.T.; Wieland, B.W.; Pekrul, E.; Kingsbury, W.G.

    1991-01-01

    With the recent emergence of positron emission tomography (PET) as a viable clinical tool, there is a need for a convenient, cost-effective source of positron-emitting radiotracers labeled with carbon-11, nitrogen-13, oxygen-15, and fluorine-18. These short-lived radioisotopes are accelerator-produced and thus require a cyclotron and radiochemistry processing instrumentation that can be operated in a clinical environment by competent technicians. The basic goal is to ensure safety and reliability while setting new standards for economy and ease of operation. The Siemens Radioisotope Delivery System (RDS 112) is a fully automated system dedicated to the production and delivery of positron emitter-labeled precursors and radiochemicals required to support a clinical PET imaging program. Thus, the entire RDS can be thought of as an automated radiochemical processing apparatus.

  14. A New Tool for Automated Data Collection and Complete On-site Flux Data Processing for Eddy Covariance Measurements

    Science.gov (United States)

    Begashaw, I. G.; Kathilankal, J. C.; Li, J.; Beaty, K.; Ediger, K.; Forgione, A.; Fratini, G.; Johnson, D.; Velgersdyk, M.; Hupp, J. R.; Xu, L.; Burba, G. G.

    2014-12-01

    The eddy covariance method is widely used for direct measurements of turbulent exchange of gases and energy between the surface and atmosphere. In the past, raw data were collected first in the field and then processed back in the laboratory to achieve fully corrected publication-ready flux results. This post-processing consumed a significant amount of time and resources, and precluded researchers from accessing near real-time final flux results. A new automated measurement system with novel hardware and software designs was developed, tested, and deployed starting late 2013. The major advancements with this automated flux system include: 1) enabling simultaneous logging of high-frequency three-dimensional wind speeds and multiple gas densities (CO2, H2O and CH4), low-frequency meteorological data, and site metadata through a specially designed file format; 2) conducting fully corrected, real-time on-site flux computations using conventional as well as user-specified methods, by implementing EddyPro Software on a small low-power microprocessor; 3) providing precision clock control and coordinate information for data synchronization and inter-site data comparison by incorporating GPS and the Precision Time Protocol. Along with these innovations, a data management server application was also developed to chart fully corrected real-time fluxes to assist remote system monitoring, to send e-mail alerts, and to automate data QA/QC, transfer and archiving at individual stations or on a network level. The combination of all of these functions was designed to help save a substantial amount of time and cost associated with managing a research site by eliminating post-field data processing, reducing user errors and facilitating real-time access to fully corrected flux results. The design, functionality, and test results from this new eddy covariance measurement tool will be presented.

  15. Organization of film data processing in the PPI-SA automated system

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Perekatov, V.G.

    1984-01-01

    Organization of the processing of nuclear interaction images on PUOS-type standard devices using the PPI-SA automated system is considered. The system is made in the form of a complete module comprising two scanning measuring projectors and a scanning automatic device which operate in real time on-line with the BESM-4 computer. The system comprises: a subsystem for photographic film scanning, selection of events for measurement and preliminary encoding; a subsystem for the formation and generation of libraries with the data required for monitoring the scanning automatic device; and a subsystem for precision measurement of separate coordinates on photographic images of nuclear particle tracks and of ionization losses. The system software comprises monitoring programs for the projectors and the scanning automatic device, as well as test functional control programs and an operating system. The programs are organized on a modular concept: by changing the module set, the system can be modified and adapted for image processing in different fields of science and technology.

  16. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were...

  17. Means of storage and automated monitoring of versions of text technical documentation

    Science.gov (United States)

    Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.

    2018-03-01

    The paper considers automation of the preparation, storage and version monitoring of textual design and program documentation by means of specialized software. Automation of documentation preparation is based on processing of the engineering data contained in the specifications and technical documentation or in the specification. Data handling assumes the existence of strictly structured electronic documents prepared in widespread formats according to templates based on industry standards, and automated generation of the program or design text document. The further life cycle of the document and of the engineering data it contains is then controlled, with archival data storage at each life-cycle stage. Studies of the processing performance of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrument equipment are described.

  18. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations

  19. Aligning Requirements-Driven Software Processes with IT Governance

    OpenAIRE

    Nguyen Huynh Anh, Vu; Kolp, Manuel; Heng, Samedi; Wautelet, Yves

    2017-01-01

    Requirements Engineering is closely intertwined with Information Technology (IT) Governance. Aligning IT Governance principles with Requirements-Driven Software Processes makes it possible to propose governance and management rules for software development that cope with stakeholders’ requirements and expectations. Typically, the goal of IT Governance in software engineering is to ensure that the results of a software organization's business processes meet the strategic requirements of the organization...

  20. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Schramm, Joachim; Dohrmann, Patrick; Kuhrmann, Marco

    2015-01-01

    Context: Software processes evolve over time, and several approaches were proposed to support the required flexibility. Yet, little is known whether these approaches sufficiently support the development of large software processes. A software process line helps to systematically develop and manage families of processes and, as part of this, variability operations provide means to modify and reuse pre-defined process assets. Objective: Our goal is to evaluate the feasibility of variability operations to support the development of flexible software process lines. Method: We conducted a longitudinal...

  1. ARTIP: Automated Radio Telescope Image Processing Pipeline

    Science.gov (United States)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging for radio-interferometric data. ARTIP starts with raw data, i.e., a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.
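
    The stage structure described above lends itself to a simple orchestration pattern. The sketch below is a generic illustration of independently runnable pipeline stages with logging, not ARTIP's actual code; the stage names only mirror those listed in the abstract, and in ARTIP each would wrap the corresponding CASA tasks.

        import logging

        logging.basicConfig(level=logging.INFO)
        log = logging.getLogger("pipeline")

        # Hypothetical stage functions standing in for CASA-task wrappers.
        def flux_calibration(ms):     log.info("flux calibration on %s", ms)
        def bandpass_calibration(ms): log.info("bandpass calibration on %s", ms)
        def phase_calibration(ms):    log.info("phase calibration on %s", ms)
        def imaging(ms):              log.info("imaging %s", ms)

        STAGES = {
            "flux": flux_calibration,
            "bandpass": bandpass_calibration,
            "phase": phase_calibration,
            "imaging": imaging,
        }

        def run(measurement_set: str, stages=None):
            """Run the requested stages in order; each stage can also run alone."""
            for name in stages or STAGES:
                log.info("--- stage: %s ---", name)
                STAGES[name](measurement_set)

        run("target.ms")                      # full pipeline
        run("target.ms", stages=["imaging"])  # a single stage independently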

  2. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor-intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices acquire the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design, and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through the intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures, has the potential to increase lab productivity. PMID:24088579

  3. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes we need to understand how such organisations communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  4. Decision Support for Software Process Management Teams: An Intelligent Software Agent Approach

    National Research Council Canada - National Science Library

    Church, Lori

    2000-01-01

    ... to market, eliminate redundancy, and ease job stress. This thesis proposes a conceptual model for software process management decision support in the form of an intelligent software agent network...

  5. Automating the conflict resolution process

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here is how resource conflicts are currently resolved, as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  6. Statistical reliability assessment of software-based systems

    International Nuclear Information System (INIS)

    Korhonen, J.; Pulkkinen, U.; Haapanen, P.

    1997-01-01

    Plant vendors nowadays propose software-based systems even for the most critical safety functions. The reliability estimation of safety-critical software-based systems is difficult since the conventional modeling techniques do not necessarily apply to the analysis of these systems, and quantification seems to be impossible. Due to the lack of operational experience and due to the nature of software faults, the conventional reliability estimation methods cannot be applied. New methods are therefore needed for the safety assessment of software-based systems. In the research project Programmable automation systems in nuclear power plants (OHA), financed jointly by the Finnish Centre for Radiation and Nuclear Safety (STUK), the Ministry of Trade and Industry and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. This volume in the OHA report series deals with the statistical reliability assessment of software-based systems on the basis of dynamic test results and qualitative evidence from the system design process. Other reports to be published later in the OHA report series will handle the diversity requirements in safety-critical software-based systems, the generation of test data from operational profiles and the handling of programmable automation in plant PSA studies. (orig.) (25 refs.)

  7. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

    The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements gained in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of function distribution between the operator and technical means are outlined. The processes subject to automation are enumerated. Development of optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shut-down or in emergency situations, is becoming important [ru

  8. The software development process in worldwide collaborations

    International Nuclear Information System (INIS)

    Amako, K.

    1998-01-01

    High energy physics experiments at future colliders are inevitably large-scale international collaborations. In these experiments, software development has to be done by a large number of physicists, software engineers and computer scientists, dispersed all over the world. The major subject of this paper is to discuss various aspects of software development in this worldwide environment. These include software engineering and methodology, and the software development process and its management. (orig.)

  9. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  10. Software process improvement, quality assurance and measurement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Balla, K.; Kontogiannis, K.; Zou, Y.; Di Penta, M.

    2006-01-01

    The aim of this workshop was to present and discuss emergent software quality improvement approaches, with an emphasis on practical applications. Different views on the improvement of software processes, software products, and their interrelations, have been addressed during the workshop.

  11. Simulation and visualization tool design for robot software

    NARCIS (Netherlands)

    Lu, Zhou; Ran, Tjalling; Broenink, Johannes F.; Chalmers, K.; Pedersen, J.B.

    2016-01-01

    Modern embedded systems are designed for multiple and increasingly demanding tasks. Complex concurrent software is required by multi-task automated service robotics for implementing their challenging (control) algorithms. TERRA is a Communicating Sequential Processes (CSP) algebra-based Eclipse

  12. Automating spectral measurements

    Science.gov (United States)

    Goldstein, Fred T.

    2008-09-01

    This paper discusses the architecture of software utilized in spectroscopic measurements. As optical coatings become more sophisticated, there is mounting need to automate data acquisition (DAQ) from spectrophotometers. Such need is exacerbated when 100% inspection is required, ancillary devices are utilized, cost reduction is crucial, or security is vital. While instrument manufacturers normally provide point-and-click DAQ software, an application programming interface (API) may be missing. In such cases automation is impossible or expensive. An API is typically provided in libraries (*.dll, *.ocx) which may be embedded in user-developed applications. Users can thereby implement DAQ automation in several Windows languages. Another possibility, developed by FTG as an alternative to instrument manufacturers' software, is the ActiveX application (*.exe). ActiveX, a component of many Windows applications, provides means for programming and interoperability. This architecture permits a point-and-click program to act as automation client and server. Excel, for example, can control and be controlled by DAQ applications. Most importantly, ActiveX permits ancillary devices such as barcode readers and XY-stages to be easily and economically integrated into scanning procedures. Since an ActiveX application has its own user-interface, it can be independently tested. The ActiveX application then runs (visibly or invisibly) under DAQ software control. Automation capabilities are accessed via a built-in spectro-BASIC language with industry-standard (VBA-compatible) syntax. Supplementing ActiveX, spectro-BASIC also includes auxiliary serial port commands for interfacing programmable logic controllers (PLC). A typical application is automatic filter handling.
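
    On Windows, an ActiveX automation server of the kind described can be driven from any COM-capable client. The sketch below uses the pywin32 package; the ProgID, properties, and method names are purely hypothetical placeholders for whatever automation interface a real DAQ application registers.

        import win32com.client  # pywin32; requires Windows and a registered server

        # "Vendor.SpectroServer" and its members are hypothetical stand-ins.
        app = win32com.client.Dispatch("Vendor.SpectroServer")
        app.Visible = False                  # run the server invisibly
        app.RunScript("scan_400_700nm")      # invoke a spectro-BASIC routine by name
        data = app.LastScanData              # retrieve the acquired spectrum
        print(len(data), "points acquired")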

  13. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  14. jMRUI plugin software (jMRUI2XML) to allow automated MRS processing and XML-based standardized output

    Czech Academy of Sciences Publication Activity Database

    Mocioiu, V.; Ortega-Martorell, S.; Olier, I.; Jabłoński, Michal; Starčuková, Jana; Lisboa, P.; Arús, C.; Julia-Sapé, M.

    2015-01-01

    Roč. 28, S1 (2015), S518 ISSN 0968-5243. [ESMRMB 2015. Annual Scientific Meeting /32./. 01.09.2015-03.09.2015, Edinburgh] Institutional support: RVO:68081731 Keywords : MR Spectroscopy * signal processing * jMRUI * software development * XML Subject RIV: BH - Optics, Masers, Lasers

  15. Quality of Radiomic Features in Glioblastoma Multiforme: Impact of Semi-Automated Tumor Segmentation Software.

    Science.gov (United States)

    Lee, Myungeun; Woo, Boyeong; Kuo, Michael D; Jamshidi, Neema; Kim, Jong Hyo

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability. Most features showed good NDR (NDR ≥ 1), while above 35% of the texture features showed poor NDR. The semi-automated segmentation software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  16. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Science.gov (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
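
    For a two-category comparison like this one (movement vs. no movement per 10-sec epoch), PABAK reduces to a simple function of the observed agreement: PABAK = 2*p_o - 1. A minimal sketch with illustrative epoch labels (not the study's recordings):

        def pabak(manual, automated):
            """Prevalence-adjusted bias-adjusted kappa for two binary raters:
            PABAK = 2 * p_o - 1, where p_o is the observed proportion of
            epochs on which both analyses agree."""
            agree = sum(m == a for m, a in zip(manual, automated))
            p_o = agree / len(manual)
            return 2 * p_o - 1

        # Illustrative epoch labels (1 = fetal movement detected in that epoch).
        manual    = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
        automated = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
        print(pabak(manual, automated))  # 0.8 for 9/10 agreement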

  17. An open-source automated continuous condition-based maintenance platform for commercial buildings

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Gowri, Krishnan; Hernandez, George

    2016-09-09

    This paper describes a reference process that can be deployed to provide continuous automated condition-based maintenance management for buildings that have a building information model (BIM), a building automation system (BAS) and a computerized maintenance management system (CMMS). The process can be deployed using an open-source transactional network platform, VOLTTRON™, designed for distributed sensing and controls, which supports both energy efficiency and grid services.

  18. Automated data collection in single particle electron microscopy

    Science.gov (United States)

    Tan, Yong Zi; Cheng, Anchi; Potter, Clinton S.; Carragher, Bridget

    2016-01-01

    Automated data collection is an integral part of modern workflows in single particle electron microscopy (EM) research. This review surveys the software packages available for automated single particle EM data collection. The degree of automation at each stage of data collection is evaluated, and the capabilities of the software packages are described. Finally, future trends in automation are discussed. PMID:26671944

  19. Summary of the International Conference on Software and System Processes

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; O'Connor, Rory V.; Perry, Dewayne E.

    2016-01-01

    The International Conference on Software and Systems Process (ICSSP), continuing the success of the Software Process Workshop (SPW), the Software Process Modeling and Simulation Workshop (ProSim) and the International Conference on Software Process (ICSP) conference series, has become the established premier event in the field of software and systems engineering processes. It provides a leading forum for the exchange of research outcomes and industrial best practices in process development from the software and systems disciplines. ICSSP 2016 was held in Austin, Texas, from 14-15 May 2016, co-located with the 38th International Conference on Software Engineering (ICSE). The theme of ICSSP 2016 was studying "Process(es) in Action" by recognizing that the As-Planned and As-Practiced processes can be quite different in many ways, including their flows, their complexity and the evolving needs of stakeholders...

  20. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Full Text Available Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is knowledge extraction from the data accumulated during operation. The creation of a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. Because such systems are novel, there is no established development methodology for them, so a number of experiments were carried out in order to collect data, choose appropriate methods for the study and interpret them. As a result of the experiments, the authors identified the data sources available for analysis in the information environment of their home university. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT MTU MIREA, from the results of students' independent work, and from specially designed Google Forms. To automate the collection of information and the analysis of educational data, an experimental software package was created. As the development methodology for the experimental software complex, the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI) were adopted. The program implementation of the complex is described in detail, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  1. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing food nutrition statistical analysis software can automate nutrition calculation, improve the nutrition professional's working efficiency and achieve the informatization of nutrition education and outreach. In the software development process, software engineering methods and database technology are used to calculate daily human nutritional intake, and an intelligent system is used to evaluate the user's hea...

  2. Automated Hardware and Software System for Monitoring the Earth’s Magnetic Environment

    Directory of Open Access Journals (Sweden)

    Alexei Gvishiani

    2016-12-01

    Full Text Available The continuous growth of geophysical observations requires adequate methods for their processing and analysis. This becomes one of the most important and widely discussed issues in the data science community. The system analysis methods and data mining techniques are able to sustain the solution of this problem. This paper presents an innovative holistic hardware/software system (HSS developed for efficient management and intellectual analysis of geomagnetic data, registered by Russian geomagnetic observatories and international satellites. Geomagnetic observatories that comprise the International Real-time Magnetic Observatory Network (INTERMAGNET produce preliminary (raw and definitive (corrected geomagnetic data of the highest quality. The designed system automates and accelerates routine production of definitive data from the preliminary magnetograms, obtained by Russian observatories, due to implemented algorithms that involve artificial intelligence elements. The HSS is the first system that provides sophisticated automatic detection and multi-criteria classification of extreme geomagnetic conditions, which may be hazardous for technological infrastructure and economic activity in Russia. It enables the online access to digital geomagnetic data, its processing results and modelling calculations along with their visualization on conventional and spherical screens. The concept of the presented system agrees with the accepted ‘four Vs’ paradigm of Big Data. The HSS can increase significantly the ‘velocity’ and ‘veracity’ features of the INTERMAGNET system. It also provides fusion of large sets of ground-based and satellite geomagnetic data, thus facilitating the ‘volume’ and ‘variety’ of handled data.

  3. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

    Full Text Available Software developing companies work in a competitive market and are often challenged to make business decisions with impact on competitiveness. Models assessing the maturity of software development processes, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable for supporting business decisions, nor for achieving strategic goals. The objective of this work is to analyze how the PMS of software development projects could support business strategies for software developing companies. Results of this work show that PMS results from maturity models for software processes can be adapted to help evaluate operating capabilities and support strategic business decisions.

  4. SOFTWARE PROCESS IMPROVEMENT: AWARENESS, USE, AND BENEFITS IN CANADIAN SOFTWARE DEVELOPMENT FIRMS

    OpenAIRE

    CHEVERS, DELROY

    2017-01-01

    ABSTRACT Since 1982, the software development community has been concerned with the delivery of quality systems. Software process improvement (SPI) is an initiative to avoid the delivery of low quality systems. However, the awareness and adoption of SPI is low. Thus, this study examines the rate of awareness, use, and benefits of SPI initiatives in Canadian software development firms. Using SPSS as the analytical tool, this study found that 59% of Canadian software development firms are aware...

  5. Mars Science Laboratory Flight Software Boot Robustness Testing Project Report

    Science.gov (United States)

    Roth, Brian

    2011-01-01

    On the surface of Mars, the Mars Science Laboratory will boot up its flight computers every morning, having charged the batteries through the night. This boot process is complicated, critical, and affected by numerous hardware states that can be difficult to test. The hardware test beds do not facilitate long runs of back-to-back unattended automated tests, and although the software simulation has provided the necessary functionality and fidelity for this boot testing, it has not supported the full flexibility necessary for the task. Therefore, to perform this testing, a framework has been built around the software simulation that supports running automated tests loading a variety of starting configurations for software and hardware states. This implementation has been tested against the nominal cases to validate the methodology, and support for configuring off-nominal cases is ongoing. The implication of this testing is that the introduction of input configurations that have so far proved difficult to test may reveal boot scenarios worth higher-fidelity investigation, and in other cases increase confidence in the robustness of the flight software boot process.

  6. Automating the production of high-purity zirconium from waste products in industrial furnaces SKB-5025 and apparatuses TS-40M

    International Nuclear Information System (INIS)

    Lavrikov, S.A.; Kotsar', M.L.; Lapidus, A.O.; Akhtonov, S.G.; Aleksandrov, A.V.; Ogorodnikov, L.V.; Chernyshev, A.A.; Kopysov, N.V.

    2014-01-01

    A disadvantage of iodide refining of zirconium in industrial furnaces when processing production wastes at JSC CMP is the low direct yield of the metal in iodide rods and the large energy consumption of the process. The aim of this work is to optimize the process by means of automated control. The paper deals with the creation of a test unit to automate the iodide refining of zirconium at JSC CMP. The main features of the unit, the hardware and software of the automated unit, and the results of its work during operation are described. A scheme for the automation of 10 SKB-5025 furnaces, optimizing the total cost of computing equipment and improving the software, was proposed and implemented in 2012

  7. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases and describes how different principles are adopted at different scales in different applications (from assembly to disassembly, from aerospace to the food industry, from textile to logistics). Finally, the most recent research is reviewed in order to introduce the new trends in grasping; these provide an outlook on the future of both grippers and robotic hands in automated production processes.

  8. Experiment on safety software evaluation

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-06-01

    The licensing process for nuclear plants includes compulsory steps which bring about a thorough examination of the command and control system. In this context the IPSN uses a tool called MALPAS to carry out an analysis of the quality of the software involved in safety control. The IPSN also seeks to automate the generation of the test sets necessary for dynamic analysis. The MALPAS tool highlights the particularities of programming which can influence the testability and maintainability of the studied software. (TEC). 4 refs.

  9. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system, including: providing to one or more compiling nodes the software to be compiled, wherein at least a portion of the software is to be executed by one or more other nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node itself; selecting, by the compiling node, one or more nodes in the next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendants; and sending to each selected node only the compiled software to be executed by that node or its descendants.
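
    Read as an algorithm, the claim describes a recursive fan-out over a tree of nodes in which each node keeps its own binaries and forwards only what its subtree needs. A schematic sketch under that reading (all names and the data layout are illustrative, not the patent's implementation):

        from dataclasses import dataclass, field

        @dataclass
        class Node:
            name: str
            children: list = field(default_factory=list)

        def descendants(node):
            for child in node.children:
                yield child
                yield from descendants(child)

        def distribute(node, compiled: dict):
            """Keep this node's own binaries, then pass each child only the
            binaries destined for that child or its descendants."""
            node.binaries = {t: b for t, b in compiled.items() if t == node.name}
            for child in node.children:
                subtree = {child.name} | {d.name for d in descendants(child)}
                distribute(child, {t: b for t, b in compiled.items() if t in subtree})

        # Illustrative three-tier hierarchy: the root compiles, then fans out.
        leaf1, leaf2 = Node("leaf1"), Node("leaf2")
        root = Node("root", [Node("mid", [leaf1, leaf2])])
        distribute(root, {"root": b"...", "mid": b"...", "leaf1": b"...", "leaf2": b"..."})
        print(leaf1.binaries.keys())  # only leaf1's binary arrived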

  10. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model is selected as a hybrid model mixing the waterfall, prototyping, and spiral models, and is composed of two stages: development of the ESF-CCS prototype, and development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available organizational process assets (OPAs) are applied to the SLC activities, and the known constraints are reconciled. The established SLCP describes well the software life cycle activities required by the Regulatory Authority.

  11. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model is selected as a hybrid model mixing the waterfall, prototyping, and spiral models, and is composed of two stages: development of the ESF-CCS prototype, and development of the ESF-CCS itself. To produce the software life cycle (SLC) for the development of the digital reactor safety system, the activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available organizational process assets (OPAs) are applied to the SLC activities, and the known constraints are reconciled. The established SLCP describes well the software life cycle activities required by the Regulatory Authority.

  12. Automation of Cassini Support Imaging Uplink Command Development

    Science.gov (United States)

    Ly-Hollins, Lisa; Breneman, Herbert H.; Brooks, Robert

    2010-01-01

    "Support imaging" is imagery requested by other Cassini science teams to aid in the interpretation of their data. The generation of the spacecraft command sequences for these images is performed by the Cassini Instrument Operations Team. The process initially established for doing this was very labor-intensive, tedious and prone to human error. Team management recognized this process as one that could easily benefit from automation. Team members were tasked to document the existing manual process, develop a plan and strategy to automate the process, implement the plan and strategy, test and validate the new automated process, and deliver the new software tools and documentation to Flight Operations for use during the Cassini extended mission. In addition to the goals of higher efficiency and lower risk in the processing of support imaging requests, an effort was made to maximize adaptability of the process to accommodate uplink procedure changes and the potential addition of new capabilities outside the scope of the initial effort.

  13. LHCb - Automated Testing Infrastructure for the Software Framework Gaudi

    CERN Multimedia

    Clemencic, M

    2009-01-01

    An extensive test suite is the first step towards the delivery of robust software, but it is not always easy to implement, especially in projects with many developers. An easy-to-use and flexible infrastructure for writing and executing tests reduces the work each developer has to do to instrument his packages with tests. At the same time, the infrastructure gives the same look and feel to the tests and allows automated execution of the test suite. For Gaudi, we decided to develop the testing infrastructure on top of the free tool QMTest, already used in the LCG Application Area for the routine tests run in the nightly build system. The high flexibility of QMTest allowed us to integrate it in the Gaudi package structure. A specialized test class and some utility functions have been developed to simplify the definition of a test for a Gaudi-based application. Thanks to the testing infrastructure described here, we managed to quickly extend the standard Gaudi test suite and add tests to the main LHCb appli...

  14. A guide to the automation body of knowledge

    CERN Document Server

    2006-01-01

    Edited by Vernon L. Trevathan, with contributions from more than 35 leading experts from all aspects of automation, this book is a key resource for anyone who is studying for the ISA Certified Automation Professional® (CAP®), ISA Certified Control Systems Technician® (CCST®), and/or Control Systems Engineer (CSE) exams. The book defines the most important automation concepts and processes, while also describing the technical skills required to implement them in today's industrial environment. This edition provides comprehensive information about all major topics in the broad field of automation including: Process and analytical instrumentation ; Continuous and batch control ; Control valves and final control elements ; Basic discrete, sequencing, and manufacturing control ; Advanced control ; Digital and analog communications ; Data management and system software ; Networking and security ; Safety and reliability ; System checkout, testing, startup, and troubleshooting ; Project management. Whether you ar...

  15. Automated Podcasting System for Universities

    Directory of Open Access Journals (Sweden)

    Ypatios Grigoriadis

    2013-03-01

    Full Text Available This paper presents the results achieved at Graz University of Technology (TU Graz in the field of automating the process of recording and publishing university lectures in a very new way. It outlines cornerstones of the development and integration of an automated recording system such as the lecture hall setup, the recording hardware and software architecture as well as the development of a text-based search for the final product by method of indexing video podcasts. Furthermore, the paper takes a look at didactical aspects, evaluations done in this context and future outlook.

  16. Competing Values in Software Process Improvement

    DEFF Research Database (Denmark)

    Mûller, Sune Dueholm; Nielsen, Peter Axel

    2013-01-01

    Purpose: The purpose of the article is to investigate the impact of organizational culture on software process improvement (SPI). Is cultural congruence between an organization and an adopted process model required? How can the level of congruence between an organizational culture and the values and assumptions underlying an adopted process model be assessed? How can cultural incongruence be managed to facilitate success of software process improvement? Design/methodology/approach: The competing values framework and its associated assessment instrument are used in a case study to establish... In a ...-step process, SPI managers establish and compare culture profiles and decide how to address identified problems. To that end the text analysis technique is offered as a web service that allows for analysis of all text-based process models and standards, and of internal process documentation. Originality...

  17. EDNA-An expert software system for comparison and evaluation of DNA profiles in forensic casework

    DEFF Research Database (Denmark)

    Haldemann, B.; Dornseifer, S.; Heylen, T.

    2015-01-01

    eDNA is an expert software system for DNA profile comparison, match interpretation and automated report generation in forensic DNA casework. Process automation and intelligent graphical representation maximise reliability of DNA evidence, while facilitating and accelerating the work of DNA experts....

  18. Power, speed & automation with Adobe Photoshop

    CERN Document Server

    Scott, Geoff

    2012-01-01

    This is a must for the serious Photoshop user! Power, Speed & Automation explores how to customize and automate Photoshop to increase your speed and productivity. With numerous step-by-step instructions, the authors (two of Adobe's own software developers!) walk you through the steps to best tailor Photoshop's interface to your personal workflow; write and apply Actions; and use batching and scripts to process large numbers of images quickly and automatically. You will learn how to build your own dialogs and panels to improve your production workflows in Photoshop, the secrets of changing

  19. Development Of Software For Automation And Data Acquisition System For Gamma Spider And SPECT

    International Nuclear Information System (INIS)

    Maslina Mohd Ibrahim; Jaafar Abdullah; Nolida Yussup; Nur Aira Abd Rahman; Mukhlis Mokhtar; Hanafi Ithnin; Hearie Hassan

    2014-01-01

    The portable gamma-ray computed tomography systems known as Gamma Spider and Single-Photon Emission Computed Tomography (SPECT) are designed for the detection and measurement of the physical condition of small objects that fit within a diameter of 50 cm. The project consists of two major components: the scanner hardware and the system software. This paper describes the system software components, which are linear and rotary motor control, data acquisition for the rate meter, and data recording. The scanning resolution for linear movement can be set to 0.25 cm, 0.5 cm or 1.0 cm, whereas the rotation movement can be set to 5 or 10 degrees depending on user requirements. The scanning offset can be selected from three diameters: 30 cm, 40 cm and 50 cm. The algorithm for this automation system and issues encountered during development are also discussed in this paper. (author)
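
    The abstract fixes a small, discrete parameter space, which lends itself to a simple validated configuration object. Below is a hypothetical Python sketch of such an interface; the class and field names are illustrative and not taken from the actual Gamma Spider software.

      # Hypothetical sketch of the scan-parameter validation described above;
      # class and field names are illustrative, not from the actual system.
      from dataclasses import dataclass

      ALLOWED_LINEAR_CM = (0.25, 0.5, 1.0)   # linear scanning resolutions
      ALLOWED_ROTATION_DEG = (5, 10)         # rotation step settings
      ALLOWED_DIAMETER_CM = (30, 40, 50)     # selectable scanning diameters

      @dataclass
      class ScanConfig:
          linear_step_cm: float
          rotation_step_deg: int
          diameter_cm: int

          def __post_init__(self):
              if self.linear_step_cm not in ALLOWED_LINEAR_CM:
                  raise ValueError(f"linear step must be one of {ALLOWED_LINEAR_CM}")
              if self.rotation_step_deg not in ALLOWED_ROTATION_DEG:
                  raise ValueError(f"rotation step must be one of {ALLOWED_ROTATION_DEG}")
              if self.diameter_cm not in ALLOWED_DIAMETER_CM:
                  raise ValueError(f"diameter must be one of {ALLOWED_DIAMETER_CM}")

          @property
          def projections_per_rotation(self) -> int:
              return 360 // self.rotation_step_deg

      config = ScanConfig(linear_step_cm=0.5, rotation_step_deg=5, diameter_cm=40)
      print(config.projections_per_rotation)  # 72 angular positions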

  20. Automation of the Analysis and Classification of the Line Material

    Directory of Open Access Journals (Sweden)

    A. A. Machuev

    2011-03-01

    Full Text Available This work is devoted to automating the analysis and verification of various data presentation formats, for which special software has been developed. The software was developed and tested using, as an example, files with typical extensions whose structural features are known in advance.

  1. Software process improvement in CMS-are we different?

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2001-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise in our context means to evaluate and apply new technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards, while ensuring reproducibility and quality of results. The CMS process improvement effort is two-pronged. It aims at continuous improvement of the ways we do Object Oriented software, as well as continuous improvement in the efficiency of the working environment. In particular, the use and creation of de-facto software process standards within CMS has proven to be key to a successful software process improvement program. The authors describe the successful CMS implementation of a software process improvement strategy, following ISO 15504 for the past three years. The authors give the current status of the most important process families formally established in CMS, and provide the guidelines followed both for tool development and methodology establishment

  2. Green Software Engineering Adaption In Requirement Elicitation Process

    Directory of Open Access Journals (Sweden)

    Umma Khatuna Jannat

    2015-08-01

    Full Text Available Green software engineering is a recent approach that investigates the role of environmental concerns in software systems. It is now widely accepted that green software can fit all processes of software development, including the requirements elicitation process. Nowadays the vast majority of software companies use requirements elicitation techniques, because this process plays an increasingly important role in software development, and most requirements elicitation processes are improved using various techniques and tools. The intention of this research is therefore to adapt green software engineering to existing elicitation techniques and to recommend suitable actions for improvement. This research involved qualitative data: using a set of keywords, IEEE, ACM, Springer, Elsevier, Google Scholar, Scopus and Wiley were searched for articles published between 2010 and 2016. The literature review identified 15 traditional requirements elicitation factors and 23 improvement techniques for conversion to green engineering. Lastly, the paper includes a short review of the literature, a description of the grounded theory, and some of the identified issues related to the need for requirements elicitation improvement techniques.

  3. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis 'toolbox' codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  4. Automation in high-content flow cytometry screening.

    Science.gov (United States)

    Naumann, U; Wand, M P

    2009-09-01

    High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.

  5. Software process improvement: controlling developers, managers or users?

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob

    1999-01-01

    The paper discusses how the latest trend in the management of software development, software process improvement (SPI), may affect user-developer relations. At the outset, SPI concerns the "internal workings" of software organisations, but it may also be interpreted as one way to give the developer organisation more control over the development process and the relations with the user organization....

  6. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  7. Moving toward the automation of the systematic review process: a summary of discussions at the second meeting of International Collaboration for the Automation of Systematic Reviews (ICASR).

    Science.gov (United States)

    O'Connor, Annette M; Tsafnat, Guy; Gilbert, Stephen B; Thayer, Kristina A; Wolfe, Mary S

    2018-01-09

    The second meeting of the International Collaboration for Automation of Systematic Reviews (ICASR) was held 3-4 October 2016 in Philadelphia, Pennsylvania, USA. ICASR is an interdisciplinary group whose aim is to maximize the use of technology for conducting rapid, accurate, and efficient systematic reviews of scientific evidence. Having automated tools for systematic review should enable more transparent and timely review, maximizing the potential for identifying and translating research findings to practical application. The meeting brought together multiple stakeholder groups including users of summarized research, methodologists who explore production processes and systematic review quality, and technologists such as software developers, statisticians, and vendors. This diversity of participants was intended to ensure effective communication with numerous stakeholders about progress toward automation of systematic reviews and stimulate discussion about potential solutions to identified challenges. The meeting highlighted challenges, both simple and complex, and raised awareness among participants about ongoing efforts by various stakeholders. An outcome of this forum was to identify several short-term projects that participants felt would advance the automation of tasks in the systematic review workflow including (1) fostering better understanding about available tools, (2) developing validated datasets for testing new tools, (3) determining a standard method to facilitate interoperability of tools such as through an application programming interface or API, and (4) establishing criteria to evaluate the quality of tools' output. ICASR 2016 provided a beneficial forum to foster focused discussion about tool development and resources and reconfirm ICASR members' commitment toward systematic reviews' automation.

  8. Framework Programmable Platform for the advanced software development workstation: Framework processor design document

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les

    1991-01-01

    The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.

  9. Automated driving safer and more efficient future driving

    CERN Document Server

    Horn, Martin

    2017-01-01

    The main topics of this book include advanced control, cognitive data processing, high performance computing, functional safety, and comprehensive validation. These topics are seen as technological bricks to drive forward automated driving. The current state of the art of automated vehicle research, development and innovation is given. The book also addresses industry-driven roadmaps for major new technology advances as well as collaborative European initiatives supporting the evolvement of automated driving. Various examples highlight the state of development of automated driving as well as the way forward. The book will be of interest to academics and researchers within engineering, graduate students, automotive engineers at OEMs and suppliers, ICT and software engineers, managers, and other decision-makers.

  10. Automated support for experience-based software management

    Science.gov (United States)

    Valett, Jon D.

    1992-01-01

    To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.

  11. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    Science.gov (United States)

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
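
    As an illustration of the kind of analysis described, the following Python sketch decomposes a phantom image with PyWavelets and scores per-level detail-band energy as a crude proxy for feature detectability. The paper's actual algorithm, wavelet choice and thresholds are not given in the abstract, so everything here is an assumption.

      # Illustrative sketch only: a multiresolution decomposition of a phantom
      # image, scoring detail-band energy per level. Names and parameters are
      # assumptions, not the authors' published method.
      import numpy as np
      import pywt

      def detail_energy_profile(phantom: np.ndarray, wavelet="db4", levels=4):
          """Return the energy of the detail coefficients at each decomposition level."""
          coeffs = pywt.wavedec2(phantom.astype(float), wavelet, level=levels)
          profile = []
          for level_details in coeffs[1:]:  # skip the coarse approximation band
              energy = sum(float(np.sum(band ** 2)) for band in level_details)
              profile.append(energy)
          return profile

      rng = np.random.default_rng(0)
      image = rng.normal(100.0, 5.0, size=(256, 256))  # stand-in for a phantom image
      print(detail_energy_profile(image))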

  12. Some problems of software development for the plant-level automated control system of NPPs with the RBMK reactors

    International Nuclear Information System (INIS)

    Gorbunov, V.P.; Egorov, A.K.; Isaev, N.V.; Saprykin, E.M.

    1987-01-01

    Problems in the development and operation of automated control system (ACS) software for NPPs with RBMK reactors are discussed. An ES-series computer with large on-line storage (not less than 1 Mbyte) and fast response (not less than 300,000 operations per second) should be part of the ACS. Several program complexes are used in the NPP ACS. The programs collected in the EhNERGIYa library are used to support central control system operation. An information-retrieval system called the Fuel File is used to automate NPP fuel movement accounting, to estimate the efficiency of fuel use, and to calculate the fuel component of the cost of electric and heat energy production. The automated information system for unit operation efficiency analysis, which solves both plant-level and unit-level problems, including engineering and economic factors and the assembly of an operating-parameter bank, is under trial operation

  13. CoLiTec software - detection of the near-zero apparent motion

    Science.gov (United States)

    Khlamov, Sergii V.; Savanevych, Vadym E.; Briukhovetskyi, Olexandr B.; Pohorelov, Artem V.

    2017-06-01

    This article describes the CoLiTec software for fully automated frame processing. CoLiTec allows processing of the big data volumes produced by observations, as well as of data formed continuously during observation. The tasks it solves include frame brightness equalization, moving object detection, astrometry, photometry, etc. Along with highly efficient big data processing, CoLiTec also ensures high accuracy of data measurements. A comparative analysis of the functional characteristics and positional accuracy was performed between the CoLiTec and Astrometrica software, showing the benefits of CoLiTec on wide-field and low-quality frames. The efficiency of the CoLiTec software is evidenced by about 700,000 observations and over 1,500 preliminary discoveries.

  14. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for the automated processing of complex actinide α-spectra were implemented on the Ehlektronika D3-28 computer line connected to an ICA-070 multichannel amplitude pulse analyzer. The developed program makes it possible to calculate peak intensity and relative isotope content, to perform energy calibration of spectra, to calculate the peak centre of gravity and energy resolution, and to perform integral counting over a particular part of the spectrum. The error of the automated processing method depends on the degree of spectral complexity and lies within 1-12%. 8 refs.; 4 figs.; 2 tabs
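
    Two of the listed calculations, peak centre of gravity and integral peak intensity, have simple closed forms over a channel window. A minimal Python sketch, assuming a linear energy calibration E = a*channel + b (the calibration constants and the synthetic peak are illustrative):

      import numpy as np

      def peak_centroid_and_intensity(counts, lo, hi, a=1.0, b=0.0):
          """Centre of gravity (in energy units) and intensity of counts[lo:hi]."""
          channels = np.arange(lo, hi)
          window = np.asarray(counts[lo:hi], dtype=float)
          intensity = window.sum()                          # integral counting over the window
          centroid_ch = (channels * window).sum() / intensity
          return a * centroid_ch + b, intensity             # linear calibration E = a*ch + b

      spectrum = np.zeros(512)
      spectrum[200:210] = [2, 8, 30, 80, 120, 110, 60, 20, 6, 1]  # synthetic alpha peak
      print(peak_centroid_and_intensity(spectrum, 195, 215, a=0.01, b=3.0))  # ~ (5.04, 437.0)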

  15. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    Science.gov (United States)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer-aided software engineering (CASE) tool. The original prediction is not a very risky one; it has already been accomplished.

  16. WARACS: Wrappers to Automate the Reconstruction of Ancestral Character States

    Science.gov (United States)

    Gruenstaeudl, Michael

    2016-01-01

    Premise of the study: Reconstructions of ancestral character states are among the most widely used analyses for evaluating the morphological, cytological, or ecological evolution of an organismic lineage. The software application Mesquite remains the most popular application for such reconstructions among plant scientists, even though its support for automating complex analyses is limited. A software tool is needed that automates the reconstruction and visualization of ancestral character states with Mesquite and similar applications. Methods and Results: A set of command line–based Python scripts was developed that (a) communicates standardized input to and output from the software applications Mesquite, BayesTraits, and TreeGraph2; (b) automates the process of ancestral character state reconstruction; and (c) facilitates the visualization of reconstruction results. Conclusions: WARACS provides a simple tool that streamlines the reconstruction and visualization of ancestral character states over a wide array of parameters, including tree distribution, character state, and optimality criterion. PMID:26949580
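
    The wrapper idea described above, standardized input to and captured output from a command-line reconstruction tool, can be sketched with the Python standard library. This is not the actual WARACS code; the binary name, file names and command syntax below are placeholders.

      # Not the actual WARACS code: a hedged sketch of wrapping a command-line
      # reconstruction tool (a stand-in for BayesTraits) via subprocess.
      import subprocess
      from pathlib import Path

      def run_reconstruction(binary: str, tree_file: str, data_file: str,
                             commands: str) -> str:
          """Feed a command script to the tool on stdin and return its stdout."""
          result = subprocess.run(
              [binary, tree_file, data_file],
              input=commands, capture_output=True, text=True, check=True,
          )
          Path("reconstruction.log").write_text(result.stdout)  # keep a run record
          return result.stdout

      # Hypothetical invocation; file names and command syntax are placeholders.
      # output = run_reconstruction("BayesTraitsV3", "trees.nex", "states.txt",
      #                             commands="1\n1\nRun\n")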

  17. Circular Hough transform diffraction analysis: A software tool for automated measurement of selected area electron diffraction patterns within Digital Micrograph™

    International Nuclear Information System (INIS)

    Mitchell, D.R.G.

    2008-01-01

    A software tool (script and plugin) for computing circular Hough transforms (CHT) in Digital Micrograph™ has been developed, for the purpose of automated analysis of selected area electron diffraction patterns (SADPs) of polycrystalline materials. The CHT enables the diffraction pattern centre to be determined with sub-pixel accuracy, regardless of the exposure condition of the transmitted beam or if a beam stop is present. Radii of the diffraction rings can also be accurately measured with sub-pixel precision. If the pattern is calibrated against a known camera length, then d-spacings with an accuracy of better than 1% can be obtained. These measurements require no a priori knowledge of the pattern and very limited user interaction. The accuracy of the CHT is degraded by distortion introduced by the projector lens, and this should be minimised prior to pattern acquisition. A number of optimisations in the CHT software enable rapid processing of patterns; a typical analysis of a 1k x 1k image takes just a few minutes. The CHT tool appears robust and is even able to accurately measure SADPs with very incomplete diffraction rings due to texture effects. This software tool is freely downloadable via the Internet
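
    The measurement chain described, CHT-based centre and radius finding followed by conversion of radii to d-spacings via a calibrated camera constant (r x d = lambda x L), can be illustrated with scikit-image rather than Digital Micrograph scripting. A hedged sketch on a synthetic ring pattern; all parameter values are assumptions:

      import numpy as np
      from skimage.draw import circle_perimeter
      from skimage.feature import canny
      from skimage.transform import hough_circle, hough_circle_peaks

      # Synthetic 256x256 "diffraction pattern" with one ring of radius 60 px.
      pattern = np.zeros((256, 256))
      rr, cc = circle_perimeter(128, 128, 60)
      pattern[rr, cc] = 1.0

      edges = canny(pattern, sigma=1.0)
      candidate_radii = np.arange(40, 90)
      accumulator = hough_circle(edges, candidate_radii)
      _, cx, cy, radii = hough_circle_peaks(accumulator, candidate_radii,
                                            total_num_peaks=1)

      camera_constant = 120.0  # lambda*L in px*Angstrom from a known standard (assumed)
      print(f"centre=({cx[0]}, {cy[0]}), radius={radii[0]} px, "
            f"d={camera_constant / radii[0]:.2f} Angstrom")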

  18. THE ANALYTICAL HIERARCHY PROCESS METHOD: A DATABASE SOFTWARE RECOMMENDER SYSTEM

    Directory of Open Access Journals (Sweden)

    Doni Purnama Alam Syah

    2014-09-01

    Full Text Available The rekomender selection system is a database software application that can be used to find alternative database software selection strategies using the analytical hierarchy process (AHP) method. Rekomender systems are needed by companies with sufficiently large data processing needs, such as the IT Bureau of Bina Sarana Informatika; the expensive investment in the provision of Information Technology (IT) makes the Bina Sarana Informatika IT Bureau more careful in selecting database software. This study focuses on a database software selection system using the analytical hierarchy process (AHP), with a case study of the Bina Sarana Informatika IT Bureau and its administrators as the observation unit. The study found two main criteria, technology and users, with MySQL, Oracle and SQL Server as the alternative strategies. Testing of the rekomender system gave MySQL top priority in the database software selection with a 41% weighting, followed by SQL Server at 39% and Oracle at 21%. The end result of the rekomender system is the conclusion that the Bina Sarana Informatika IT Bureau can define alternative strategies before selecting database software, more effectively and efficiently.
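
    The AHP step itself is generic: priority weights are the normalized principal eigenvector of a pairwise comparison matrix, with a consistency ratio as a sanity check. The judgment matrix below is illustrative, not the paper's data.

      import numpy as np

      A = np.array([
          [1.0, 2.0, 3.0],   # MySQL vs. SQL Server vs. Oracle (hypothetical judgments)
          [1/2, 1.0, 2.0],
          [1/3, 1/2, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, principal].real)
      weights /= weights.sum()                     # normalize priorities to sum to 1

      # Consistency check (RI = 0.58 for n = 3, from Saaty's random index table)
      n = A.shape[0]
      CI = (eigvals.real[principal] - n) / (n - 1)
      print(dict(zip(["MySQL", "SQL Server", "Oracle"], weights.round(3))),
            "CR =", round(CI / 0.58, 3))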

  19. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time...... infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in high resolution and accuracy...... infusion analyses of crude extracts to find the relationship between species from several terverticillate Penicillium species, and show that the ions responsible for the segregation can be identified. Furthermore the method can automate the detection of unique species and unique metabolites....

  20. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

    Automation processes are entering more and more areas of life and production. People with disabilities in particular can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to be integrated into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  1. From Pragmatic to Systematic Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    Software process improvement (SPI) is a challenging task, as many different stakeholders, project settings, contexts and goals need to be considered. SPI projects are often operated in a complex and volatile environment and, thus, require sound management that is resource-intensive, requiring many stakeholders to contribute to the process assessment, analysis, design, realisation, and deployment. Although there exist many valuable SPI approaches, none address the needs of both process engineers and project managers. This article presents an Artefact-based Software Process Improvement

  2. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study introduces the system and compares its results to those of a manual analysis of the same fetal movement signals (Experiment I). We also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using the prevalence-adjusted bias-adjusted kappa (PABAK), including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
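
    The PABAK statistic used above has a simple closed form, PABAK = 2*p_o - 1, where p_o is the observed proportion of agreeing epochs. A small sketch with made-up 10-second epoch labels:

      def pabak(rater_a, rater_b):
          """Prevalence-adjusted bias-adjusted kappa for two binary raters."""
          assert len(rater_a) == len(rater_b)
          p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
          return 2 * p_o - 1

      manual    = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]  # movement present per 10-s epoch
      automated = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
      print(pabak(manual, automated))  # 0.8 (9 of 10 epochs agree)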

  3. Modifications of alpha processing software to improve calculation of limits for qualitative detection

    Energy Technology Data Exchange (ETDEWEB)

    Kirkpatrick, J.R.

    1997-01-01

    The work described in this report was done for the Bioassay Counting Laboratory (BCL) of the Center of Excellence for Bioassay of the Analytical Services Organization at the Oak Ridge Y-12 Plant. BCL takes urine and fecal samples and tests for alpha radiation. An automated system, supplied by Canberra Industries, counts the activities in the samples and processes the results. The Canberra system includes hardware and software. The managers of BCL want to improve the accuracy of the results they report to their final customers. The desired improvements are of particular interest to the managers of BCL because the levels of alpha-emitting radionuclides in samples measured at BCL are usually so low that a significant fraction of the measured signal is due to background and to the reagent material used to extract the radioactive nuclides from the samples. Also, the background and reagent signals show a significant level of random variation. The customers at BCL requested four major modifications of the software. The requested software changes have been made and tested. The present report is in two parts. The first part describes what the modifications were supposed to accomplish. The second part describes the changes on a line-by-line basis. The second part includes listings of the changed software and discusses possible steps to correct a particular error condition. Last, the second part describes the effect of truncation errors on the standard deviations calculated from samples whose signals are very nearly the same.

  4. Modifications of alpha processing software to improve calculation of limits for qualitative detection

    International Nuclear Information System (INIS)

    Kirkpatrick, J.R.

    1997-01-01

    The work described in this report was done for the Bioassay Counting Laboratory (BCL) of the Center of Excellence for Bioassay of the Analytical Services Organization at the Oak Ridge Y-12 Plant. BCL takes urine and fecal samples and tests for alpha radiation. An automated system, supplied by Canberra Industries, counts the activities in the samples and processes the results. The Canberra system includes hardware and software. The managers of BCL want to improve the accuracy of the results they report to their final customers. The desired improvements are of particular interest to the managers of BCL because the levels of alpha-emitting radionuclides in samples measured at BCL are usually so low that a significant fraction of the measured signal is due to background and to the reagent material used to extract the radioactive nuclides from the samples. Also, the background and reagent signals show a significant level of random variation. The customers at BCL requested four major modifications of the software. The requested software changes have been made and tested. The present report is in two parts. The first part describes what the modifications were supposed to accomplish. The second part describes the changes on a line-by-line basis. The second part includes listings of the changed software and discusses possible steps to correct a particular error condition. Last, the second part describes the effect of truncation errors on the standard deviations calculated from samples whose signals are very nearly the same.

  5. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    Science.gov (United States)

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the

  6. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  7. Artificial intelligence and expert systems in flight software testing

    Science.gov (United States)

    Demasie, M. P.; Muratore, J. F.

    1991-01-01

    The authors discuss the introduction of advanced information systems technologies such as artificial intelligence, expert systems, and advanced human-computer interfaces directly into Space Shuttle software engineering. The reconfiguration automation project (RAP) was initiated to coordinate this move towards 1990s software technology. The idea behind RAP is to automate several phases of the flight software testing procedure and to introduce AI and ES into space shuttle flight software testing. In the first phase of RAP, conventional tools to automate regression testing have already been developed or acquired. There are currently three tools in use.

  8. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact, with existing or reducing budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with existing system components, in a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current system, while providing a solid path for development into the future. (author)

  9. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    Allied Signal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of 'pixels' which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented
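
    The core analysis step, counting dark void regions in a grayscale scan, can be sketched with thresholding and connected-component labeling. This illustrates the general technique, not FM&T's implementation; the threshold and size values are assumptions.

      import numpy as np
      from scipy import ndimage

      def count_voids(image: np.ndarray, threshold: int = 60, min_pixels: int = 5):
          """Count dark regions (voids) of at least min_pixels in an 8-bit image."""
          mask = image < threshold                 # voids image as dark pixels
          labels, n = ndimage.label(mask)
          sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
          keep = sizes >= min_pixels               # suppress single-pixel noise
          return int(keep.sum()), sizes[keep]

      rng = np.random.default_rng(1)
      scan = rng.integers(150, 255, size=(128, 128)).astype(np.uint8)
      scan[30:36, 40:46] = 20                      # synthetic voids
      scan[80:83, 90:95] = 25
      n_voids, void_sizes = count_voids(scan)
      print(n_voids, void_sizes)                   # 2 voids with their pixel areas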

  10. Software development to implement the TxDOT culvert rating guide.

    Science.gov (United States)

    2013-05-01

    This implementation project created CULVLR: Culvert Load Rating, Version 1.0.0, a Windows-based desktop application software package that automates the process by which Texas Department of Transportation (TxDOT) engineers and their consultants ...

  11. Softwareland Chronicles: A Software Development Meta-Process Proposal

    Directory of Open Access Journals (Sweden)

    Bolanos Sandro

    2016-05-01

    Full Text Available This paper presents the software development meta-process (SD-MP as a proposal to set up software projects. Within this proposal we offer conceptual elements that help solve the war of methodologies and processes in favor of an integrating viewpoint, where the main flaws associated with conventional and agile approaches are removed. Our newly developed software platform to support the meta-process is also presented together with three case studies involving projects currently in progress, where the framework proposed in SD-MP has been applied.

  12. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    Science.gov (United States)

    Kumlander, Deniss

    The globalization of companies' operations and the competition between software vendors demand improved quality of delivered software and decreased overall cost. At the same time, this introduces many problems into the software development process, as it produces distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge communication and workflow gaps by managing the entire communication process rather than concentrating purely on the communication result.

  13. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by a computer sample workflow control (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain), to this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples working as an intermediate station. The only difference with TLA is the replacement of transport belts by personnel of the laboratory. The implementation of this virtual automation system has allowed us the achievement of the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  14. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
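
    The flavour of such a simulation can be conveyed with a toy Monte Carlo model that treats a spiral process as repeated iterations of phases with sampled durations. This is not the PATT-based model described above; the phase list and distributions are assumptions.

      # Toy Monte Carlo sketch of a spiral process: each iteration repeats the
      # same phases, and phase durations are sampled from an assumed
      # exponential distribution.
      import random

      PHASES = ["risk", "requirements", "design", "code", "test", "evaluate"]

      def spiral_duration(iterations: int, mean_days: float = 10.0) -> float:
          total = 0.0
          for _ in range(iterations):
              for _phase in PHASES:
                  total += random.expovariate(1.0 / mean_days)  # sampled phase duration
          return total

      random.seed(42)
      runs = [spiral_duration(iterations=4) for _ in range(1000)]
      print(f"mean schedule: {sum(runs) / len(runs):.1f} days")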

  15. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease

    Energy Technology Data Exchange (ETDEWEB)

    Tuszynski, Tobias; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Seese, Anita; Barthel, Henryk [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Rullmann, Michael; Hesse, Swen; Sabri, Osama [Leipzig University Medical Centre, Department of Nuclear Medicine, Leipzig (Germany); Leipzig University Medical Centre, Integrated Treatment and Research Centre (IFB) Adiposity Diseases, Leipzig (Germany); Gertz, Hermann-Josef [Leipzig University Medical Centre, Department of Psychiatry, Leipzig (Germany); Lobsien, Donald [Leipzig University Medical Centre, Department of Neuroradiology, Leipzig (Germany)

    2016-06-15

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis. (orig.)
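
    The two quantities the study compares are straightforward to compute: SUVR as mean VOI uptake over mean cerebellar-cortex uptake, and Cohen's d as the pooled-standard-deviation group difference. A sketch with synthetic values:

      import numpy as np

      def suvr(voi_uptake: np.ndarray, cerebellum_uptake: np.ndarray) -> float:
          """Standardized uptake value ratio against the cerebellar cortex."""
          return float(voi_uptake.mean() / cerebellum_uptake.mean())

      def cohens_d(group_a: np.ndarray, group_b: np.ndarray) -> float:
          """Effect size with pooled sample variance."""
          na, nb = len(group_a), len(group_b)
          pooled_var = ((na - 1) * group_a.var(ddof=1)
                        + (nb - 1) * group_b.var(ddof=1)) / (na + nb - 2)
          return float((group_a.mean() - group_b.mean()) / np.sqrt(pooled_var))

      rng = np.random.default_rng(7)
      voi = rng.normal(2.0, 0.1, size=500)      # voxel uptake in a neocortical VOI
      cereb = rng.normal(1.3, 0.1, size=500)    # voxel uptake in cerebellar cortex
      ad_suvrs = rng.normal(1.6, 0.2, size=10)  # synthetic AD composite SUVRs
      hc_suvrs = rng.normal(1.2, 0.2, size=10)  # synthetic HC composite SUVRs
      print(f"SUVR = {suvr(voi, cereb):.2f}, d = {cohens_d(ad_suvrs, hc_suvrs):.2f}")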

  16. On the Role of Software Quality Management in Software Process Improvement

    DEFF Research Database (Denmark)

    Wiedemann Jacobsen, Jan; Kuhrmann, Marco; Münch, Jürgen

    2016-01-01

    Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities...... and a strong focus on custom review, testing, and documentation techniques, whereas a set of five selected improvement measures is almost equally addressed....

  17. Augmenting SCA project management and automation framework

    Science.gov (United States)

    Iyapparaja, M.; Sharma, Bhanupriya

    2017-11-01

    In our daily life we need to keep records of things in order to manage them in a more efficient and proper way. Our company manufactures semiconductor chips and sells them to buyers. Sometimes it manufactures the entire product, sometimes only part of it, and sometimes it sells intermediate products obtained during manufacturing; for better management of the entire process, there is a need to keep track of all the entities involved. Materials and Methods: To overcome this problem, the need arose to develop a framework for project maintenance and for automation testing. The project management framework provides an architecture which supports managing the project by maintaining records of all requirements, the test cases that were created for testing each unit of the software, and defects raised in past years; through this, the quality of the project can be maintained. Results: The automation framework provides the architecture which supports the development and implementation of automation test scripts for the software testing process. Conclusion: To implement the project management framework, HP's Application Lifecycle Management product is used, which provides a central repository to maintain the project.

  18. Software for Evaluation of Conceptual Design

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    1998-01-01

    by the prototype, it addresses the requirements that the methods imply, and it explains the actual implementation of the prototype. Finally, it discusses what has been learned from developing and testing the prototype. In this paper it is suggested that a software tool which supports evaluation of design can be developed with a limited effort, and that such tools could support a structured evaluation process as opposed to no evaluation. Compared to manual evaluation, the introduced software-based evaluation tool offers automation of tasks, such as performing assessments, when they are based on prior evaluations...

  19. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
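
    Steps (2) and (5) of the pipeline map directly onto standard signal-processing primitives. A hedged SciPy sketch, with illustrative band edges and sampling rate rather than the authors' settings:

      import numpy as np
      from scipy import signal

      fs = 1000.0                                  # sampling rate, Hz (assumed)
      t = np.arange(0, 10, 1 / fs)
      eeg = (np.sin(2 * np.pi * 7 * t)             # synthetic 7 Hz rhythm
             + 0.5 * np.random.default_rng(0).normal(size=t.size))

      # Step 2: user-defined frequency band (theta, 4-8 Hz) via zero-phase filtering
      sos = signal.butter(4, [4, 8], btype="bandpass", fs=fs, output="sos")
      theta = signal.sosfiltfilt(sos, eeg)

      # Step 5: Welch power spectral density of the raw trace
      freqs, psd = signal.welch(eeg, fs=fs, nperseg=2048)
      print(freqs[np.argmax(psd)])                 # ~7 Hz peak expected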

  20. Automating software design system DESTA

    Science.gov (United States)

    Lovitsky, Vladimir A.; Pearce, Patricia D.

    1992-01-01

    'DESTA' is the acronym for the Dialogue Evolutionary Synthesizer of Turnkey Algorithms by means of a natural language (Russian or English) functional specification of algorithms or software being developed. DESTA represents the computer-aided and/or automatic artificial intelligence 'forgiving' system which provides users with software tools support for algorithm and/or structured program development. The DESTA system is intended to provide support for the higher levels and earlier stages of engineering design of software in contrast to conventional Computer Aided Design (CAD) systems which provide low level tools for use at a stage when the major planning and structuring decisions have already been taken. DESTA is a knowledge-intensive system. The main features of the knowledge are procedures, functions, modules, operating system commands, batch files, their natural language specifications, and their interlinks. The specific domain for the DESTA system is a high level programming language like Turbo Pascal 6.0. The DESTA system is operational and runs on an IBM PC computer.

  1. Automation of gamma-therapy

    International Nuclear Information System (INIS)

    Al'bitskij, L.L.; Brikker, I.N.; Bychkov, V.N.; Voronin, V.V.; Mirzoyan, A.R.; Rogozhin, A.S.; Sarkisyan, Yu.Kh.

    1989-01-01

    An automated control system, Aspect-2, was developed for the automation of gamma therapy on units of the Rokus series. The system consists of the following hardware and software complexes: a pre-irradiation preparation complex, Centrator-imitator; a complex, Accord, for anatomo-topographic data coding; a software complex; and a gamma-therapeutic complex, Rokus-AM. The Centrator-imitator and Rokus-AM complexes are fitted with built-in microcomputers running specially developed system software. The Rokus-AM complex has automatic punch-tape programmed control of 9 degrees of freedom of the gamma unit and treatment table, and supports 5 modes of irradiation: positional, rotating, rotating-convergent, sectoral rotating-convergent and scanning.

  2. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  3. PyDBS: an automated image processing workflow for deep brain stimulation surgery.

    Science.gov (United States)

    D'Albis, Tiziano; Haegelen, Claire; Essert, Caroline; Fernández-Vidal, Sara; Lalys, Florent; Jannin, Pierre

    2015-02-01

    Deep brain stimulation (DBS) is a surgical procedure for treating motor-related neurological disorders. DBS clinical efficacy hinges on precise surgical planning and accurate electrode placement, which in turn call upon several image processing and visualization tasks, such as image registration, image segmentation, image fusion, and 3D visualization. These tasks are often performed by a heterogeneous set of software tools, which adopt differing formats and geometrical conventions and require patient-specific parameterization or interactive tuning. To overcome these issues, we introduce in this article PyDBS, a fully integrated and automated image processing workflow for DBS surgery. PyDBS consists of three image processing pipelines and three visualization modules assisting clinicians through the entire DBS surgical workflow, from the preoperative planning of electrode trajectories to the postoperative assessment of electrode placement. The system's robustness, speed, and accuracy were assessed by means of a retrospective validation, based on 92 clinical cases. The complete PyDBS workflow achieved satisfactory results in 92 % of tested cases, with a median processing time of 28 min per patient. The results obtained are compatible with the adoption of PyDBS in clinical practice.
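
    The value of such a system lies in chaining heterogeneous steps behind one automated driver. A minimal pipeline-runner sketch of that architecture follows; the step names and data fields are placeholders for illustration, not PyDBS internals.

```python
from typing import Callable

# Each stage consumes and enriches a shared case record; real stages would
# wrap registration, segmentation and fusion tools behind a uniform interface.
Step = Callable[[dict], dict]

def register(data: dict) -> dict:
    data["registered"] = True                      # stand-in: align pre-op MR to CT
    return data

def segment(data: dict) -> dict:
    data["targets"] = ["target_left", "target_right"]  # stand-in: atlas-based segmentation
    return data

def fuse(data: dict) -> dict:
    data["fused"] = True                           # stand-in: overlay modalities for planning
    return data

def run_pipeline(steps: list[Step], data: dict) -> dict:
    for step in steps:
        data = step(data)                          # each stage feeds the next
        print(f"completed {step.__name__}: keys={sorted(data)}")
    return data

run_pipeline([register, segment, fuse], {"patient": "case_001"})
```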

  4. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mei Zhan

    2015-04-01

    Full Text Available Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM. These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a

  5. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    Science.gov (United States)

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision
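
    A minimal sketch of the multi-tiered idea with scikit-learn SVMs: a coarse first tier screens candidate regions, and a finer second tier, trained only on what tier 1 accepts, identifies the specific structure. The synthetic feature vectors and labels below stand in for image-derived descriptors and are purely illustrative.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-ins for region descriptors (e.g. texture/intensity features).
X = rng.normal(size=(600, 16))
is_animal = (X[:, 0] + X[:, 1] > 0).astype(int)            # tier 1 label
is_head = ((X[:, 2] > 0) & (is_animal == 1)).astype(int)   # tier 2 label

X_tr, X_te, y1_tr, y1_te, y2_tr, y2_te = train_test_split(
    X, is_animal, is_head, test_size=0.3, random_state=0)

# Tier 1: coarse classifier rejects regions that are not the organism at all.
tier1 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr, y1_tr)

# Tier 2: finer classifier identifies the specific structure (here, the head),
# trained only on regions that carry the tier 1 positive label.
mask = y1_tr == 1
tier2 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X_tr[mask], y2_tr[mask])

# Cascaded application: tier 2 only ever sees what tier 1 accepts.
accepted = tier1.predict(X_te) == 1
heads = tier2.predict(X_te[accepted])
print(f"tier 1 accepted {accepted.sum()} regions; tier 2 flagged {heads.sum()} heads")
```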

  6. A Practical Guide to the Technology and Adoption of Software Process Automation

    Science.gov (United States)

    1994-03-01

    IDE’s integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4

  7. The Legacy of Space Shuttle Flight Software

    Science.gov (United States)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.
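
    The paper's argument for strong statistical control of software processes can be illustrated with a standard c-chart over per-release defect counts (control limits at c-bar ± 3·sqrt(c-bar)); the counts below are invented for illustration and are not PASS data.

```python
import numpy as np

# Illustrative defect counts per release (made-up numbers, not PASS data).
defects = np.array([14, 9, 11, 7, 12, 10, 8, 13, 26, 9])

c_bar = defects.mean()
ucl = c_bar + 3 * np.sqrt(c_bar)          # upper control limit for a c-chart
lcl = max(c_bar - 3 * np.sqrt(c_bar), 0)  # lower limit, floored at zero

for i, c in enumerate(defects, start=1):
    flag = "  <-- out of control, investigate the process" if (c > ucl or c < lcl) else ""
    print(f"release {i:2d}: {c:3d} defects{flag}")
print(f"c-bar={c_bar:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
```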

  8. Automation Challenges of the 80's: What to Do until Your Integrated Library System Arrives.

    Science.gov (United States)

    Allan, Ferne C.; Shields, Joyce M.

    1986-01-01

    A medium-sized aerospace library has developed interim solutions to automation needs by using software and equipment that were available in-house in preparation for an expected integrated library system. Automated processes include authors' file of items authored by employees, journal routing (including routing slips), statistics, journal…

  9. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable, and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture ...... organizations can have important implications for SPI outcomes. Furthermore, it provides insights into how software managers can practically assess subcultures to inform decisions about and help prepare plans for SPI initiatives...

  10. System for inspection and quality assurance of software - A knowledge-based experiment with code understanding

    International Nuclear Information System (INIS)

    Das, B.K.

    1989-01-01

    This paper describes a knowledge-based prototype that inspects and quality-assures software components. The prototype model, which offers a singular representation of these components, is used to automate both the mechanical and nonmechanical activities in the quality assurance (QA) process. It is shown that the prototype, in addition to automating the QA process, provides a novel approach to understanding code. These approaches are compared with recent approaches to code understanding. The paper also presents the results of an experiment with several classes of nonsyntactic bugs. It is argued that a structured environment, as facilitated by this unique architecture, along with software development standards used in the QA process, is essential for meaningful analysis of code. 8 refs

  11. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    International Nuclear Information System (INIS)

    Crowell, Michael W

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase the reliability and scope of verification compared to 'hand' comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  12. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    Energy Technology Data Exchange (ETDEWEB)

    Crowell, Michael W [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-01-01

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
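
    Independent of the solver, the essence of such a harness is to re-run reference models and compare key results within tolerances. A minimal sketch follows; the quantity names, values and tolerance are placeholders, and the actual model execution through LiveLink for MATLAB or the Java API is elided.

```python
import math

# Reference results from a validated installation (values are placeholders).
REFERENCE = {"max_fuel_temp_K": 437.2, "peak_flux": 2.31e15}

def verify(computed: dict, reference: dict, rel_tol: float = 1e-3) -> bool:
    """Compare freshly computed results against stored reference values."""
    ok = True
    for name, ref in reference.items():
        got = computed[name]
        passed = math.isclose(got, ref, rel_tol=rel_tol)
        ok &= passed
        print(f"{name}: got {got:.4g}, expected {ref:.4g} -> {'PASS' if passed else 'FAIL'}")
    return ok

# In a real harness this dict would be filled by driving the solver through
# its scripting interface and extracting the same quantities the reference
# run produced; here it is hard-coded for illustration.
computed = {"max_fuel_temp_K": 437.3, "peak_flux": 2.312e15}
print("installation", "VERIFIED" if verify(computed, REFERENCE) else "SUSPECT")
```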

  13. Description of research interests and current work related to automating software design

    Science.gov (United States)

    Kaindl, Hermann

    1992-01-01

    Enclosed is a list of selected and recent publications. Most of these publications concern applied research in the areas of software engineering and human-computer interaction. It is felt that domain-specific knowledge plays a major role in software development. Additionally, it is believed that improvements in the general software development process (e.g., object-oriented approaches) will have to be combined with the use of large domain-specific knowledge bases.

  14. DIDACTIC AUTOMATED STATION OF COMPLEX KINEMATICS

    Directory of Open Access Journals (Sweden)

    Mariusz Sosnowski

    2014-03-01

    Full Text Available The paper presents the design, control system and software of an automated station of complex kinematics. The control interface and software were developed and built at the West Pomeranian University of Technology in Szczecin, in the Department of Automated Manufacturing Systems Engineering and Quality. The control system and software were installed to support classes that teach the programming and design of structures and systems for monitoring robot kinematic components with non-standard structures.

  15. Inselect: Automating the Digitization of Natural History Collections.

    Directory of Open Access Journals (Sweden)

    Lawrence N Hudson

    Full Text Available The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect, a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.
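
    The segment-then-crop step at the heart of such digitization workflows can be sketched with OpenCV; the Otsu-threshold recipe, area cutoff and file names below are illustrative assumptions, not Inselect's actual algorithm.

```python
import cv2

# Load a whole-drawer scan; the path is a placeholder.
image = cv2.imread("drawer_scan.png")
if image is None:
    raise SystemExit("place a drawer scan at drawer_scan.png first")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Assumes dark specimens on a light background: Otsu threshold, then
# external contours give one outline per candidate specimen.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Keep contours above a minimum area to reject dust and labels' edges.
boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]
for i, (x, y, w, h) in enumerate(boxes):
    cv2.imwrite(f"specimen_{i:04d}.png", image[y:y + h, x:x + w])  # one crop per specimen
print(f"segmented {len(boxes)} candidate specimens")
```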

  16. Completely automated measurement facility (PAVICOM) for track-detector data processing

    CERN Document Server

    Aleksandrov, A B; Feinberg, E L; Goncharova, L A; Konovalova, N S; Martynov, A G; Polukhina, N G; Roussetski, A S; Starkov, NI; Tsarev, V A

    2004-01-01

    A review of technical capabilities and investigations performed using the completely automated measuring facility (PAVICOM) is presented. This very efficient facility for track-detector data processing in the field of nuclear and high-energy particle physics has been constructed at the Lebedev Physical Institute. PAVICOM is widely used in Russia for the treatment of experimental data from track detectors (emulsion and solid-state trackers) in high- and low-energy physics, cosmic ray physics, etc. PAVICOM provides an essential improvement in the efficiency of experimental studies. In contrast to the semi-automated microscopes widely used until now, PAVICOM is capable of performing completely automated measurements of charged particle tracks in nuclear emulsions and track detectors without employing hard visual work. Track images are recorded by CCD cameras, then digitized and converted into files. Thus, experimental data processing is accelerated by approximately a thousand times. Completely autom...

  17. Coordination and organization of security software process for power information application environment

    Science.gov (United States)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process determines the success or failure of a software product. The design and development features of a security-oriented software process are discussed, as are the necessity and present significance of using such a process. Coordinated with the functional software, the process for security software and its testing is discussed in depth. The process includes requirements analysis, design, coding, debugging and testing, submission, and maintenance. For each phase, the paper proposes subprocesses that support software security. As an example, the paper applies the above process to a power information platform.

  18. Software engineering aspects of real-time programming concepts

    Science.gov (United States)

    Schoitsch, Erwin

    1986-08-01

    Real-time programming is a discipline of great importance not only in process control, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other. The second part deals with structuring and modularization of technical processes to build reliable and maintainable real time systems. Software-quality and software engineering aspects are considered throughout the paper.
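
    One of the synchronization patterns surveyed, a semaphore-based producer/consumer with timeout handling, can be sketched as follows; Python threads stand in here for the real-time tasking constructs discussed, and the timings are illustrative.

```python
import threading
import collections
import time

buffer = collections.deque()
items = threading.Semaphore(0)   # counts items available to the consumer
lock = threading.Lock()          # protects the shared buffer (critical region)

def producer():
    for i in range(5):
        time.sleep(0.1)          # simulate a measurement arriving
        with lock:
            buffer.append(i)
        items.release()          # signal: one more item is available

def consumer():
    while True:
        # Timeout handling: do not block forever if the producer has died.
        if not items.acquire(timeout=1.0):
            print("consumer: timeout, assuming producer stopped")
            return
        with lock:
            item = buffer.popleft()
        print(f"consumer: got {item}")

t1, t2 = threading.Thread(target=producer), threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
```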

  19. Software problems in library automation in India

    OpenAIRE

    Francis, A. T.

    1998-01-01

    Important software problems faced by library professionals in India are analysed, and various compatibility and suitability issues in the selection of library software are pointed out. The paper also notes that these problems have affected the progress of library computerisation. Up-to-date and detailed information on the software packages available in India can prevent several issues that may arise in the course of computerisation. An agency/mechanism to continuously evaluate the software packages may be ...

  20. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. ...although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each

  1. A New Automated Instrument Calibration Facility at the Savannah River Site

    International Nuclear Information System (INIS)

    Polz, E.; Rushton, R.O.; Wilkie, W.H.; Hancock, R.C.

    1998-01-01

    The Health Physics Instrument Calibration Facility at the Savannah River Site in Aiken, SC was expressly designed and built to calibrate portable radiation survey instruments. The facility incorporates recent advances in automation technology, building layout and construction, and computer software to improve the calibration process. Nine new calibration systems automate instrument calibration and data collection. The building is laid out so that instruments are moved from one area to another in a logical, efficient manner. New software and hardware integrate all functions such as shipping/receiving, work flow, calibration, testing, and report generation. Benefits include a streamlined and integrated program, improved efficiency, reduced errors, and better accuracy

  2. Automated ultrasonic inspection using PULSDAT

    International Nuclear Information System (INIS)

    Naybour, P.J.

    1992-01-01

    PULSDAT (Portable Ultrasonic Data Acquisition Tool) is a system for recording the data from single probe automated ultrasonic inspections. It is one of a range of instruments and software developed by Nuclear Electric to carry out a wide variety of high quality ultrasonic inspections. These vary from simple semi-automated inspections through to multi-probe, highly automated ones. PULSDAT runs under the control of MIPS software, and collects data which is compatible with the GUIDE data display system. PULSDAT is therefore fully compatible with Nuclear Electric's multi-probe inspection systems and utilises all the reliability and quality assurance of the software. It is a rugged, portable system that can be used in areas of difficult access. The paper discusses the benefits of automated inspection and gives an outline of the main features of PULSDAT. Since April 1990 PULSDAT has been used in several applications within Nuclear Electric and this paper presents two examples: the first is a ferritic set-through nozzle and the second is an austenitic fillet weld. (Author)

  3. IDAPS (Image Data Automated Processing System) System Description

    Science.gov (United States)

    1988-06-24

    This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed

  4. Overview of the software inspection process

    Energy Technology Data Exchange (ETDEWEB)

    Lane, G.L.; Dabbs, R. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial introduces attendees to the Inspection Process and teaches them how to organize and participate in a software inspection. The tutorial advocates the benefits of inspections and encourages attendees to socialize the inspection process in their organizations.

  5. Automated complex for research of electric drives control

    Science.gov (United States)

    Avlasko, P. V.; Antonenko, D. A.

    2018-05-01

    The article describes an automated complex intended for research on various control modes of electric motors, including the doubly-fed induction motor. The National Instruments platform was chosen as the basis of the complex. The controller built into the platform is supplied with a real-time operating system for building measurement and control systems. The software, developed in the LabVIEW environment, consists of several interconnected modules located in different elements of the complex. In addition to the software for automated control of the experimental installation, a program complex was developed for modelling the processes in the electric drive. As a result, it is possible to compare simulated and experimentally obtained transient characteristics of the electric drive in various operating modes.

  6. Affective processes in human-automation interactions.

    Science.gov (United States)

    Merritt, Stephanie M

    2011-08-01

    This study contributes to the literature on automation reliance by illuminating the influences of user moods and emotions on reliance on automated systems. Past work has focused predominantly on cognitive and attitudinal variables, such as perceived machine reliability and trust. However, recent work on human decision making suggests that affective variables (i.e., moods and emotions) are also important. Drawing from the affect infusion model, significant effects of affect are hypothesized. Furthermore, a new affectively laden attitude termed liking is introduced. Participants watched video clips selected to induce positive or negative moods, then interacted with a fictitious automated system on an X-ray screening task. At five time points, important variables were assessed, including trust, liking, perceived machine accuracy, user self-perceived accuracy, and reliance. These variables, along with propensity to trust machines and state affect, were integrated in a structural equation model. Happiness significantly increased trust and liking for the system throughout the task. Liking was the only variable that significantly predicted reliance early in the task. Trust predicted reliance later in the task, whereas perceived machine accuracy and user self-perceived accuracy had no significant direct effects on reliance at any time. Affective influences on automation reliance are demonstrated, suggesting that this decision-making process may be less rational and more emotional than previously acknowledged. Liking for a new system may be key to appropriate reliance, particularly early in the task. Positive affect can be easily induced and may be a lever for increasing liking.

  7. The software and algorithms for hyperspectral data processing

    Science.gov (United States)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    The hyperspectral remote sensing technique is widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The complex has been written in C++ using the Qt framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin is compiled as a Qt plugin and is a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing by moving average, Savitzky-Golay smoothing, RGB correction, histogram transformation, and atmospheric correction. The software provides two engineering techniques developed by the authors for the solution of the atmospheric correction problem: an iterative method for refining spectral albedo parameters using Libradtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with similar ones from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages
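
    The two smoothing functions named above are standard and easy to reproduce; a minimal sketch with NumPy/SciPy follows, using a synthetic spectrum in place of a real hypercube pixel.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic noisy reflectance spectrum standing in for one hypercube pixel.
wavelengths = np.linspace(400, 1000, 301)  # nm
rng = np.random.default_rng(42)
clean = np.exp(-((wavelengths - 700) / 120) ** 2)
spectrum = clean + rng.normal(0, 0.02, wavelengths.size)

# Direct smoothing by moving average.
window = 11
moving_avg = np.convolve(spectrum, np.ones(window) / window, mode="same")

# Savitzky-Golay: fits a low-order polynomial in each window, which
# preserves peak shape better than a plain moving average.
sg = savgol_filter(spectrum, window_length=window, polyorder=3)

for name, s in [("moving average", moving_avg), ("Savitzky-Golay", sg)]:
    rmse = np.sqrt(np.mean((s - clean) ** 2))
    print(f"{name}: RMS error vs clean signal = {rmse:.4f}")
```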

  8. Integrated software tool automates MOV diagnosis

    International Nuclear Information System (INIS)

    Joshi, B.D.; Upadhyaya, B.R.

    1996-01-01

    This article reports that researchers at the University of Tennessee have developed digital signal processing software that takes the guesswork out of motor current signature analysis (MCSA). The federal testing regulations for motor-operated valves (MOV) used in nuclear power plants have recently come under critical scrutiny by the Nuclear Regulatory Commission (NRC) and the American Society of Mechanical Engineers (ASME). New ASME testing specifications mandate that all valves performing a safety function are to be tested -- not just ASME Code 1, 2 and 3 valves. The NRC will likely endorse the ASME regulations in the near future. Because of these changes, several utility companies have voluntarily expanded the scope of their in-service testing programs for MOVs, in spite of the additional expense

  9. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal: a high-quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  10. IMPROVEMENT OF AIS FOR CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2017-06-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the work of the publishing office of the Odessa National Academy of Food Technology (ONAFT) and the automation of its business processes. A complex of business process models for publishing scientific journals is developed. The organizational structure of the ONAFT Coordinating Centre of Scientific Journals' Publishing is analyzed and a model of it is created. The software “CCSJP Manager” was analyzed, its weak areas were identified, and the automated information system was modernized accordingly. The economic feasibility of the software development was substantiated and its efficiency determined. The developed software will accelerate the development of the scientific periodicals of ONAFT, which in turn will improve the academy's ratings at the global level and enhance its image and credibility.

  11. IMPROVEMENT OF AIS FOR CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    Olexiy Sakaliuk

    2017-05-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the work of the publishing office of the Odessa National Academy of Food Technology (ONAFT) and the automation of its business processes. A complex of business process models for publishing scientific journals is developed. The organizational structure of the ONAFT Coordinating Centre of Scientific Journals' Publishing is analyzed and a model of it is created. The software “CCSJP Manager” was analyzed, its weak areas were identified, and the automated information system was modernized accordingly. The economic feasibility of the software development was substantiated and its efficiency determined. The developed software will accelerate the development of the scientific periodicals of ONAFT, which in turn will improve the academy's ratings at the global level and enhance its image and credibility.

  12. SOFTWARE PROCESS ASSESSMENT AND IMPROVEMENT USING MULTICRITERIA DECISION AIDING - CONSTRUCTIVIST

    Directory of Open Access Journals (Sweden)

    Leonardo Ensslin

    2012-12-01

    Full Text Available Software process improvement and software process assessment have received special attention since the 1980s. Some models have been created, but these models rest on a normative approach, where the decision-maker’s participation in a software organization is limited to understanding which process is more relevant to each organization. The proposal of this work is to present the MCDA-C as a constructivist methodology for software process improvement and assessment. The methodology makes it possible to visualize the criteria that must be taken into account according to the decision-makers’ values in the process improvement actions, making it possible to rank actions in the light of specific organizational needs. This process helped the manager of the company studied to focus on and prioritize process improvement actions. This paper offers an empirical understanding of the application of performance evaluation to software process improvement and identifies complementary tools to the normative models presented today.

  13. Modernising ATLAS Software Build Infrastructure

    CERN Document Server

    Ritsch, Elmar; The ATLAS collaboration

    2017-01-01

    In the last year ATLAS has radically updated its software development infrastructure, hugely reducing the complexity of building releases and greatly improving build speed, flexibility and code testing. The first step in this transition was the adoption of CMake as the software build system over the older CMT. This required the development of an automated translation from the old system to the new, followed by extensive testing and improvements. This resulted in a far more standard build process that was married to the method of building ATLAS software as a series of 12 separate projects from Subversion. We then proceeded with a migration of the code base from Subversion to Git. As the Subversion repository had been structured to manage each package more or less independently, there was no simple mapping that could be used to manage the migration into Git. Instead, a specialist set of scripts that captured the software changes across official software releases was developed. With some clean up of the repositor...

  14. A hybrid artificial neural network as a software sensor for optimal control of a wastewater treatment process.

    Science.gov (United States)

    Choi, D J; Park, H

    2001-11-01

    For the control and automation of biological treatment processes, the lack of reliable on-line sensors to measure water quality parameters is one of the most important problems to overcome. Many parameters cannot be measured directly with on-line sensors. The accuracy of existing hardware sensors is also not sufficient, and maintenance problems such as electrode fouling often cause trouble. This paper deals with the development of software sensor techniques that estimate a target water quality parameter from other parameters using the correlations between them. We focus our attention on the preprocessing of noisy data and the selection of the model best suited to the situation. Problems of existing approaches are also discussed. We propose a hybrid neural network as a software sensor inferring wastewater quality parameters. Multivariate regression, artificial neural networks (ANN), and a hybrid technique that combines principal component analysis as a preprocessing stage are applied to data from industrial wastewater processes. The hybrid ANN technique shows an enhancement of prediction capability and reduces the overfitting problem of neural networks. The result shows that the hybrid ANN technique can be used to extract information from noisy data and to describe the nonlinearity of complex wastewater treatment processes.
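
    A minimal sketch of the hybrid structure described, PCA as a preprocessing stage feeding an ANN regressor, using scikit-learn on synthetic data that stands in for correlated water-quality signals; the layer sizes and component count are illustrative choices, not the paper's configuration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic stand-in: 10 easily measured, correlated signals (flows, pH,
# conductivity, ...) predicting one hard-to-measure target parameter.
X = rng.normal(size=(500, 10))
y = np.tanh(X[:, 0] + 0.5 * X[:, 1]) + 0.05 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Hybrid structure: PCA filters noise and collinearity before the ANN,
# which is the mechanism credited with reducing overfitting.
sensor = make_pipeline(
    StandardScaler(),
    PCA(n_components=4),  # keep only the dominant components
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
).fit(X_tr, y_tr)

print(f"software-sensor R^2 on held-out data: {sensor.score(X_te, y_te):.3f}")
```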

  15. Seven Processes that Enable NASA Software Engineering Technologies

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software is appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each of these is described, along with the group(s) responsible for it.

  16. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you can collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports related to software bugs captured during testing, during the integration of the different components or, even worse, problems that occur in production. Usually little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information from them that helps with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data was processed. The main goal is to assess a software process and get insights from this information.
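
    The multidimensional processing described amounts to pivoting problem reports along dimensions such as subsystem and project phase. A small sketch with pandas follows; the column names and records are assumptions for illustration, not the ALMA tracker's schema.

```python
import pandas as pd

# Hypothetical export from an issue tracking system.
reports = pd.DataFrame({
    "subsystem": ["control", "control", "correlator", "archive", "archive", "archive"],
    "phase":     ["testing", "production", "integration", "testing", "production", "production"],
    "days_open": [3, 12, 7, 2, 20, 15],
})

# Multidimensional view: where do reports cluster, and how long do they live?
summary = reports.pivot_table(index="subsystem", columns="phase",
                              values="days_open", aggfunc=["count", "mean"])
print(summary)
```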

  17. SP-100 shield design automation process using expert system and heuristic search techniques

    International Nuclear Information System (INIS)

    Marcille, T.F.; Protsik, R.; Deane, N.A.; Hoover, D.G.

    1993-01-01

    The SP-100 shield subsystem design process has been modified to utilize ENGINEOUS (Tong 1990), a GE Corporate Research and Development program. ENGINEOUS is a software system that automates the use of Computer Aided Engineering (CAE) analysis programs in the engineering design process. The shield subsystem design process incorporates a nuclear subsystems design and performance code, a two-dimensional neutral-particle transport code, several input processors and two general-purpose neutronic output processors. Coupling these programs within ENGINEOUS provides automatic transition paths between applications, with no source code modifications. ENGINEOUS captures human design knowledge, as well as information about the specific CAE applications, and stores this information in knowledge base files. The knowledge base information is used by the ENGINEOUS expert system to drive knowledge-directed and knowledge-supplemented search modules to find an optimum shield design for a given reactor definition, ensuring that specified constraints are satisfied. Alternate designs, not accommodated in the optimization design rules, can readily be explored through the use of a parametric study capability.
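
    A toy one-parameter stand-in for this kind of constrained search: pick the thinnest shield whose transmitted dose meets a limit, here under a simple exponential-attenuation model. All constants are illustrative, and where this sketch uses a closed-form dose function, ENGINEOUS drives real transport codes.

```python
import math

MU = 2.3           # effective attenuation coefficient, 1/cm (assumed)
DOSE_0 = 1.0e6     # unshielded dose at the payload plane (arbitrary units)
DOSE_LIMIT = 10.0  # design constraint

def dose(thickness_cm: float) -> float:
    """Illustrative exponential attenuation through a slab shield."""
    return DOSE_0 * math.exp(-MU * thickness_cm)

# The dose curve is monotone in thickness, so bisection finds the minimum
# thickness satisfying the constraint (a stand-in for the search modules).
lo, hi = 0.0, 50.0
while hi - lo > 1e-4:
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if dose(mid) > DOSE_LIMIT else (lo, mid)

print(f"minimum thickness meeting dose limit: {hi:.3f} cm (dose {dose(hi):.2f})")
```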

  18. Software Defined Common Processing System (SDCPS), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated proposes the Software Defined Common Processing System (SDCPS) program to facilitate the development of a Software Defined Radio...

  19. PONDEROSA-C/S: client-server based software package for automated protein 3D structure determination.

    Science.gov (United States)

    Lee, Woonghee; Stark, Jaime L; Markley, John L

    2014-11-01

    Peak-picking Of Noe Data Enabled by Restriction Of Shift Assignments-Client Server (PONDEROSA-C/S) builds on the original PONDEROSA software (Lee et al. in Bioinformatics 27:1727-1728. doi: 10.1093/bioinformatics/btr200, 2011) and includes improved features for structure calculation and refinement. PONDEROSA-C/S consists of three programs: Ponderosa Server, Ponderosa Client, and Ponderosa Analyzer. PONDEROSA-C/S takes as input the protein sequence, a list of assigned chemical shifts, and nuclear Overhauser data sets ((13)C- and/or (15)N-NOESY). The output is a set of assigned NOEs and 3D structural models for the protein. Ponderosa Analyzer supports the visualization, validation, and refinement of the results from Ponderosa Server. These tools enable semi-automated NMR-based structure determination of proteins in a rapid and robust fashion. We present examples showing the use of PONDEROSA-C/S in solving structures of four proteins: two that enable comparison with the original PONDEROSA package, and two from the Critical Assessment of automated Structure Determination by NMR (Rosato et al. in Nat Methods 6:625-626. doi: 10.1038/nmeth0909-625, 2009) competition. The software package can be downloaded freely in binary format from http://pine.nmrfam.wisc.edu/download_packages.html. Registered users of the National Magnetic Resonance Facility at Madison can submit jobs to the PONDEROSA-C/S server at http://ponderosa.nmrfam.wisc.edu, where instructions and tutorials can be found. Structures are normally returned within 1-2 days.

  20. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    Science.gov (United States)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies, in addition to traditional software methods, to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM), as discussed in OMS requirements documentation.