WorldWideScience

Sample records for automatic software processing

  1. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  2. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s are performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially for the sample counting and measurement process. The sample needs to be changed and the measurement software needs to be set up for every one-hour counting period, and both of these procedures are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software has been developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
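
    As a rough illustration of the counting loop described above, the sketch below mimics the change-measure cycle in Python; the function names and timings are assumptions standing in for the real LabVIEW drivers and the GammaVision call, not the actual ASC code.

      # Sketch of the sample-changer counting loop (hypothetical API).
      import time

      COUNT_TIME_S = 3600              # one-hour counting time per sample
      MAX_SAMPLES = 30                 # changer capacity

      def advance_changer(position):
          """Placeholder for the hardware call that moves a sample into place."""
          print(f"moving sample {position} into counting position")

      def acquire_spectrum(position, live_time_s):
          """Placeholder for the GammaVision measurement; returns a file name."""
          time.sleep(0.01)             # stands in for the real counting wait
          return f"sample_{position:02d}.spc"

      def run_batch(n_samples):
          spectra = []
          for pos in range(1, min(n_samples, MAX_SAMPLES) + 1):
              advance_changer(pos)
              spectra.append(acquire_spectrum(pos, COUNT_TIME_S))
          return spectra

      print(run_batch(3))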

  3. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    Science.gov (United States)

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
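
    Read as a pipeline, the stages listed above chain naturally; a minimal orchestration sketch (with placeholder stage functions, since the toolbox itself wraps real neuroimaging routines) could look like this:

      # Sketch of the ASAP stage ordering; each stage is a placeholder.
      from functools import reduce

      def quantify(img):       return img + ["quantified"]
      def skull_strip(img):    return img + ["skull-stripped"]
      def coregister(img):     return img + ["co-registered"]
      def pv_correct(img):     return img + ["PV-corrected"]
      def normalize(img):      return img + ["normalized"]

      PIPELINE = [quantify, skull_strip, coregister, pv_correct, normalize]

      def process_dataset(raw):
          """Apply every pre-processing stage in order; the result stands in
          for a CBF map ready for statistical group analysis."""
          return reduce(lambda img, stage: stage(img), PIPELINE, raw)

      print(process_dataset(["raw ASL volume"]))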

  4. Software of the BESM-6 computer for automatic image processing from liquid-hydrogen bubble chambers

    International Nuclear Information System (INIS)

    Grebenikov, E.A.; Kiosa, M.N.; Kobzarev, K.K.; Kuznetsova, N.A.; Mironov, S.V.; Nasonova, L.P.

    1978-01-01

    A set of programs, which is used in ''road guidance'' mode on the BESM-6 computer to process picture information taken in liquid-hydrogen bubble chambers, is discussed. This mode allows the system to process data from an automatic scanner (AS) taking into account the results of manual scanning. The system hardware includes: an automatic scanner, an M-6000 mini-controller and a BESM-6 computer. The software is functionally divided into the following units: computation of event mask parameters and generation of data files controlling the AS; front-end processing of data coming from the AS; filtering of track data; simulation of AS operation and gauging of the AS reference system. To speed up overall performance, the programs which receive and decode data coming from the AS, via the M-6000 controller and the data link to the BESM-6 computer, are written in machine language.

  5. Mirion--a software package for automatic processing of mass spectrometric images.

    Science.gov (United States)

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the life sciences. In recent years, the development of new instruments employing ion sources tailored for spatial scanning has allowed the acquisition of large data sets. Subsequent data processing, however, is still a bottleneck in the analytical process, as manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds has turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and more easily than plain numbers. Image generation, thus, is a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.
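
    The imzML-based workflow the abstract describes can be approximated in a few lines: read the spectra pixel by pixel and sum the intensities near a target m/z into an image. The sketch below assumes the open pyimzml reader (not part of Mirion) and an arbitrary m/z window:

      # Build a single ion image from an imzML file (assumes pyimzml).
      import numpy as np
      from pyimzml.ImzMLParser import ImzMLParser   # pip install pyimzml

      def ion_image(imzml_path, target_mz, tol=0.1):
          p = ImzMLParser(imzml_path)
          xs = [c[0] for c in p.coordinates]
          ys = [c[1] for c in p.coordinates]
          img = np.zeros((max(ys), max(xs)))        # imzML pixels are 1-based
          for i, (x, y, _z) in enumerate(p.coordinates):
              mzs, intensities = p.getspectrum(i)
              mzs = np.asarray(mzs)
              window = np.abs(mzs - target_mz) <= tol
              img[y - 1, x - 1] = np.asarray(intensities)[window].sum()
          return img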

  6. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.

    1979-01-01

    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessities will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, produce well-understood packages such as compilers automatically, develop languages capable of producing software as output, and learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  7. Safety and reliability of automatization software

    Energy Technology Data Exchange (ETDEWEB)

    Kapp, K; Daum, R [Karlsruhe Univ. (TH) (Germany, F.R.). Lehrstuhl fuer Angewandte Informatik, Transport- und Verkehrssysteme

    1979-02-01

    Automated technical systems have to meet very high requirements concerning safety, security and reliability. Today, modern computers, especially microcomputers, are used as integral parts of those systems. In consequence, computer programs must work in a safe and reliable manner. Methods are discussed which allow the construction of safe and reliable software for automatic systems such as reactor protection systems, and which allow proof that the safety requirements are met. As a result it is shown that only the method of total software diversification can satisfy all safety requirements at tolerable cost. In order to achieve a high degree of reliability, structured and modular programming in conjunction with high-level programming languages are recommended.

  8. Proposal for future diagnosis and management of vascular tumors by using automatic software for image processing and statistic prediction.

    Science.gov (United States)

    Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I

    2015-01-01

    Infantile hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, due to their location, dimensions and fast evolution, can cause important functional and esthetic sequelae. To avoid these unfortunate consequences it is necessary to establish the exact appropriate moment to begin the treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations, correlated with imaging data and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm to accurately predict the best final results, from the esthetic and functional point of view, for a certain type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment management they received: medical therapy, sclerotherapy, surgical excision and no treatment. The serial clinical observation was performed each month and all the data were processed using CAD. The project goal was to create software that incorporated advanced methods to accurately measure the specific IH lesions, integrate medical information, and apply statistical and computational methods to correlate this information with that obtained from the processing of images. Based on these correlations, a prediction mechanism for the evolution of the hemangioma was established, which helped determine the best method of therapeutic intervention to minimize further complications.

  9. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  10. Software Process Improvement Defined

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2002-01-01

    This paper argues in favor of the development of explanatory theory on software process improvement. The last one or two decades' commitment to prescriptive approaches in software process improvement theory may contribute to the emergence of a gulf dividing theorists and practitioners... It is proposed that this divide be met by the development of theory evaluating prescriptive approaches and informing practice with a focus on the software process policymaking and process control aspects of improvement efforts...

  11. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  12. Managing Software Process Evolution

    DEFF Research Database (Denmark)

    This book focuses on the design, development, management, governance and application of evolving software processes that are aligned with changing business objectives, such as expansion to new domains or shifting to global production. In the context of an evolving business world, it examines the complete software process lifecycle, from the initial definition of a product to its systematic improvement. In doing so, it addresses difficult problems, such as how to implement processes in highly regulated domains or where to find a suitable notation system for documenting processes, and provides essential insights and tips to help readers manage process evolutions. And last but not least, it provides a wealth of examples and cases on how to deal with software evolution in practice. Reflecting these topics, the book is divided into three parts. Part 1 focuses on software business transformation...

  13. Software engineering and automatic continuous verification of scientific software

    Science.gov (United States)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness among scientists of the pitfalls of software engineering. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate, and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
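
    The "gold standard" comparison mentioned at the end works like an ordinary automated test: a numerical result is checked against an analytical solution on every run. A minimal, self-contained example of such a verification test (not taken from the Fluidity suite) is:

      # Continuous-verification style test: numerics vs. analytical solution.
      import math

      def trapezoid(f, a, b, n=10_000):
          """Composite trapezoidal rule."""
          h = (b - a) / n
          return h * (0.5 * f(a) + 0.5 * f(b)
                      + sum(f(a + i * h) for i in range(1, n)))

      def test_integral_matches_analytical():
          numeric = trapezoid(math.sin, 0.0, math.pi)
          assert abs(numeric - 2.0) < 1e-6          # exact integral is 2

      test_integral_matches_analytical()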

  14. [Development of a Software for Automatically Generated Contours in Eclipse TPS].

    Science.gov (United States)

    Xie, Zhao; Hu, Jinyou; Zou, Lian; Zhang, Weisha; Zou, Yuxin; Luo, Kelin; Liu, Xiangxiang; Yu, Luxin

    2015-03-01

    The automatic generation of planning target and auxiliary contours has been achieved in Eclipse TPS 11.0. The scripting language AutoHotkey was used to develop software for automatically generating contours in Eclipse TPS. This software is named Contour Auto Margin (CAM) and is composed of contour operation functions, script-generation visualization and script file operations. Ten cases of different cancers were selected separately; in Eclipse TPS 11.0, scripts generated by the software could not only automatically generate contours but also perform contour post-processing. For the different cancers, there was no difference between automatically generated contours and manually created contours. CAM is a user-friendly and powerful piece of software that can generate contours automatically and quickly in Eclipse TPS 11.0. With the help of CAM, plan preparation time is greatly reduced and the working efficiency of radiation therapy physicists is improved.

  15. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  16. Automatic Visualization of Software Requirements: Reactive Systems

    International Nuclear Information System (INIS)

    Castello, R.; Mili, R.; Tollis, I.G.; Winter, V.

    1999-01-01

    In this paper we present an approach that facilitates the validation of high-consequence system requirements. This approach consists of automatically generating a graphical representation from an informal document. Our choice of graphical notation is statecharts. We proceed in two steps: we first extract a hierarchical decomposition tree from a textual description, then we draw a graph that models the statechart in a hierarchical fashion. The resulting drawing is an effective requirements assessment tool that allows the end user to easily pinpoint inconsistencies and incompleteness.
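
    The two steps can be pictured with a toy example: given a hierarchical decomposition tree, emitting it as nested states already yields a statechart skeleton. The tree below and the output syntax are illustrative only, not the paper's notation:

      # Emit a hierarchical decomposition tree as nested statechart states.
      tree = ("System", [
          ("Idle", []),
          ("Active", [("Monitoring", []), ("Alarming", [])]),
      ])

      def emit_statechart(node, depth=0):
          name, children = node
          pad = "  " * depth
          if children:
              print(f"{pad}state {name} {{")
              for child in children:
                  emit_statechart(child, depth + 1)
              print(f"{pad}}}")
          else:
              print(f"{pad}state {name}")

      emit_statechart(tree)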

  17. A Survey of Automatic Code Generating Software

    Science.gov (United States)

    1988-09-01

    Excerpts recovered from the report text: a login requires a user ID that matches a name in the user's database file and must then match the password specified for that user; a successful login will allow access to... Vendor listing (fragment): Software Corp., P.O. Box 10089, Chicago, IL 60610, 312-743-2755; PRO-2, Prodata, Inc., 4477 Emerald, Suite C-100, Boise, ID 83706, 208-342-6878; GTP, Allen, Emerson.

  18. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out... to new specialized frameworks. New and specialized frameworks account for the majority of the contributions found (approx. 38%). Furthermore, we find a growing interest in success factors (approx. 16%) to aid companies in conducting SPI and in adapting agile principles and practices for SPI (approx. 10...

  19. Next generation software process improvement

    OpenAIRE

    Turnas, Daniel

    2003-01-01

    Approved for public release; distribution is unlimited. Software is often developed under a process that can at best be described as ad hoc. While it is possible to develop quality software under an ad hoc process, formal processes can be developed to help increase the overall quality of the software under development. The application of these processes allows an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Capability Maturity Model...

  20. Process mining software repositories

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2011-01-01

    Software developers' activities are in general recorded in software repositories such as version control systems, bug trackers and mail archives. While abundant information is usually present in such repositories, successful information extraction is often challenged by the necessity to

  1. Accuracy of Automatic Cephalometric Software on Landmark Identification

    Science.gov (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study assessed the accuracy of an automatic cephalometric analysis software package in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used in this study. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft-tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences of distances of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p < 0.05) were found in both horizontal and vertical directions, with the largest mean difference found for A-point (3.04 mm) in the vertical direction. Only 5 of the 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the position of landmarks in order to increase the accuracy of the cephalometric analysis.
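
    The statistical comparison described above amounts to a one-sample t-test on automatic-minus-manual coordinate differences per landmark. A sketch with synthetic numbers (not the study's data):

      # One-sample t-test on landmark coordinate differences (synthetic data).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      dx = rng.normal(loc=0.4, scale=0.8, size=30)  # x-differences in mm

      t, p = stats.ttest_1samp(dx, popmean=0.0)
      print(f"mean diff = {dx.mean():.2f} mm, t = {t:.2f}, p = {p:.4f}")
      # p < 0.05 would flag a significant offset between the two methods.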

  2. Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2016-01-01

    Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. We present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only a few theories. In particular, standard SPI models are analyzed and evaluated for applicability...

  3. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure for designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of macro coding unit is formed by a discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units in a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in an array to constitute a periodic coding metasurface generating the required four-beam radiation in specific directions. Two complicated functional metasurfaces with circularly and elliptically shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing the excellent performance of automatic design by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.
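
    The link between a coding sequence and the scattered beams can be sketched with a plain array-factor computation, where each 1-bit unit contributes a phase of 0 or pi. The unit count, spacing, and stripe coding below are illustrative assumptions, not the fabricated design:

      # Far-field array factor of a 1-bit coding metasurface (illustrative).
      # Stripe coding "0 0 1 1 0 0 1 1 ..." with half-wavelength unit spacing
      # splits the reflection into two symmetric beams (about +/-30 deg here).
      import numpy as np

      N, d = 16, 0.5                               # units per side; spacing (wavelengths)
      code = (np.arange(N) // 2) % 2               # 00110011... coding along x
      phase = np.pi * code                         # 1-bit units reflect with 0 or pi

      theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
      m = np.arange(N)
      af = [abs(np.exp(1j * (phase + 2 * np.pi * d * np.sin(t) * m)).sum())
            for t in theta]                        # phi = 0 cut; y-sum is constant
      print(f"strongest lobe at {np.degrees(theta[int(np.argmax(af))]):.1f} deg")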

  4. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

    The GRACE system is an excellent tool for calculating cross sections and for generating events of elementary processes automatically. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed so as to be a user-friendly system. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e+e- interactions are going to be calculated and summarized in a catalogue. (author)

  5. Current position on software for the automatic data acquisition system

    International Nuclear Information System (INIS)

    1988-01-01

    This report describes the current concepts for software to control the operation of the Automatic Data Acquisition System (ADAS) proposed for the Deaf Smith County, Texas, Exploratory Shaft Facility (ESF). The purpose of this report is to provide conceptual details of how the ADAS software will execute the data acquisition function, and how the software will make collected information available to the test personnel, the Data Management Group (DMG), and other authorized users. It is not intended that this report describe all of the ADAS functions in exact detail, but the concepts included herein will form the basis for the formal ADAS functional requirements definition document. 5 refs., 14 figs

  6. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analyses that interest both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software developed here can be used as a valid and useful tool for underwater motion analysis.
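
    A hedged sketch of the same measurement logic, using OpenCV's pyramidal Lucas-Kanade tracker (the same KLT family the abstract cites); the frames, reference coordinates, and 4-pixel rule are supplied by the caller, and the helper below is illustrative rather than DVP's actual code:

      # Count "manual interventions" while KLT-tracking marker points.
      import numpy as np
      import cv2

      def track_with_interventions(frames, start_pts, reference, max_err=4.0):
          """frames: list of grayscale images; reference: per-frame true (N, 2)
          marker coordinates; returns the number of corrections applied."""
          pts = np.float32(start_pts).reshape(-1, 1, 2)
          interventions = 0
          prev = frames[0]
          for i, frame in enumerate(frames[1:], start=1):
              pts, status, _err = cv2.calcOpticalFlowPyrLK(
                  prev, frame, pts, None, winSize=(21, 21), maxLevel=3)
              flat = pts.reshape(-1, 2)
              off = np.linalg.norm(flat - reference[i], axis=1)
              bad = off > max_err
              if bad.any():                        # operator stops and corrects
                  interventions += int(bad.sum())
                  flat[bad] = reference[i][bad]    # reset to reference position
              prev = frame
          return interventions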

  7. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts and the Software Process, ... and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. Empirical findings are presented which suggest a new understanding...

  8. Entropy based software processes improvement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Kriek, D.; Siemons, P.

    2009-01-01

    Actual results of software process improvement projects show different levels of success. Although many software development organisations have adopted improvement models such as CMMI, it appears to be difficult to improve software development processes in the right way, e.g. tuned to the actual

  9. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of the reading, review and analysis of books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  10. Software process in Geant4

    International Nuclear Information System (INIS)

    Cosmo, G.

    2001-01-01

    Since its earliest years of R and D, the GEANT4 simulation toolkit has been developed following software process standards which dictated the overall evolution of the project. The complexity of the software involved, the wide areas of application of the software product, the huge amount of code and category complexity, and the size and distributed nature of the Collaboration itself are all ingredients which involve and correlate a wide variety of software processes. Although in 'production' and available to the public since December 1998, the GEANT4 software product includes category domains which are still under active development. Therefore they require different treatment also in terms of improvement of the development cycle, system testing and user support. This paper describes some of the software processes as they are applied in GEANT4 for the development, testing and maintenance of the software.

  11. Next Generation Software Process Improvement

    National Research Council Canada - National Science Library

    Turnas, Daniel

    2003-01-01

    The application of these processes allows for an organization to mature. The software maturity level, and process improvement, of an organization can be measured with the Capability Maturity Model...

  12. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test question generation using discourse connectives. Journal of Computer Science and Its Application.

  13. Software Sub-system in Loading Automatic Test System for the Measurement of Power Line Filters

    Directory of Open Access Journals (Sweden)

    Yu Bo

    2017-01-01

    The loading automatic test system for the measurement of power line filters is in urgent demand. Thus, the software subsystem of the whole test system is proposed. Methods: the test system was structured on a virtual instrument framework, consisting of a lower and an upper computer, and a top-down design approach was adopted for the system and its modules, according to the measurement principle of the test system. Results: the software subsystem, including the human-machine interface, data analysis and processing software, an expert system, communication software, and control software in the lower computer, was designed and integrated into the entire test system. Conclusion: this subsystem provides a friendly software platform for the whole test system, and has many advantages such as strong functionality, high performance and low price. It not only raises the test efficiency of EMI filters, but also introduces some innovations.

  14. Software Development Standard Processes (SDSP)

    Science.gov (United States)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.; et al.

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  15. Software quality: Process or people

    Science.gov (United States)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  16. Image processing. A system for the automatic sorting of chromosomes

    International Nuclear Information System (INIS)

    Najai, Amor

    1977-01-01

    The present paper deals with two aspects of the system: an automaton (specialized hardware) dedicated to image processing, in which images are digitized, divided into sub-units, and computations are carried out on their main parameters; and software for the automatic recognition and sorting of chromosomes, implemented on a Multi-20 minicomputer connected to the automaton. (author) [fr]

  17. A system for automatic evaluation of simulation software

    Science.gov (United States)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can implicitly be viewed as attempts at modeling the end-product software. From software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  18. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
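
    A toy instance of this "numerically evaluated against a trusted result" idea, in the geo-containment spirit: a hand-written point-in-polygon check is exercised on many random test points against an independent implementation. The polygon and the use of matplotlib's Path as the reference are assumptions for illustration, not MINERVA's actual tooling:

      # Stress an implementation against an independent reference.
      import random
      from matplotlib.path import Path

      def inside(poly, pt):
          """Ray-casting point-in-polygon (the implementation under test)."""
          x, y = pt
          hit = False
          for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
              if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                  hit = not hit
          return hit

      poly = [(0, 0), (4, 0), (4, 3), (2, 5), (0, 3)]
      ref = Path(poly)
      random.seed(1)
      for _ in range(10_000):
          p = (random.uniform(-1, 6), random.uniform(-1, 6))
          assert inside(poly, p) == bool(ref.contains_point(p))
      print("agreement on 10000 random test points")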

  19. Patterns of Software Development Process

    Directory of Open Access Journals (Sweden)

    Sandro Javier Bolaños Castro

    2011-12-01

    Full Text Available "Times New Roman","serif";mso-fareast-font-family:"Times New Roman";mso-ansi-language:EN-US;mso-fareast-language:EN-US;mso-bidi-language:AR-SA">This article presents a set of patterns that can be found to perform best practices in software processes that are directly related to the problem of implementing the activities of the process, the roles involved, the knowledge generated and the inputs and outputs belonging to the process. In this work, a definition of the architecture is encouraged by using different recurrent configurations that strengthen the process and yield efficient results for the development of a software project. The patterns presented constitute a catalog, which serves as a vocabulary for communication among project participants [1], [2], and also can be implemented through software tools, thus facilitating patterns implementation [3]. Additionally, a tool that can be obtained under GPL (General Public license is provided for this purpose

  20. Software engineering processes principles and applications

    CERN Document Server

    Wang, Yingxu

    2000-01-01

    Contents: Fundamentals of the Software Engineering Process; Introduction; A Unified Framework of the Software Engineering Process; Process Algebra; Process-Based Software Engineering; Software Engineering Process System Modeling; The CMM Model; The ISO 9001 Model; The BOOTSTRAP Model; The ISO/IEC 15504 (SPICE) Model; The Software Engineering Process Reference Model: SEPRM; Software Engineering Process System Analysis; Benchmarking the SEPRM Processes; Comparative Analysis of Current Process Models; Transformation of Capability Levels Between Current Process Models; Software Engineering Process Establishment; Software Process Establish...

  1. Software framework for automatic learning of telescope operation

    Science.gov (United States)

    Rodríguez, Jose A.; Molgó, Jordi; Guerra, Dailos

    2016-07-01

    The "Gran Telescopio de Canarias" (GTC) is an optical-infrared 10-meter segmented mirror telescope at the ORM observatory in Canary Islands (Spain). The GTC Control System (GCS) is a distributed object and component oriented system based on RT-CORBA and it is responsible for the operation of the telescope, including its instrumentation. The current development state of GCS is mature and fully operational. On the one hand telescope users as PI's implement the sequences of observing modes of future scientific instruments that will be installed in the telescope and operators, in turn, design their own sequences for maintenance. On the other hand engineers develop new components that provide new functionality required by the system. This great work effort is possible to minimize so that costs are reduced, especially if one considers that software maintenance is the most expensive phase of the software life cycle. Could we design a system that allows the progressive assimilation of sequences of operation and maintenance of the telescope, through an automatic self-programming system, so that it can evolve from one Component oriented organization to a Service oriented organization? One possible way to achieve this is to use mechanisms of learning and knowledge consolidation to reduce to the minimum expression the effort to transform the specifications of the different telescope users to the operational deployments. This article proposes a framework for solving this problem based on the combination of the following tools: data mining, self-Adaptive software, code generation, refactoring based on metrics, Hierarchical Agglomerative Clustering and Service Oriented Architectures.

  2. Understanding flexible and distributed software development processes

    OpenAIRE

    Agerfalk, Par J.; Fitzgerald, Brian

    2006-01-01

    The minitrack on Flexible and Distributed Software Development Processes addresses two important and partially intertwined current themes in software development: process flexibility and globally distributed software development.

  3. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  4. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) on normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than subjective analysis; the higher response rates have been manually verified to be true positives, which indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected, in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common to both. This not only validates the consistency of the software utility but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
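
    The adaptive-threshold idea can be pictured as follows: a response is flagged when the post-intervention mean departs from the baseline mean by more than k baseline standard deviations. The k = 2 rule and the numbers are assumptions for illustration, not the utility's actual criterion:

      # Statistics-based adaptive response threshold (illustrative rule).
      import numpy as np

      def detect_response(before, after, k=2.0):
          before = np.asarray(before, float)
          threshold = k * before.std(ddof=1)       # adapts to baseline spread
          return abs(np.mean(after) - before.mean()) > threshold

      baseline = [71.8, 72.4, 70.9, 72.1, 71.5]    # synthetic variability values
      post = [75.2, 74.8, 75.9, 75.0, 74.6]
      print(detect_response(baseline, post))       # -> True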

  5. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    The application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable approach to high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design, using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  6. Software Process Improvement: Blueprints versus Recipes

    DEFF Research Database (Denmark)

    Aaen, Ivan

    2003-01-01

    Viewing software processes as blueprints emphasizes that design is separate from use, and thus that software process designers and users are independent. In the approach presented here, software processes are viewed as recipes; developers individually and collectively design their own software processes through facilitation, reflection, and improvisation. Publication date: SEP-OCT

  7. SPEM: Software Process Engineering Metamodel

    Directory of Open Access Journals (Sweden)

    Víctor Hugo Menéndez Domínguez

    2015-05-01

    All organizations involved in software development need to establish, manage and support the development work. The term "software development process" tends to unify all the activities and practices that cover those needs. Modeling the software process is one way to improve development and the quality of the resulting applications. Among all the existing languages for process modeling, those based on work products are the most suitable. One such language is SPEM (Software Process Engineering Metamodel). SPEM was created by the OMG (Object Management Group) as a high-level standard that is based on MOF (MetaObject Facility) and is a UML (Unified Modeling Language) metamodel. It constitutes a kind of ontology of software development processes. This article offers a general description of the SPEM standard. It also highlights the changes the standard has undergone between version 1.1 and version 2.0, presenting both the advantages and the disadvantages found between the two versions.

  8. Software approach to automatic patching of analog computer

    Science.gov (United States)

    1973-01-01

    The Automatic Patching Verification (APV) program is described, which provides the hybrid computer programmer with a convenient method of performing a static check of the analog portion of his study. The static check ensures that the program is patched as specified, and that the computing components being used are operating correctly. The APV language the programmer uses to specify his conditions and interconnections is similar to the FORTRAN language in syntax. The APV control program reads APV source program statements from an assigned input device. Each source program statement is processed immediately after it is read. A statement may select an analog console, set an analog mode, set a potentiometer or DAC, or read from the analog console and perform a test. Statements are read and processed sequentially. If an error condition is detected, an output occurs on an assigned output device. When an end statement is read, the test is terminated.
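
    A sequential check of this kind is easy to picture as a tiny interpreter; the keywords and the tolerance rule below are invented stand-ins for the real FORTRAN-like APV syntax:

      # Toy interpreter for an APV-like statement stream.
      def run_apv(lines, read_component):
          for lineno, line in enumerate(lines, 1):
              op, *args = line.split()
              if op == "SELECT":
                  console = args[0]                # choose an analog console
              elif op == "MODE":
                  mode = args[0]                   # set the analog mode
              elif op == "SETPOT":
                  pot, value = args[0], float(args[1])
              elif op == "TEST":
                  name, want, tol = args[0], float(args[1]), float(args[2])
                  got = read_component(name)
                  if abs(got - want) > tol:        # error goes to output device
                      print(f"line {lineno}: {name} read {got}, expected {want}")
              elif op == "END":                    # end statement terminates test
                  return
              else:
                  print(f"line {lineno}: unknown statement {op!r}")

      run_apv(["SELECT C1", "MODE IC", "SETPOT P07 0.25",
               "TEST A03 0.50 0.01", "END"],
              read_component=lambda name: 0.52)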

  9. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  10. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious, are subject to capacity limitations. ... the retrieval of information, and provide a heuristic for brand evaluation. Strategic processes govern learning and inference formation. The relative importance of both types of processes will depend on product involvement. The distinction of these two types of processes leads to some conclusions which are at variance with current notions about advertising effects. For example, the attention span problem will be relevant only for strategic processes, not for automatic processes, a certain amount of learning can occur with very little conscious effort, and advertising's effect on brand evaluation may be more stable...

  11. Software process improvement, quality assurance and measurement

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Balla, K.; Kontogiannis, K.; Zou, Y.; Di Penta, M.

    2006-01-01

    The aim of this workshop was to present and discuss emergent software quality improvement approaches, with an emphasis on practical applications. Different views on the improvement of software processes, software products, and their interrelations, have been addressed during the workshop.

  12. Development of automatic navigation measuring system using template-matching software in image guided neurosurgery

    International Nuclear Information System (INIS)

    Watanabe, Yohei; Hayashi, Yuichiro; Fujii, Masazumi; Wakabayashi, Toshihiko; Kimura, Miyuki; Tsuzaka, Masatoshi; Sugiura, Akihiro

    2010-01-01

    An image-guided neurosurgery and neuronavigation system based on magnetic resonance imaging has been used as an indispensable tool for resection of brain tumors. Therefore, accuracy of the neuronavigation system, provided by periodic quality assurance (QA), is essential for image-guided neurosurgery. Two types of accuracy index, fiducial registration error (FRE) and target registration error (TRE), have been used to evaluate navigation accuracy. FRE shows navigation accuracy on points that have been registered. On the other hand, TRE shows navigation accuracy on points such as tumor, skin, and fiducial markers. This study shows that TRE is more reliable than FRE. However, calculation of TRE is a time-consuming, subjective task. Software for QA was developed to compute TRE. This software calculates TRE automatically by an image processing technique, such as automatic template matching. TRE was calculated by the software and compared with the results obtained by manual calculation. Using the software made it possible to achieve a reliable QA system. (author)
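
    The automatic TRE measurement can be sketched with standard template matching: locate a fiducial marker in the QA image and measure its offset from the navigation-predicted position. The pixel scale and function interface below are assumptions, not the paper's implementation:

      # TRE from template matching (OpenCV), in millimetres.
      import numpy as np
      import cv2

      def target_registration_error(image, template, predicted_xy, mm_per_px=0.5):
          res = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
          _lo, _hi, _loloc, hiloc = cv2.minMaxLoc(res)   # best-match corner
          h, w = template.shape[:2]
          detected = np.array([hiloc[0] + w / 2, hiloc[1] + h / 2])
          return mm_per_px * np.linalg.norm(detected - np.asarray(predicted_xy))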

  13. FITS Liberator: Image processing software

    Science.gov (United States)

    Lindberg Christensen, Lars; Nielsen, Lars Holm; Nielsen, Kaspar K.; Johansen, Teis; Hurt, Robert; de Martin, David

    2012-06-01

    The ESA/ESO/NASA FITS Liberator makes it possible to process and edit astronomical science data in the FITS format to produce stunning images of the universe. Formerly a plugin for Adobe Photoshop, the current version of FITS Liberator is a stand-alone application and no longer requires Photoshop. This image processing software makes it possible to create color images using raw observations from a range of telescopes; the FITS Liberator continues to support the FITS and PDS formats, preferred by astronomers and planetary scientists respectively, which enables data to be processed from a wide range of telescopes and planetary probes, including ESO's Very Large Telescope, the NASA/ESA Hubble Space Telescope, NASA's Spitzer Space Telescope, ESA's XMM-Newton Telescope and Cassini-Huygens or Mars Reconnaissance Orbiter.

  14. Software process improvement in the NASA software engineering laboratory

    Science.gov (United States)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  15. Developing free software for automatic registration for the quality control of IMRT with films

    International Nuclear Information System (INIS)

    Moral, F. del; Meilan, E.; Pereira, L.; Salvador, F.; Munoz, V.; Salgado, M.

    2011-01-01

    In this work, carried out as part of the commissioning of e-JMRT, a Monte Carlo calculation network for IMRT planning, software has been developed for the automatic registration of the film image with the results of the planning system.

  16. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  17. Interface unit and software of X-ray television automatic machine

    International Nuclear Information System (INIS)

    Molodykh, V.A.; Yamanaev, M.S.

    1983-01-01

    A description of the interface unit and specialized software of an X-ray television automatic machine is presented. An algorithm is proposed for automatic defect survey and measurement of defect geometric parameters, with subsequent estimation of control quality in accordance with technical norms. Experimental results on the quality of welded joints of steel tubes obtained using the above system are summarized

  18. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)

    2008-06-15

    - Technical aspect: It is intended that digital I and C software be safe and reliable, and the project results help the software acquire a license. The software verification techniques resulting from this project can be used for digital I and C in nuclear power plants (NPPs) in the future. This research presents many meaningful verification results on digital protection logic and suggests an I and C software testing strategy. These results apply to the verification of nuclear fusion devices, accelerators, nuclear waste management and nuclear medical devices that require dependable software and highly reliable controllers. Moreover, they can be used for military, medical or aerospace-related software. - Economical and industrial aspect: Since the safety of digital I and C software is highly important, it is essential that the software be verified, but verification and licence acquisition for digital I and C software carry high costs. This project benefits the domestic economy by using the introduced verification and testing techniques instead of foreign ones. The operation rate of NPPs will rise when NPP safety-critical software is verified with an intellectual V and V tool. It is expected that such software will substitute for safety-critical software that wholly depends on foreign suppliers. Consequently, the result of this project has high commercial value, and recognition of the software development work can spread to industrial circles. - Social and cultural aspect: People expect nuclear power generation to help relieve environmental problems because it does not emit as much harmful air pollution as other forms of power generation. To give society more trust in and expectation of nuclear power generation, people must be convinced that NPPs are highly safe systems. From that point of view, highly reliable I and C software proofed by intellectual V and V techniques can be presented as evidence

  19. The software development process in worldwide collaborations

    International Nuclear Information System (INIS)

    Amako, K.

    1998-01-01

    High energy physics experiments at future colliders are inevitably large-scale international collaborations. In these experiments, software development has to be done by a large number of physicists, software engineers and computer scientists dispersed all over the world. The major subject of this paper is to discuss various aspects of software development in the worldwide environment. These include software engineering and methodology, the software development process, and management. (orig.)

  20. Mapping social networks in software process improvement

    DEFF Research Database (Denmark)

    Tjørnehøj, Gitte; Nielsen, Peter Axel

    2005-01-01

    Software process improvement in small, agile organizations is often problematic. Model-based approaches seem to overlook problems. We have been seeking an alternative approach to overcome this through action research. Here we report on a piece of action research from which we developed an approach...... to map social networks and suggest how it can be used in software process improvement. We applied the mapping approach in a small software company to support the realization of new ways of improving software processes. The mapping approach was found useful in improving social networks, and thus furthers...... software process improvement....

  1. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer. He outlines the specific role of learning in various manifestations of intelligence. Then, based on the Markov's algorithm theory, the author discusses the notion of learning algorithm. Two main types of learning algorithms are then addressed: firstly, an 'algorithm-teacher dialogue' type sanction-based algorithm which aims at learning how to solve grammatical ambiguities in submitted texts; secondly, an algorithm related to a document system which structures semantic data automatically obtained from a set of texts in order to be able to understand by references to any question on the content of these texts

  2. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of the modified wat...

  3. Design and Implementation of Software for Resistance Welding Process Simulations

    DEFF Research Database (Denmark)

    Zhang, Wenqi

    2003-01-01

    Based on long-time engineering research and dedicated collaborations with industry, a new welding software, SORPAS, has been developed for simulation of resistance projection and spot welding processes applying the powerful finite element method (FEM). In order to make the software directly usable...... by engineers and technicians in industry, all of the important parameters in resistance welding are considered and automatically implemented into the software. With the specially designed graphic user interface for Windows, engineers (even without prior knowledge of FEM) can quickly learn and easily operate...... and utilize the software. All industrial users, including welding engineers from DaimlerChrysler, Volkswagen, PSA Peugeot Citroen, VOLVO, Siemens, ABB and so on...... equipment manufacturers, have started using the software just after taking a one-day training course. With the user-friendly facilities for flexible geometric design...

  4. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    Wide use of automatic process control systems and the usage of high-performance systems containing a number of computers (processors) provide opportunities for creating high-quality and fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and processing of big data arrays all require a high level of productivity and, at the same time, minimum time for data handling and receiving results. In order to reach the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are brought to light, and some considerations on their use in developing software for automatic process control systems are made.
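
    As a concrete illustration of the kind of technique surveyed here, the classic longest-processing-time (LPT) heuristic assigns each job, longest first, to the currently least-loaded machine. The sketch below is a generic textbook version, not one of the paper's specific algorithms, and the job durations are invented.

        import heapq

        def lpt_schedule(durations, n_machines):
            # Min-heap of (current load, machine id) for identical machines.
            loads = [(0.0, m) for m in range(n_machines)]
            heapq.heapify(loads)
            assignment = [None] * len(durations)
            # Place the longest jobs first, each on the least-loaded machine.
            for job in sorted(range(len(durations)), key=lambda i: -durations[i]):
                load, machine = heapq.heappop(loads)
                assignment[job] = machine
                heapq.heappush(loads, (load + durations[job], machine))
            makespan = max(load for load, _ in loads)
            return makespan, assignment

        print(lpt_schedule([7, 5, 4, 3, 3, 2], 3))  # makespan 9.0 on 3 machines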

  5. Process mining application in software process assessment

    NARCIS (Netherlands)

    Samalikova, J.

    2012-01-01

    Nowadays, our daily life heavily depends on software. Software is everywhere, from appliances in our homes, to safety-critical systems such as medical equipment. The failure of these software-intensive systems results in high financial losses, environmental or property damages, or even loss of life.

  6. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  7. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.

  8. SPEM: Software Process Engineering Metamodel

    OpenAIRE

    Víctor Hugo Menéndez Domínguez; María Enriqueta Castellanos Bolaños

    2015-01-01

    All organizations involved in software development need to establish, manage, and support the development work. The term "software development process" tends to unify all the activities and practices that cover those needs. Modeling the software process is one way to improve development and the quality of the resulting applications. Among all the existing languages for process modeling, those based on work products are...

  9. Development and evaluation of new semi-automatic TLD reader software

    International Nuclear Information System (INIS)

    Pathan, M.S.; Pradhan, S.M.; Palani Selvam, T.; Datta, D.

    2018-01-01

    Nowadays, technology advancement is primarily focused on creating a user-friendly environment for operating any machine, and on minimizing human error through automation of procedures. In the present study, the development and evaluation of new software for the semi-automatic TLD badge reader (TLDBR-7B) are presented. The software provides an interactive interface and is compatible with the latest Windows OS as well as the USB mode of data communication. Important new features of the software are automatic glow curve analysis for identifying any abnormality, an event log register, user-defined limits on TL count, and time of temperature stabilization for readout interruption and automatic reading resumption options
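
    The abstract does not describe how the glow curve analysis works; as a loose illustration of an automatic abnormality check of that kind, one could count glow peaks with scipy and flag curves whose peak count falls outside an expected range. All thresholds below are invented, not TLDBR-7B parameters.

        import numpy as np
        from scipy.signal import find_peaks

        def glow_curve_ok(counts, min_height=100.0, expected_peaks=(1, 3)):
            # Detect prominent peaks in the thermoluminescence glow curve.
            peaks, _ = find_peaks(np.asarray(counts, dtype=float),
                                  height=min_height, distance=10)
            # Abnormal if the number of peaks is outside the expected range.
            return expected_peaks[0] <= len(peaks) <= expected_peaks[1]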

  10. NuFTA: A CASE Tool for Automatic Software Fault Tree Analysis

    International Nuclear Information System (INIS)

    Yun, Sang Hyun; Lee, Dong Ah; Yoo, Jun Beom

    2010-01-01

    Software fault tree analysis (SFTA) is widely used for analyzing software requiring high reliability. In SFTA, experts predict failures of a system through HAZOP (Hazard and Operability study) or FMEA (Failure Mode and Effects Analysis) and draw software fault trees for those failures. Quality and cost of the software fault tree, therefore, depend on the knowledge and experience of the experts. This paper proposes a CASE tool, NuFTA, in order to assist experts in safety analysis. NuFTA automatically generates software fault trees from NuSCR formal requirements specifications. NuSCR is a formal specification language used for specifying software requirements of the KNICS RPS (Reactor Protection System) in Korea. We used previously proposed SFTA templates in order to generate SFTA automatically. NuFTA also generates logical formulae summarizing the failures' causes, and we plan to make use of these formulae through formal verification techniques

  11. Process-based software project management

    CERN Document Server

    Goodman, F Alan

    2006-01-01

    Not connecting software project management (SPM) to actual, real-world development processes can lead to a complete divorcing of SPM to software engineering that can undermine any successful software project. By explaining how a layered process architectural model improves operational efficiency, Process-Based Software Project Management outlines a new method that is more effective than the traditional method when dealing with SPM. With a clear and easy-to-read approach, the book discusses the benefits of an integrated project management-process management connection. The described tight coup

  12. A process algebra software engineering environment

    NARCIS (Netherlands)

    Diertens, B.

    2008-01-01

    In previous work we described how the process algebra based language PSF can be used in software engineering, using the ToolBus, a coordination architecture also based on process algebra, as implementation model. In this article we summarize that work and describe the software development process

  13. Imperfect Information in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.

    2007-01-01

    The process of designing high-quality software systems is one of the major issues in software engineering research. Over the years, this has resulted in numerous design methods, each with specific qualities and drawbacks. For example, the Rational Unified Process is a comprehensive design process,

  14. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...... experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting...... a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were......

  15. Automatic Generation of Supervisory Control System Software Using Graph Composition

    Science.gov (United States)

    Nakata, Hideo; Sano, Tatsuro; Kojima, Taizo; Seo, Kazuo; Uchida, Tomoyuki; Nakamura, Yasuaki

    This paper describes the automatic generation of system descriptions for SCADA (Supervisory Control And Data Acquisition) systems. The proposed method produces various types of data and programs for SCADA systems from equipment definitions using conversion rules. First, the method builds directed graphs, which represent connections between the equipment, from the equipment definitions. System descriptions are then generated using the conversion rules by analyzing these directed graphs and finding groups of equipment that involve similar operations. The method can make the conversion rules multi-level by using graph composition, which reduces the number of rules. The developer can define and manage these rules efficiently.
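
    A toy sketch of that pipeline, with invented equipment names and a single made-up conversion rule, might look as follows; the real rule format and generated outputs are not specified in the abstract.

        from collections import defaultdict

        # Toy equipment definitions: (name, type, feeds_into).
        EQUIPMENT = [
            ("pump1", "pump", "valve1"),
            ("pump2", "pump", "valve2"),
            ("valve1", "valve", "tank1"),
            ("valve2", "valve", "tank1"),
        ]

        def build_graph(equipment):
            # Directed graph of connections plus a type lookup per device.
            graph = defaultdict(list)
            types = {}
            for name, etype, target in equipment:
                graph[name].append(target)
                types[name] = etype
            return graph, types

        def generate_descriptions(graph, types):
            # Single invented conversion rule: every pump->valve edge
            # yields one interlock entry in the generated description.
            lines = []
            for src, targets in graph.items():
                for dst in targets:
                    if types.get(src) == "pump" and types.get(dst) == "valve":
                        lines.append(f"INTERLOCK {src} OPENS {dst}")
            return lines

        graph, types = build_graph(EQUIPMENT)
        print("\n".join(generate_descriptions(graph, types)))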

  16. 2016 International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Feliu, Tomas; Peña, Adriana

    2017-01-01

    This book offers a selection of papers from the 2016 International Conference on Software Process Improvement (CIMPS’16), held between the 12th and 14th of October 2016 in Aguascalientes, Aguascalientes, México. The CIMPS’16 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the different aspects of software engineering with a focus on, but not limited to, software processes, security in information and communication technology, and big data. The main topics covered include: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a clear focus on software process challenges.

  17. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation that is performed with the major semi-automatic processes, which would be more productive if a suitable...

  18. TMT approach to observatory software development process

    Science.gov (United States)

    Buur, Hanne; Subramaniam, Annapurni; Gillies, Kim; Dumas, Christophe; Bhatia, Ravinder

    2016-07-01

    The purpose of the Observatory Software System (OSW) is to integrate all software and hardware components of the Thirty Meter Telescope (TMT) to enable observations and data capture; thus it is a complex software system that is defined by four principal software subsystems: Common Software (CSW), Executive Software (ESW), Data Management System (DMS) and Science Operations Support System (SOSS), all of which have interdependencies with the observatory control systems and data acquisition systems. Therefore, the software development process and plan must consider dependencies to other subsystems, manage architecture, interfaces and design, manage software scope and complexity, and standardize and optimize use of resources and tools. Additionally, the TMT Observatory Software will largely be developed in India through TMT's workshare relationship with the India TMT Coordination Centre (ITCC) and use of Indian software industry vendors, which adds complexity and challenges to the software development process, communication and coordination of activities and priorities as well as measuring performance and managing quality and risk. The software project management challenge for the TMT OSW is thus a multi-faceted technical, managerial, communications and interpersonal relations challenge. The approach TMT is using to manage this multifaceted challenge is a combination of establishing an effective geographically distributed software team (Integrated Product Team) with strong project management and technical leadership provided by the TMT Project Office (PO) and the ITCC partner to manage plans, process, performance, risk and quality, and to facilitate effective communications; establishing an effective cross-functional software management team composed of stakeholders, OSW leadership and ITCC leadership to manage dependencies and software release plans, technical complexities and change to approved interfaces, architecture, design and tool set, and to facilitate

  19. [Landmark-based automatic registration of serial cross-sectional images of Chinese digital human using Photoshop and Matlab software].

    Science.gov (United States)

    Su, Xiu-yun; Pei, Guo-xian; Yu, Bin; Hu, Yan-ling; Li, Jin; Huang, Qian; Li, Xu; Zhang, Yuan-zhi

    2007-12-01

    This paper describes automatic registration of the serial cross-sectional images of the Chinese digital human by a projective registration method based on landmarks, using the commercially available software Photoshop and Matlab. During cadaver embedment for acquisition of the Chinese digital human images, 4 rods were placed parallel to the vertical axis of the frozen cadaver to allow orientation. Projective distortion of the rod positions on the cross-sectional images was inevitable due to even slight changes in the relative position of the camera. The original cross-sectional images were first processed using Photoshop software to obtain the images of the orientation rods, and the centroid coordinate of every rod image was acquired with Matlab software. With the average coordinate value of the rods as the fiducial point, the two-dimensional projective transformation coefficients of each image were determined. Projective transformation was then carried out and projective distortion was eliminated from each original serial image. The rectified cross-sectional images were again processed using Photoshop to obtain the image of the first orientation rod, the coordinate value of the first rod image was calculated using Matlab software, and the cross-sectional images were cut into images of the same size according to the first rod's spatial coordinate, to achieve automatic registration of the serial cross-sectional images. Using Photoshop and Matlab software, projective transformation can accurately accomplish the image registration for the serial images with simpler calculation processes and easier computer processing.
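
    The abstract's key computation — fitting a 2D projective transformation from the rod centroids to reference positions — can be sketched in numpy with the standard direct linear transform. This is a generic method; whether the authors' Matlab code used exactly this formulation is an assumption.

        import numpy as np

        def fit_projective(src_pts, dst_pts):
            # Direct linear transform: each point pair contributes two rows
            # of the homogeneous system A h = 0 for the 3x3 homography H.
            rows = []
            for (x, y), (u, v) in zip(src_pts, dst_pts):
                rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
                rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
            _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
            return vt[-1].reshape(3, 3)  # right singular vector of smallest value

        def apply_projective(H, pts):
            pts = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
            mapped = pts @ H.T
            return mapped[:, :2] / mapped[:, 2:3]  # back from homogeneous coords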

  20. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes, we need to understand how practitioners communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  1. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage. For example, the manager has a large workload, and much time must be spent on the management and maintenance of a large-scale cluster system. The nodes in a large-scale cluster system very easily become chaotic: thousands of nodes are put in big rooms, so managers can easily confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms in the large-scale cluster system. Furthermore, a way to realize automatic management of the large-scale cluster system is proposed. (authors)

  2. The design of automatic software testing module for civil aviation information system

    Science.gov (United States)

    Qi, Qi; Sun, Yang

    2018-05-01

    In this paper, a practical and innovative design is carried out to meet the urgent need for an automatic testing module for civil aviation information systems. Firstly, the background and significance of the automatic testing module for civil aviation information systems are expounded, and the current research status of automatic testing modules and the advantages and disadvantages of related software are analyzed. Then, from the three aspects of macro demand, module functional requirements and module non-functional requirements, we further study the needs of the automatic testing module for civil aviation information systems. Finally, from the four aspects of module structure, module core functions, database and security, an innovative design for the automatic testing module of the civil aviation information system is made.

  3. Overview of the software inspection process

    Energy Technology Data Exchange (ETDEWEB)

    Lane, G.L.; Dabbs, R. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This tutorial introduces attendees to the Inspection Process and teaches them how to organize and participate in a software inspection. The tutorial advocates the benefits of inspections and encourages attendees to socialize the inspection process in their organizations.

  4. [Software support for automatic (computerized) work stations for dentists].

    Science.gov (United States)

    Prokhonchukov, A A; Balashov, A N; Zhizhina, N A; Pelkovskiĭ, V Iu

    1997-01-01

    The authors describe the software for a basic computer unified system of working places for dentists specializing in oral, dental surgery and orthodontics with diaries and special entries for each section. The computer system head physician--registration office--primary examination room--rooms of dental treatment, physiotherapy, and x-ray examination--is intended for use as part of local computer network of dental clinics in the MS DOS and Windows systems. Application of the system improves the labor efficiency by up to 35% and the financial efficacy by almost 30%. It also warrants legal protection of both physicians and patients.

  5. The laws of software process a new model for the production and management of software

    CERN Document Server

    Armour, Phillip G

    2003-01-01

    The Nature of Software and The Laws of Software Process; A Brief History of Knowledge; The Characteristics of Knowledge Storage Media; The Nature of Software Development; The Laws of Software Process and the Five Orders of Ignorance; The Laws of Software Process; The First Law of Software Process; The Corollary to the First Law of Software Process; The Reflexive Creation of Systems and Processes; The Lemma of Eternal Lateness; The Second Law of Software Process; The Rule of Process Bifurcation; The Dual Hypotheses of Knowledge Discovery; Armour's Observation on Software Process; The Third Law of Software Process (also kn

  6. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  7. Automatically generated acceptance test: A software reliability experiment

    Science.gov (United States)

    Protzel, Peter W.

    1988-01-01

    This study presents results of a software reliability experiment investigating the feasibility of a new error detection method. The method can be used as an acceptance test and is solely based on empirical data about the behavior of internal states of a program. The experimental design uses the existing environment of a multi-version experiment previously conducted at the NASA Langley Research Center, in which the launch interceptor problem is used as a model. This allows the controlled experimental investigation of versions with well-known single and multiple faults, and the availability of an oracle permits the determination of the error detection performance of the test. Fault interaction phenomena are observed that have an amplifying effect on the number of error occurrences. Preliminary results indicate that all faults examined so far are detected by the acceptance test. This shows promise for further investigations, and for the employment of this test method on other applications.
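
    The abstract leaves the test mechanism itself abstract. One plausible reading — learn empirical ranges of internal program states from reference runs, then flag excursions during acceptance testing — can be sketched as below. This illustrates the general idea only, not the experiment's actual instrumentation.

        class StateRangeMonitor:
            """Learn empirical min/max ranges for named internal state
            variables from trusted runs, then flag out-of-range values."""

            def __init__(self):
                self.ranges = {}

            def observe(self, state):
                # state: dict mapping variable name -> numeric value.
                for name, value in state.items():
                    lo, hi = self.ranges.get(name, (value, value))
                    self.ranges[name] = (min(lo, value), max(hi, value))

            def check(self, state):
                # Return the variables whose values escape the learned ranges.
                return [name for name, value in state.items()
                        if name in self.ranges
                        and not (self.ranges[name][0] <= value <= self.ranges[name][1])]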

  8. SEL's Software Process-Improvement Program

    Science.gov (United States)

    Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose

    1995-01-01

    The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. The results of studying over 125 FDD projects have guided the standards, management practices, technologies, and training within the division. The results of the studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified. The goals are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several process characteristics, including effort distribution and error detection rates. The SEL assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools, and training.

  9. An analysis of tools for automatic software development and automatic code generation

    OpenAIRE

    Viviana Yarel Rosales-Morales; Giner Alor-Hernández; Jorge Luis García-Alcaráz; Ramón Zatarain-Cabada; María Lucía Barrón-Estrada

    2015-01-01

    Software development is an important area of software engineering; for this reason, techniques, approaches and methods have emerged that allow its automation. This paper presents an analysis of tools for automatic software development and automatic source code generation, in order to evaluate them and determine whether or not they comply with a set of characteristics and functionalities in terms of quality. These characteristic...

  10. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  11. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  12. Software engineering processes for Class D missions

    Science.gov (United States)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  13. Development of Automatic Visceral Fat Volume Calculation Software for CT Volume Data

    Directory of Open Access Journals (Sweden)

    Mitsutaka Nemoto

    2014-01-01

    Full Text Available Objective. To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. Methods. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with a CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Results. Automatic visceral fat volume calculation results were obtained successfully for all 24 data sets, and the average calculation time was 252.7 seconds/case. The correlation coefficients between the true visceral fat volume and the automatically calculated visceral fat volume were over 0.999. Conclusions. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and was proved to have high accuracy.
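
    The region segmentation the paper names — a CT-value threshold followed by morphological analysis — can be sketched as below. The Hounsfield-unit fat range and morphology parameters are common generic choices, assumed here because the abstract does not state the paper's own values.

        import numpy as np
        from scipy import ndimage

        def fat_mask(ct_volume_hu, lo=-190, hi=-30):
            # Threshold to the adipose Hounsfield-unit range, then clean the
            # binary mask morphologically (parameters are generic choices).
            mask = (ct_volume_hu >= lo) & (ct_volume_hu <= hi)
            mask = ndimage.binary_opening(mask, iterations=2)  # remove speckle
            mask = ndimage.binary_closing(mask, iterations=2)  # fill small gaps
            return mask

        def fat_volume_ml(mask, voxel_volume_mm3):
            # Volume = voxel count x per-voxel volume; 1 ml = 1000 mm^3.
            return mask.sum() * voxel_volume_mm3 / 1000.0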

  14. Problem Diagnosis in Software Process Improvement

    DEFF Research Database (Denmark)

    Iversen, Jakob; Nielsen, Peter Axel; Nørbjerg, Jacob

    1998-01-01

    This paper addresses software process improvement. In particular it reports on action research undertaken to understand the problems with software processes of a large Danish company. It is argued that in order to understand what the specific problems are we may, on the one hand, rely on process...... models like CMM or Bootstrap. On the other hand, we may also see the specific and unique features of software processes in this company through what we call problem diagnosis. Problem diagnosis deals with eliciting problems perceived by software project managers and with forming commitment structures...... to enable process improvement to effectively take place. It is argued that problem diagnosis is a useful approach and that it has advantages over model-based assessment....

  15. Modular Algorithm Testbed Suite (MATS): A Software Framework for Automatic Target Recognition

    Science.gov (United States)

    2017-01-01

    Technical report NSWC PCD TR-2017-004, Naval Surface Warfare Center Panama City Division, Panama City, FL (31-01-2017). The goal is a flexible platform to facilitate the development and testing of ATR algorithms. To that end, NSWC PCD has created the Modular Algorithm Testbed Suite

  16. An analysis of tools for automatic software development and automatic code generation

    Directory of Open Access Journals (Sweden)

    Viviana Yarel Rosales-Morales

    2015-01-01

    Full Text Available Software development is an important area of software engineering; for this reason, techniques, approaches and methods have emerged that allow its automation. This paper presents an analysis of tools for automatic software development and automatic source code generation, in order to evaluate them and determine whether or not they comply with a set of characteristics and functionalities in terms of quality. These characteristics include effectiveness, productivity, safety and satisfaction, all assessed through a qualitative and quantitative evaluation. The tools are 1) CASE tools, 2) frameworks and 3) integrated development environments (IDEs). The evaluation was carried out to measure not only usability, but also the support they provide for automatic software development and automatic source code generation. The objective of this work is to provide a methodology and a brief review of the most important works in order to identify their main characteristics and present a comparative evaluation in qualitative and quantitative terms, with the aim of providing the information a software developer needs to make decisions when considering tools that may be useful.

  17. Semi-automatic film processing unit

    International Nuclear Information System (INIS)

    Mohamad Annuar Assadat Husain; Abdul Aziz Bin Ramli; Mohd Khalid Matori

    2005-01-01

    The design concept applied in the development of a semi-automatic film processing unit requires creativity and user input to channel the information needed to select materials and an operating system suited to the resulting design. Low cost and efficient operation are the challenges that must be faced while keeping abreast of fast technological advancement. In producing this processing unit, a few elements need to be considered in order to produce a high-quality image. Consistent movement and correct time coordination for developing and drying are among the elements that need to be controlled. Other elements that need serious attention are temperature, liquid density and the reaction time of the chemical liquids. The subsequent chemical reactions that take place cause the liquid chemicals to age, and this adversely affects the quality of the image produced. The unit is also equipped with a liquid chemical drainage system and a waste chemical tank. This unit would be useful in GP clinics, especially in rural areas that practise a manual system for developing and require low operational cost. (Author)

  18. A complete software application for automatic registration of x-ray mammography and magnetic resonance images

    Energy Technology Data Exchange (ETDEWEB)

    Solves-Llorens, J. A.; Rupérez, M. J., E-mail: mjruperez@labhuman.i3bh.es; Monserrat, C. [LabHuman, Universitat Politècnica de València, Camino de Vera s/n, 46022 Valencia (Spain); Feliu, E.; García, M. [Hospital Clínica Benidorm, Avda. Alfonso Puchades, 8, 03501 Benidorm (Alicante) (Spain); Lloret, M. [Hospital Universitari y Politècnic La Fe, Bulevar Sur, 46026 Valencia (Spain)

    2014-08-15

    Purpose: This work presents a complete and automatic software application to aid radiologists in breast cancer diagnosis. The application is a fully automated method that performs a complete registration of magnetic resonance (MR) images and x-ray (XR) images in both directions (from MR to XR and from XR to MR) and for both x-ray mammograms, craniocaudal (CC), and mediolateral oblique (MLO). This new approximation allows radiologists to mark points in the MR images and, without any manual intervention, it provides their corresponding points in both types of XR mammograms and vice versa. Methods: The application automatically segments magnetic resonance images and x-ray images using the C-Means method and the Otsu method, respectively. It compresses the magnetic resonance images in both directions, CC and MLO, using a biomechanical model of the breast that distinguishes the specific biomechanical behavior of each one of its three tissues (skin, fat, and glandular tissue) separately. It makes a projection of both compressions and registers them with the original XR images using affine transformations and nonrigid registration methods. Results: The application has been validated by two expert radiologists. This was carried out through a quantitative validation on 14 data sets in which the Euclidean distance between points marked by the radiologists and the corresponding points obtained by the application were measured. The results showed a mean error of 4.2 ± 1.9 mm for the MRI to CC registration, 4.8 ± 1.3 mm for the MRI to MLO registration, and 4.1 ± 1.3 mm for the CC and MLO to MRI registration. Conclusions: A complete software application that automatically registers XR and MR images of the breast has been implemented. The application permits radiologists to estimate the position of a lesion that is suspected of being a tumor in an imaging modality based on its position in another different modality with a clinically acceptable error. The results show that the
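
    Of the two segmentation methods the abstract names, the Otsu step for the x-ray images is simple enough to sketch with scikit-image; the C-Means segmentation of the MR volumes and the three-tissue biomechanical compression model are well beyond a short example, and nothing here reproduces the authors' actual code.

        import numpy as np
        from skimage.filters import threshold_otsu

        def segment_breast_xr(mammogram):
            # Global Otsu threshold separating breast tissue from background,
            # assuming the breast appears brighter than the surround.
            t = threshold_otsu(np.asarray(mammogram, dtype=float))
            return mammogram > t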

  19. A complete software application for automatic registration of x-ray mammography and magnetic resonance images

    International Nuclear Information System (INIS)

    Solves-Llorens, J. A.; Rupérez, M. J.; Monserrat, C.; Feliu, E.; García, M.; Lloret, M.

    2014-01-01

    Purpose: This work presents a complete and automatic software application to aid radiologists in breast cancer diagnosis. The application is a fully automated method that performs a complete registration of magnetic resonance (MR) images and x-ray (XR) images in both directions (from MR to XR and from XR to MR) and for both x-ray mammograms, craniocaudal (CC), and mediolateral oblique (MLO). This new approximation allows radiologists to mark points in the MR images and, without any manual intervention, it provides their corresponding points in both types of XR mammograms and vice versa. Methods: The application automatically segments magnetic resonance images and x-ray images using the C-Means method and the Otsu method, respectively. It compresses the magnetic resonance images in both directions, CC and MLO, using a biomechanical model of the breast that distinguishes the specific biomechanical behavior of each one of its three tissues (skin, fat, and glandular tissue) separately. It makes a projection of both compressions and registers them with the original XR images using affine transformations and nonrigid registration methods. Results: The application has been validated by two expert radiologists. This was carried out through a quantitative validation on 14 data sets in which the Euclidean distance between points marked by the radiologists and the corresponding points obtained by the application were measured. The results showed a mean error of 4.2 ± 1.9 mm for the MRI to CC registration, 4.8 ± 1.3 mm for the MRI to MLO registration, and 4.1 ± 1.3 mm for the CC and MLO to MRI registration. Conclusions: A complete software application that automatically registers XR and MR images of the breast has been implemented. The application permits radiologists to estimate the position of a lesion that is suspected of being a tumor in an imaging modality based on its position in another different modality with a clinically acceptable error. The results show that the

  20. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture ...... organizations can have important implications for SPI outcomes. Furthermore, it provides insights into how software managers can practically assess subcultures to inform decisions about and help prepare plans for SPI initiatives....

  1. The software and algorithms for hyperspectral data processing

    Science.gov (United States)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing techniques are widely used for collecting and processing information about the Earth's surface objects. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of the software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. The complex has been written in the C++ language using the QT framework and OpenGL for graphical data visualization. The software has a flexible structure that consists of a set of independent plugins. Each plugin was compiled as a Qt Plugin and represents a Windows dynamic library (dll). Plugins can be categorized in terms of data reading types, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing by moving average, the Savitzky-Golay smoothing technique, RGB correction, histogram transformation, and atmospheric correction. The software provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method of refinement of spectral albedo parameters using Libradtran, and an analytical least squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). Also, the software supports work with spectral libraries, region of interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with spectra from spectral libraries, and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials. Also, the following advantages
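
    The smoothing functions mentioned have direct scipy/numpy counterparts; the snippet below shows the generic filters (window and polynomial order are arbitrary examples), not the toolbox's own C++ implementation.

        import numpy as np
        from scipy.signal import savgol_filter

        def smooth_spectrum(spectrum, window=11, polyorder=3):
            # Savitzky-Golay smoothing of one pixel's spectrum.
            return savgol_filter(np.asarray(spectrum, dtype=float),
                                 window_length=window, polyorder=polyorder)

        def moving_average(spectrum, width=5):
            # Direct smoothing by moving average, for comparison.
            kernel = np.ones(width) / width
            return np.convolve(np.asarray(spectrum, dtype=float), kernel, mode="same")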

  2. From Pragmatic to Systematic Software Process Improvement

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    Software process improvement (SPI) is a challenging task, as many different stakeholders, project settings, and contexts and goals need to be considered. SPI projects are often operated in a complex and volatile environment and, thus, require sound management that is resource......-intensive, requiring many stakeholders to contribute to the process assessment, analysis, design, realisation, and deployment. Although there exist many valuable SPI approaches, none address the needs of both process engineers and project managers. This article presents an Artefact-based Software Process Improvement...

  3. Evaluation of a new software tool for the automatic volume calculation of hepatic tumors. First results

    International Nuclear Information System (INIS)

    Meier, S.; Mildenberger, P.; Pitton, M.; Thelen, M.; Schenk, A.; Bourquain, H.

    2004-01-01

    Purpose: Computed tomography has become the preferred method for detecting liver carcinomas. The introduction of spiral CT added volumetric assessment of intrahepatic tumors, which was unattainable in the clinical routine with incremental CT due to complex planimetric revisions and excessive computing time. In an ongoing clinical study, a new software tool was tested for the automatic detection of tumor volume and the time needed for this procedure. Materials and methods: We analyzed patients suffering from hepatocellular carcinoma (HCC). All patients underwent treatment with repeated transcatheter chemoembolization of the hepatic artery. The volumes of the HCC lesions detected in CT were measured with the new software tool in HepaVision (MeVis, Germany). The results were compared with manual planimetric calculation of the volume performed by three independent radiologists. Results: Our first results in 16 patients show a correlation of 96.8% between the automatically and the manually calculated volumes (up to a difference of 2 ml). While the manual method of analyzing the volume of a lesion requires 2.5 minutes on average, the automatic method merely requires about 30 seconds of user interaction time. Conclusion: These preliminary results show a good correlation between automatic and manual calculations of the tumor volume. The new software tool requires less time for accurate determination of the tumor volume and can be applied in the daily clinical routine. (orig.)

  4. Flexible Software Process Lines in Practice

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2016-01-01

    Process flexibility and adaptability is a frequently discussed topic in literature, and several approaches propose techniques to improve and optimize software processes for a given organization- or project context. A software process line (SPrL) is an instrument to systematically construct...... that can be adapted to the respective context. In this article, we present an approach to construct flexible software process lines and show its practical application in the German V-Modell XT. The presented approach emerges from a 10-year research endeavor and was used to enhance the metamodel of the V......-Modell XT and to allow for improved process variability and lifecycle management. Practical dissemination and complementing empirical research show the suitability of the concept. We therefore contribute a proven approach that is presented as metamodel fragment for reuse and implementation in further...

  5. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  6. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Kræmmergaard, Pernille; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture...... CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software...

  7. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.

    1975-01-01

    The problems of automatic deciphering of radiographic pictures, the purpose of which is to draw a conclusion about the quality of the inspected product on the basis of the images of product defects in the picture, are considered. The methods of defect image recognition are listed, and the algorithms and class features of defects are described. The results of deciphering a small radiographic picture by means of the ''Minsk-22'' computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained by visual deciphering

  8. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks have led to further efforts towards improving the existing automation based software production methodology. A large number of control system upgrades were successfully performed for th...

  9. Verification and Validation in a Rapid Software Development Process

    Science.gov (United States)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  10. Competing Values in Software Process Improvement

    DEFF Research Database (Denmark)

    Mûller, Sune Dueholm; Nielsen, Peter Axel

    2013-01-01

    Purpose: The purpose of the article is to investigate the impact of organizational culture on software process improvement (SPI). Is cultural congruence between an organization and an adopted process model required? How can the level of congruence between an organizational culture and the values and assumptions underlying an adopted process model be assessed? How can cultural incongruence be managed to facilitate success of software process improvement? Design/methodology/approach: The competing values framework and its associated assessment instrument are used in a case study to establish......-step process, SPI managers establish and compare culture profiles and decide how to address identified problems. To that end the text analysis technique is offered as a web service that allows for analysis of all text-based process models and standards, and of internal process documentation. Originality...

  11. Automatic Generation of Machine Emulators: Efficient Synthesis of Robust Virtual Machines for Legacy Software Migration

    DEFF Research Database (Denmark)

    Franz, Michael; Gal, Andreas; Probst, Christian

    2006-01-01

    As older mainframe architectures become obsolete, the corresponding legacy software is increasingly executed via platform emulators running on top of more modern commodity hardware. These emulators are virtual machines that often include a combination of interpreters and just-in-time compilers. Implementing interpreters and compilers for each combination of emulated and target platform independently of each other is a redundant and error-prone task. We describe an alternative approach that automatically synthesizes specialized virtual-machine interpreters and just-in-time compilers, which then execute on top of an existing software portability platform such as Java. The result is a considerably reduced implementation effort.

  12. The automatic programming for safety-critical software in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Eom, Heung Seop; Choi, You Rark

    1998-06-01

    We defined a uniquely Korean safety-critical software development methodology by modifying Dr. Harel's statechart-based formal method in order to digitalize the reactor protection system. A software requirement specification guideline is suggested for specifying the design specification, which is the basis for requirement specification and automatic programming, driven by the shutdown parameter logic of the steam generator water level for the Wolsung 2/3/4 unit SDS No. 1, and it was simulated by binding the Graphic User Interface (GUI). We generated K&R C code automatically by utilizing the Statemate MAGNUM Sharpshooter/C code generator. The auto-generated K&R C code is machine independent and has high productivity, quality and provability. The following are the summaries of major research and development. - Set up the Korean unique safety-critical software development methodology - Developed software requirement specification guidelines - Developed software design specification guidelines - Reactor trip modeling for the steam generator water level Wolsung 2/3/4 SDS no. 1 shutdown parameter logic - Graphic panel binding with GUI. (author). 20 refs., 12 tabs., 15 figs
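
    The shutdown-parameter logic described in this record lends itself to a statechart-style illustration. The Python sketch below uses invented state names and water-level thresholds; it is not the Wolsung SDS No. 1 design, only the flavor of transition logic that such statechart models encode before C code is generated from them.

      # Statechart-style trip logic sketch. States, thresholds, and the
      # hysteresis band are hypothetical illustration values, not the
      # actual Wolsung SDS No. 1 shutdown parameter design.

      NORMAL, ALERT, TRIP = "NORMAL", "ALERT", "TRIP"

      def next_state(state: str, level_m: float) -> str:
          """Return the next state given the current steam generator level."""
          if state == NORMAL:
              return ALERT if level_m < 11.0 else NORMAL
          if state == ALERT:
              if level_m < 10.5:          # trip setpoint (illustrative)
                  return TRIP
              if level_m > 11.2:          # hysteresis before returning to NORMAL
                  return NORMAL
              return ALERT
          return TRIP                      # TRIP latches until manual reset

      state = NORMAL
      for level in [11.5, 10.9, 10.8, 10.4, 11.8]:
          state = next_state(state, level)
          print(f"level={level:5.1f} m -> {state}")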

  13. The automatic programming for safety-critical software in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Eom, Heung Seop; Choi, You Rark

    1998-06-01

    We defined a uniquely Korean safety-critical software development methodology by modifying Dr. Harel's statechart-based formal method in order to digitalize the reactor protection system. A software requirement specification guideline is suggested for specifying the design specification, which is the basis for requirement specification and automatic programming, driven by the shutdown parameter logic of the steam generator water level for the Wolsung 2/3/4 unit SDS No. 1, and it was simulated by binding the Graphic User Interface (GUI). We generated K&R C code automatically by utilizing the Statemate MAGNUM Sharpshooter/C code generator. The auto-generated K&R C code is machine independent and has high productivity, quality and provability. The following are the summaries of major research and development. - Set up the Korean unique safety-critical software development methodology - Developed software requirement specification guidelines - Developed software design specification guidelines - Reactor trip modeling for the steam generator water level Wolsung 2/3/4 SDS no. 1 shutdown parameter logic - Graphic panel binding with GUI. (author). 20 refs., 12 tabs., 15 figs

  14. Extending the agile development process to develop acceptably secure software

    NARCIS (Netherlands)

    Ben Othmane, L.; Angin, P.; Weffers, H.T.G.; Bhargava, B.

    2013-01-01

    The agile software development approach makes developing secure software challenging. Existing approaches for extending the agile development process, which enables incremental and iterative software development, fall short of providing a method for efficiently ensuring the security of the software.

  15. Towards an Evaluation Framework for Software Process Improvement

    OpenAIRE

    Cheng, Chow Kian; Permadi, Rahadian Bayu

    2009-01-01

    Software has gained an essential role in our daily life in the last decades. This condition demands high quality software. To produce high quality software, many practitioners and researchers pay close attention to the software development process. Large investments are poured into improving the software development process. Software Process Improvement (SPI) is a research area which is aimed at addressing the assessment and improvement issues in the software development process. One of the most impor...

  16. The effects of automatic spelling correction software on understanding and comprehension in compensated dyslexia: improved recall following dictation.

    Science.gov (United States)

    Hiscox, Lucy; Leonavičiūtė, Erika; Humby, Trevor

    2014-08-01

    Dyslexia is associated with difficulties in language-specific skills such as spelling, writing and reading; the difficulty in acquiring literacy skills is not a result of low intelligence or the absence of learning opportunity, but these issues will persist throughout life and could affect long-term education. Writing is a complex process involving many different functions, integrated by the working memory system; people with dyslexia have a working memory deficit, which means that concentration on writing quality may be detrimental to understanding. We confirm impaired working memory in a sample of university students with (compensated) dyslexia, and using a within-subject design with three test conditions, we show that these participants demonstrated better understanding of a piece of text if they had used automatic spelling correction software during a dictation/transcription task. We hypothesize that the use of the autocorrecting software reduced demand on working memory, by allowing word writing to be more automatic, thus enabling better processing and understanding of the content of the transcriptions and improved recall. Long-term and regular use of autocorrecting assistive software should be beneficial for people with and without dyslexia and may improve confidence, written work, academic achievement and self-esteem, which are all affected in dyslexia. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Standardising Software Processes - An Obstacle for Innovation?

    DEFF Research Database (Denmark)

    Aaen, Ivan; Pries-Heje, Jan

    2004-01-01

    Over the last 10 years CMM has achieved widespread use as a model for improving software organisations. Often CMM is used to standardise software processes across projects. In this paper we discuss this standardisation of SPI in relation to innovation, organisational size and company growth. Our discussion is empirically based on years of work and experience with companies on SPI. Concretely, our discussion is enhanced by vignette stories taken from our experience. As a result we find that standardisation focussing on process, metrics, and controls may jeopardize innovative capabilities...

  18. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.
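
    The timed Petri Net approach proposed here can be illustrated with a toy discrete-event firing loop. The sketch below uses hypothetical phase names, delays, and token counts, and it assumes unlimited parallelism, unlike a calibrated process model; it only shows the mechanics by which such a model yields a time-to-market index.

      import heapq

      # Minimal timed-Petri-net sketch of a two-phase software process
      # (design -> code). Place names, delays, and token counts are
      # hypothetical; a real process model would be far richer.

      places = {"reqs": 5, "designed": 0, "coded": 0}   # tokens = work items
      transitions = [
          ("design", "reqs", "designed", 3.0),           # (name, in, out, delay)
          ("code",   "designed", "coded", 5.0),
      ]

      clock, events = 0.0, []                            # event = (finish_time, out_place)
      while True:
          for name, src, dst, delay in transitions:      # fire every enabled transition
              while places[src] > 0:
                  places[src] -= 1
                  heapq.heappush(events, (clock + delay, dst))
          if not events:
              break
          clock, dst = heapq.heappop(events)             # advance to next completion
          places[dst] += 1

      print(f"all items coded at t={clock}")             # crude time-to-market index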

  19. Automatic image processing as a means of safeguarding nuclear material

    International Nuclear Information System (INIS)

    Kahnmeyer, W.; Willuhn, K.; Uebel, W.

    1985-01-01

    Problems involved in computerized analysis of pictures taken by automatic film or video cameras in the context of international safeguards implementation are described. They include technical ones as well as the need to establish objective criteria for assessing image information. In the near future automatic image processing systems will be useful in verifying the identity and integrity of IAEA seals. (author)

  20. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  1. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  2. The ATLAS Data Management Software Engineering Process

    CERN Document Server

    Lassnig, M; The ATLAS collaboration; Stewart, G A; Barisits, M; Beermann, T; Vigne, R; Serfon, C; Goossens, L; Nairz, A

    2013-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  3. A method of automatic data processing in radiometric control

    International Nuclear Information System (INIS)

    Adonin, V.M.; Gulyukina, N.A.; Nemirov, Yu.V.; Mogil'nitskij, M.I.

    1980-01-01

    The algorithm for automatic data processing in gamma radiography of products is described. A specific feature of the processing is its rapidity, achieved through the application of recurrent evaluation. Experimental data of by-line control are presented. The results obtained have shown the applicability of automatic signal processing to testing under industrial conditions, which would permit an increase in testing efficiency, eliminate subjectivity in the assessment of testing results, and improve working conditions
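
    The record does not specify the estimator, so the running-mean update below, a standard recurrent form, is only illustrative of why recurrent evaluation is fast: each new sample costs O(1) rather than a recomputation over the whole record.

      # Sketch of recurrent (recursive) evaluation for on-line signal
      # processing: the running mean is updated per sample instead of
      # being recomputed over all data. The actual algorithm in the
      # record is not given; this update rule is illustrative only.

      def recurrent_mean(samples):
          mean, n = 0.0, 0
          for x in samples:
              n += 1
              mean += (x - mean) / n      # O(1) update per new sample
              yield mean

      signal = [102.0, 98.5, 101.2, 97.8, 100.4]   # e.g. detector counts
      for i, m in enumerate(recurrent_mean(signal), 1):
          print(f"after sample {i}: running mean = {m:.2f}")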

  4. Software Package STATISTICA and Educational Process

    Directory of Open Access Journals (Sweden)

    Demidova Liliya

    2016-01-01

    The paper describes the main aspects of the application of the software package STATISTICA in the educational process. Technologies of data mining which can be useful for students' research have been considered. The main tools of these technologies have been discussed.

  5. Organizational Change Perspectives on Software Process Improvement

    DEFF Research Database (Denmark)

    Müller, Sune Dueholm; Mathiassen, Lars; Balshøj, Hans Henrik

    Many software organizations have engaged in Software Process Improvement (SPI) and experienced the challenges related to managing such complex organizational change efforts. As a result, there is an increasing body of research investigating change management in SPI. To provide an overview of what......, and brain perspectives. Practitioners may use these articles as a guide to SPI insights relevant to their improvement initiatives. In contrast, the impact of culture, dominance, psychic prison, flux and transformation, and politics in SPI have only received scant attention. We argue that these perspectives...

  6. Real time data acquisition and processing system software supported by RDOS

    International Nuclear Information System (INIS)

    Yang Huimin; Chou Gongchun; Hao Shuxiang

    1982-01-01

    The paper describes system software, supported by the RDOS operating system, for the management and supervision of a multiuser nuclear spectrum data acquisition and processing system built on a NOVA/840 computer. The system has manifold functions, including four direct-address data acquisition channels, one multi-dimensional nuclear data acquisition channel, four scalers and four automatic control entrances. Every channel is used competitively by each user. The system's management and all processing functions are performed by the system software. The architecture of the system can be renewed according to users' requirements with the support of the system software. This article explains the logical structure of the system software and discusses some key technical points

  7. Inter-comparison and Quality Assurance of acquisition and processing software for MUGA studies in Cuba

    International Nuclear Information System (INIS)

    Lopez, A.; Ponce, F.; Peix, A.; Gonzalez, J.; Perez, M.; Diaz, M.

    2002-01-01

    With the purpose of creating the basis for quality control and quality assurance of the acquisition and processing programs for gated cardiac blood-pool (MUGA) studies, we used the VENSTRA cardiac function phantom on 7 cameras (4 SOPHA DSX-1000, 2 GE IMAGAMMA-2001 and 1 SIEMENS HERMES) and made 3 acquisitions for each Global Left Ventricular Ejection Fraction (LVEF 30%, 60% and 80%) and for each Heart Rate (HR 40, 80 and 160 heart beats/min). The planar resolution and the planar uniformity were proper in all the equipment. Differences of less than 5% were found between the acquisition and processing programs. To evaluate the processing programs without the influence of the acquisition parameters, we used one group of these images as a software phantom and tested the semi-automatic software on all cameras. The semi-automatic protocol showed differences of less than 3% between software packages. The automatic processing software for gated cardiac studies was checked with the COST-B2 software phantom; the difference between the Left Ventricle Ejection Fractions calculated by these software packages was less than 5%, and the regional wall motion analysis was completely coincident in 93% of the cases. The use of the VENSTRA and COST-B2 phantoms confirms the correct functioning of the acquisition and LVEF calculation software for MUGA studies in 83% of Cuban nuclear medicine centers

  8. Arquitectura orientada a servicios para software de apoyo para el proceso personal de software A service oriented architecture for the implementation of the personal software process

    Directory of Open Access Journals (Sweden)

    Erick Salinas

    2011-06-01

    This work describes a service oriented architecture of a software application that facilitates the implementation of the Personal Software Process by a development team or an organization. Some of the characteristics of this software which are important to mention are extensibility and technical environment independence. These characteristics facilitate the process of adding new tools to the software development process, integrating them with the Personal Software Process independently of the operating systems and programming languages being used. The implemented software undertakes the data collection necessary for the Personal Software Process almost automatically, since the administrator must only classify the errors that may occur when a particular programming language is used, among other small tasks. This ease of use helps to make the implementation of the Personal Software Process successful with little effort required from the members of the development team.

  9. Adopting software quality measures for healthcare processes.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects and these aspects are the candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.
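
    As an illustration of how such redefined measures can be computed, the sketch below derives two of the indicators named above (throughput time and fault ratio) from hypothetical Entry to Care log records; the field names and data are invented, not taken from the study.

      from datetime import datetime

      # Two process-level indicators named in the record (throughput
      # time, fault ratio) computed from hypothetical Entry to Care
      # log records; field names and values are invented.

      FMT = "%Y-%m-%d %H:%M"
      cases = [
          {"in": "2009-01-05 08:00", "out": "2009-01-05 09:30", "faults": 0},
          {"in": "2009-01-05 08:20", "out": "2009-01-05 10:10", "faults": 1},
          {"in": "2009-01-05 09:00", "out": "2009-01-05 09:40", "faults": 0},
      ]

      minutes = [
          (datetime.strptime(c["out"], FMT)
           - datetime.strptime(c["in"], FMT)).total_seconds() / 60
          for c in cases
      ]
      fault_ratio = sum(c["faults"] > 0 for c in cases) / len(cases)

      print(f"mean throughput time: {sum(minutes) / len(minutes):.1f} min")
      print(f"fault ratio: {fault_ratio:.2f}")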

  10. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    Science.gov (United States)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U. S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the...

  11. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

    Physical data produced by large helical device (LHD) experiments are supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist amongst the data; i.e., in many cases, the calculation of one dataset requires other data. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; however, because the data are registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and the number of such programs can be easily increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
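
    The core of such a system is resolving the dependency chain before calculation. A minimal sketch, assuming a hypothetical diagnostic dependency graph (the names below are invented, not AutoAna's actual registry), using a topological sort so that raw data is processed before derived data:

      from graphlib import TopologicalSorter   # Python 3.9+

      # Sketch of the dependency resolution an automatic calculation
      # system needs: before computing a requested quantity, compute
      # everything it depends on, in order. The diagnostic names and
      # graph here are hypothetical.

      deps = {
          "te_profile": {"thomson_raw"},
          "ne_profile": {"thomson_raw", "interferometer"},
          "pressure":   {"te_profile", "ne_profile"},
      }

      def calculate(name):
          print(f"calculating {name}")        # stand-in for the real analysis code

      for name in TopologicalSorter(deps).static_order():
          calculate(name)                      # raw data first, derived data last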

  12. Beyond behaviorism: on the automaticity of higher mental processes.

    Science.gov (United States)

    Bargh, J A; Ferguson, M J

    2000-11-01

    The first 100 years of experimental psychology were dominated by 2 major schools of thought: behaviorism and cognitive science. Here the authors consider the common philosophical commitment to determinism by both schools, and how the radical behaviorists' thesis of the determined nature of higher mental processes is being pursued today in social cognition research on automaticity. In harmony with "dual process" models in contemporary cognitive science, which equate determined processes with those that are automatic and which require no intervening conscious choice or guidance, as opposed to "controlled" processes which do, the social cognition research on the automaticity of higher mental processes provides compelling evidence for the determinism of those processes. This research has revealed that social interaction, evaluation and judgment, and the operation of internal goal structures can all proceed without the intervention of conscious acts of will and guidance of the process.

  13. Software design of automatic counting system for nuclear track based on mathematical morphology algorithm

    International Nuclear Information System (INIS)

    Pan Yi; Mao Wanchong

    2010-01-01

    The parameter measurement of nuclear tracks occupies an important position in the field of nuclear technology. However, the traditional artificial counting method has many limitations. In recent years, DSP and digital image processing technology have been applied in the nuclear field more and more. For the sake of reducing the errors of visual measurement in the artificial counting method, an automatic counting system for nuclear tracks based on the DM642 real-time image processing platform is introduced in this article, which is able to effectively remove interference from the background and noise points, as well as automatically extract nuclear track points by using a mathematical morphology algorithm. (authors)
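
    The morphology idea is compact enough to sketch on a synthetic binary image, assuming SciPy: a binary opening removes isolated noise points, and connected components are then counted as tracks. The DM642 implementation itself is not public, so this only illustrates the technique, not the authors' code.

      import numpy as np
      from scipy import ndimage

      # Morphology-based track counting sketch: opening removes noise,
      # connected-component labelling counts the surviving tracks.

      img = np.zeros((64, 64), dtype=bool)
      img[10:14, 10:14] = True      # a track
      img[40:45, 30:34] = True      # another track
      img[20, 50] = True            # single-pixel noise

      opened = ndimage.binary_opening(img, structure=np.ones((3, 3)))
      labels, n_tracks = ndimage.label(opened)
      print(f"tracks counted: {n_tracks}")   # noise pixel removed -> 2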

  14. Software Process Improvement for SMEs using OMM

    OpenAIRE

    Rodríguez, Jessica

    2010-01-01

    Software Process Improvement initiatives have been implemented by many companies in order to achieve quality of products and practices. Many models like CMMI and IDEAL have been adopted as a means to gain competitive advantages over competitors and the trustworthiness of customers. Although these models have produced successful results, the inherent characteristics of SMEs make it difficult and in many cases unfeasible to implement such models, without meaning that those companies are less capable...

  15. Neural Correlates of Automatic and Controlled Auditory Processing in Schizophrenia

    Science.gov (United States)

    Morey, Rajendra A.; Mitchell, Teresa V.; Inan, Seniha; Lieberman, Jeffrey A.; Belger, Aysenil

    2009-01-01

    Individuals with schizophrenia demonstrate impairments in selective attention and sensory processing. The authors assessed differences in brain function between 26 participants with schizophrenia and 17 comparison subjects engaged in automatic (unattended) and controlled (attended) auditory information processing using event-related functional MRI. Lower regional neural activation during automatic auditory processing in the schizophrenia group was not confined to just the temporal lobe, but also extended to prefrontal regions. Controlled auditory processing was associated with a distributed frontotemporal and subcortical dysfunction. Differences in activation between these two modes of auditory information processing were more pronounced in the comparison group than in the patient group. PMID:19196926

  16. WormGender - Open-Source Software for Automatic Caenorhabditis elegans Sex Ratio Measurement.

    Directory of Open Access Journals (Sweden)

    Marta K Labocha

    Fast and quantitative analysis of animal phenotypes is one of the major challenges of current biology. Here we report the WormGender open-source software, which is designed for accurate quantification of sex ratio in Caenorhabditis elegans. The software functions include (i) automatic recognition and counting of adult hermaphrodites and males, (ii) a manual inspection feature that enables manual correction of errors, and (iii) flexibility to use new training images to optimize the software for different imaging conditions. We evaluated the performance of our software by comparing manual and automated assessment of sex ratio. Our data showed that the WormGender software provided overall accurate sex ratio measurements. We further demonstrated the usage of WormGender by quantifying the high incidence of males (him) phenotype in 27 mutant strains. Mutants of nine genes (brc-1, C30G12.6, cep-1, coh-3, him-3, him-5, him-8, skr-1, unc-86) showed significant him phenotype. WormGender is written in Java and can be installed and run on both Windows and Mac platforms. The source code is freely available together with a user manual and sample data at http://www.QuantWorm.org/. The source code and sample data are also available at http://dx.doi.org/10.6084/m9.figshare.1541248.

  17. Software Defined Common Processing System (SDCPS), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated proposes the Software Defined Common Processing System (SDCPS) program to facilitate the development of a Software Defined Radio...

  18. A general software reliability process simulation technique

    Science.gov (United States)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process, are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.
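
    A minimal sketch of such a simulated timeline follows, assuming a Poisson-like anomaly injection rate and a constant per-week detection probability; both are hypothetical parameters, not values from the described program.

      import random

      # Sketch of a simulated anomaly timeline: anomalies are injected
      # while work products are produced, and a fixed fraction of the
      # open ones is found and repaired each week. Rates are invented.

      random.seed(1)
      WEEKS, INJECT_MEAN, FIND_PROB = 40, 2.5, 0.35

      open_anomalies, timeline = 0, []
      for week in range(WEEKS):
          injected = sum(random.random() < INJECT_MEAN / 100 for _ in range(100))
          found = sum(random.random() < FIND_PROB for _ in range(open_anomalies))
          open_anomalies += injected - found
          timeline.append(open_anomalies)

      print("open anomalies per week:", timeline)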

  19. Simulation software: engineer processes before reengineering.

    Science.gov (United States)

    Lepley, C J

    2001-01-01

    People make decisions all the time using intuition. But what happens when you are asked: "Are you sure your predictions are accurate? How much will a mistake cost? What are the risks associated with this change?" Once a new process is engineered, it is difficult to analyze what would have been different if other options had been chosen. Simulating a process can help senior clinical officers solve complex patient flow problems and avoid wasted efforts. Simulation software can give you the data you need to make decisions. The author introduces concepts, methodologies, and applications of computer aided simulation to illustrate their use in making decisions to improve workflow design.

  20. The ATLAS data management software engineering process

    International Nuclear Information System (INIS)

    Lassnig, M; Garonne, V; Stewart, G A; Barisits, M; Serfon, C; Goossens, L; Nairz, A; Beermann, T; Vigne, R; Molfetas, A

    2014-01-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  1. The ATLAS data management software engineering process

    Science.gov (United States)

    Lassnig, M.; Garonne, V.; Stewart, G. A.; Barisits, M.; Beermann, T.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  2. Process mining software repositories: do developers work as expected?

    NARCIS (Netherlands)

    Poncin, W.; Serebrenik, A.; Brand, van den M.G.J.

    2012-01-01

    Modern software development commonly makes use of a multitude of software repositories. How can these help us to understand the on-going development process? Researchers of Eindhoven University of Technology design new methods revealing how software has been developed.

  3. Managing Cultural Variation in Software Process Improvement

    DEFF Research Database (Denmark)

    Kræmmergaard, Pernille; Müller, Sune Dueholm; Mathiassen, Lars

    The scale and complexity of change in software process improvement (SPI) are considerable and managerial attention to organizational culture during SPI can therefore potentially contribute to successful outcomes. However, we know little about the impact of variations in organizational subculture on SPI initiatives. On this backdrop, we report from a large scale SPI project in a Danish high-tech company, Terma. Two of its business units - Integrated Systems (ISY) and Airborne Systems (ASY) - followed similar approaches over a two year period, but with quite different outcomes. While ISY reached CMMI level 2 as planned, ASY struggled to implement even modest improvements. To explain these differences, we analyzed the underlying organizational culture within ISY and ASY using two different methods for subculture assessment. The study demonstrates how variations in culture across software...

  4. Software Replica of Minimal Living Processes

    Science.gov (United States)

    Bersini, Hugues

    2010-04-01

    There is a long tradition of software simulations in theoretical biology to complement pure analytical mathematics, which is often limited in reproducing and understanding the self-organization phenomena resulting from the non-linear and spatially grounded interactions of the huge number of diverse biological objects. Since John Von Neumann's and Alan Turing's pioneering works on self-replication and morphogenesis, proponents of artificial life have chosen to resolutely neglect a lot of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated and duplicated as such in computers. Only software development and execution make it possible to understand the way these processes are intimately interconnected in order for life to appear at the crossroad. In this paper, I will attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I will discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I will sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, Ganti's chemoton model, whose running delivers interesting take-home messages to open-minded biologists.

  5. Requirements Report Computer Software System for a Semi-Automatic Pipe Handling System and Fabrication Facility

    National Research Council Canada - National Science Library

    1980-01-01

    This report presents the requirements of the computer software that must be developed to create Pipe Detail Drawings and to support the processing of the Pipe Detail Drawings through the Pipe Shop...

  6. Usefulness of Cone-Beam Computed Tomography and Automatic Vessel Detection Software in Emergency Transarterial Embolization

    Energy Technology Data Exchange (ETDEWEB)

    Carrafiello, Gianpaolo, E-mail: gcarraf@gmail.com; Ierardi, Anna Maria, E-mail: amierardi@yahoo.it; Duka, Ejona, E-mail: ejonaduka@hotmail.com [Insubria University, Department of Radiology, Interventional Radiology (Italy); Radaelli, Alessandro, E-mail: alessandro.radaelli@philips.com [Philips Healthcare (Netherlands); Floridi, Chiara, E-mail: chiara.floridi@gmail.com [Insubria University, Department of Radiology, Interventional Radiology (Italy); Bacuzzi, Alessandro, E-mail: alessandro.bacuzzi@ospedale.varese.it [University of Insubria, Anaesthesia and Palliative Care (Italy); Bucourt, Maximilian de, E-mail: maximilian.de-bucourt@charite.de [Charité - University Medicine Berlin, Department of Radiology (Germany); Marchi, Giuseppe De, E-mail: giuseppedemarchi@email.it [Insubria University, Department of Radiology, Interventional Radiology (Italy)

    2016-04-15

    Background: This study was designed to evaluate the utility of dual phase cone beam computed tomography (DP-CBCT) and automatic vessel detection (AVD) software to guide transarterial embolization (TAE) of angiographically challenging arterial bleedings in emergency settings. Methods: Twenty patients with an arterial bleeding at computed tomography angiography and an inconclusive identification of the bleeding vessel at the initial 2D angiographic series were included. Accuracy of DP-CBCT and AVD software were defined as the ability to detect the bleeding site and the culprit arterial bleeder, respectively. Technical success was defined as the correct positioning of the microcatheter using AVD software. Clinical success was defined as successful embolization. Total volume of iodinated contrast medium and overall procedure time were registered. Results: The bleeding site was not detected by the initial angiogram in 20 % of cases, while impossibility to identify the bleeding vessel was the reason for inclusion in the remaining cases. The bleeding site was detected by DP-CBCT in 19 of 20 (95 %) patients; in one case CBCT-CT fusion was required. AVD software identified the culprit arterial branch in 18 of 20 (90 %) cases. In two cases, vessel tracking required manual marking of the candidate arterial bleeder. Technical success was 95 %. Successful embolization was achieved in all patients. Mean contrast volume injected for each patient was 77.5 ml, and mean overall procedural time was 50 min. Conclusions: C-arm CBCT and AVD software during TAE of angiographically challenging arterial bleedings is feasible and may facilitate successful embolization. Staff training in CBCT imaging and software manipulation is necessary.

  7. Monitoring caustic injuries from emergency department databases using automatic keyword recognition software.

    Science.gov (United States)

    Vignally, P; Fondi, G; Taggi, F; Pitidis, A

    2011-03-31

    In Italy the European Union Injury Database reports the involvement of chemical products in 0.9% of home and leisure accidents. The Emergency Department registry on domestic accidents in Italy and the Poison Control Centres record that 90% of cases of exposure to toxic substances occur in the home. It is not rare for the effects of chemical agents to be observed in hospitals, with a high potential risk of damage - the rate of this cause of hospital admission is double the domestic injury average. The aim of this study was to monitor the effects of injuries caused by caustic agents in Italy using automatic free-text recognition in Emergency Department medical databases. We created a Stata software program to automatically identify caustic or corrosive injury cases using an agent-specific list of keywords. We focused attention on the procedure's sensitivity and specificity. Ten hospitals in six regions of Italy participated in the study. The program identified 112 cases of injury by caustic or corrosive agents. Checking the cases by quality controls (based on manual reading of ED reports), we assessed 99 cases as true positive, i.e. 88.4% of the patients were automatically recognized by the software as being affected by caustic substances (99% CI: 80.6%- 96.2%), that is to say 0.59% (99% CI: 0.45%-0.76%) of the whole sample of home injuries, a value almost three times as high as that expected (p < 0.0001) from European codified information. False positives were 11.6% of the recognized cases (99% CI: 5.1%- 21.5%). Our automatic procedure for caustic agent identification proved to have excellent product recognition capacity with an acceptable level of excess sensitivity. Contrary to our a priori hypothesis, the automatic recognition system provided a level of identification of agents possessing caustic effects that was significantly much greater than was predictable on the basis of the values from current codifications reported in the European Database.
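
    The keyword-recognition idea at the heart of this study is easy to sketch. The authors' tool was written in Stata; the Python version below and its tiny agent-specific keyword list are only illustrative, not a reconstruction of their program.

      import re

      # Flag an emergency-department free-text report as a possible
      # caustic-injury case if it matches an agent-specific keyword
      # list. Keywords and reports here are invented examples.

      KEYWORDS = ["caustic", "corrosive", "lye", "bleach", "drain cleaner"]
      pattern = re.compile("|".join(re.escape(k) for k in KEYWORDS), re.IGNORECASE)

      reports = [
          "Patient ingested drain cleaner, burns to oral mucosa.",
          "Fall from ladder, left wrist fracture.",
      ]

      for text in reports:
          print("caustic case" if pattern.search(text) else "not flagged", "-", text)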

  8. Software Quality in the Objectory Software Development Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Demeyer, S.; Bosch, J.

    In this paper we discuss how software quality assurance is realized in Rational Objectory. Although much support is given through guidelines and checkpoints, the tool fails to provide clear goals and metrics for quality assessments and it only partially supports the phases in a measurement program.

  9. Development of Automatic Quality Check Software in Mailbox Declaration For Nuclear Fuel Fabrication Plants

    International Nuclear Information System (INIS)

    Kim, Minsu; Shim, Hye Won; Jo, Seong Yeon; Lee, Kwang Yeol; Ban, Myoung Jin

    2014-01-01

    Short Notice Random Inspection (SNRI) is a new IAEA safeguards inspection regime for bulk handling facilities, which utilizes random inspections through a mailbox system. Its main objective is to verify 100% of the flow components of the safeguarded nuclear material at such a facility. To achieve the SNRI objective, the facility operator is required to provide daily mailbox declarations to the IAEA with regard to information such as the receipt and shipment of nuclear materials. Mailbox declarations are later compared with accounting records so as to examine the accuracy and consistency of the facility operator's declaration at the time of the SNRI. The IAEA has emphasized the importance of accurate mailbox declarations and recommended that the ROK initiate its own independent quality control system in order to improve and maintain its mailbox declarations as a part of the SSAC activities. In an effort to improve the transparency of operational activities at fuel fabrication plants and to satisfy the IAEA recommendation, an automatic quality check software application has been developed to improve mailbox declarations at fabrication plants in Korea. The ROK and the IAEA have recognized the importance of providing good quality mailbox declarations for an effective and efficient SNRI at fuel fabrication plants in Korea. The SRA developed an automatic quality check software program in order to provide an independent QC system for mailbox declarations, as well as to improve their quality. Once the automatic QC system is implemented, it will improve the quality of an operator's mailbox declaration by examining data before sending it to the IAEA. The QC system will be applied to fuel fabrication plants in the first half of 2014
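
    A minimal sketch of the kind of consistency check such a QC system performs before a declaration is sent: compare the daily mailbox declaration against accounting records and report mismatches. The record layouts and field names below are hypothetical.

      # Compare daily mailbox declarations with accounting records and
      # flag inconsistencies before submission. Formats are invented.

      declared = {"2014-03-01": {"receipt_kgU": 120.0, "shipment_kgU": 80.0}}
      records  = {"2014-03-01": {"receipt_kgU": 120.0, "shipment_kgU": 85.0}}

      for day, decl in declared.items():
          for field, value in decl.items():
              booked = records.get(day, {}).get(field)
              if booked != value:
                  print(f"{day}: {field} declared {value}, accounting shows {booked}")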

  10. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

    A process is described for automatically analysing separate signal processing channels in real time, one channel per signal, in a facility with significant background noise in time-varying signals coming from transducers at selected points, for the continuous monitoring of the operating conditions of the various components of the installation. The signals are used to detect potential breakdowns, draw conclusions as to the severity of these potential breakdowns, and indicate to an operator the measures to be taken in consequence. The feature of this process is that it comprises the automatic and successive selection of each channel for the purpose of spectral analysis, the automatic processing of the signal of each selected channel to produce energy spectral density data at pre-determined frequencies, the automatic comparison of the energy spectral density data of each channel with pre-determined sets of limits varying with frequency, and the automatic indication to the operator of the condition of the various components of the installation associated with each channel and the measures to be taken depending on the set of limits [fr]

  11. Resource depletion promotes automatic processing: implications for distribution of practice.

    Science.gov (United States)

    Scheel, Matthew H

    2010-12-01

    Recent models of cognition include two processing systems: an automatic system that relies on associative learning, intuition, and heuristics, and a controlled system that relies on deliberate consideration. Automatic processing requires fewer resources and is more likely when resources are depleted. This study showed that prolonged practice on a resource-depleting mental arithmetic task promoted automatic processing on a subsequent problem-solving task, as evidenced by faster responding and more errors. Distribution of practice effects (0, 60, 120, or 180 sec. between problems) on rigidity also disappeared when groups had equal time on resource-depleting tasks. These results suggest that distribution of practice effects is reducible to resource availability. The discussion includes implications for interpreting discrepancies in the traditional distribution of practice effect.

  12. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program, ''CRITEST'', in the language PL/1 for the EC computer, intended for the automatic processing of the results of radioimmunological research, has been elaborated. The program works under the operating system of the EC computer and runs in a 60 kb region. When compiling the program, Aitken's modified algorithm was used. The program was clinically approved in the determination of a number of hormones: CTH, T4, T3, TSH. The automatic processing of radioimmunological research data on the computer makes it possible to simplify the labour-consuming analysis and to raise its accuracy

  13. THE INTERNATIONALIZATION PROCESS OF AN FASHION SOFTWARE

    Directory of Open Access Journals (Sweden)

    William Ramos

    2011-06-01

    This paper aims to analyze the internationalization process, from 1996 to 2010, of a company that develops software for the fashion industry. The theoretical foundations of the article include the theories of internationalization, with special attention to the Nordic School of International Business and the networks approach, as well as research on software firms specifically, born globals and international entrepreneurship. The survey used the technique of case study, qualitative, exploratory and descriptive. Data were collected from primary sources, such as semi-structured interviews and questionnaires, and from secondary sources. The data from the interviews and documents were repeatedly reviewed, coded and analyzed for their content and meaning, to identify the categories under consideration. Among the factors found to deviate from the theoretical approaches are the non-observance of the steps they outline and the absence of an exclusive preference for countries with lower psychic distance. Also relevant was the international orientation of the entrepreneurs, whose social networks encompassed international actors who sometimes influenced the process, confirming the networks approach.

  14. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software application, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss, etc. have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied. Some of the smoothing and sharpening filters have also been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
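
    One of the listed enhancements, contrast stretching, is compact enough to sketch. IMAGE GALLERY itself was written in Visual Basic, so the NumPy version below is only illustrative of the technique: the observed intensity range is linearly mapped onto the full 0-255 range.

      import numpy as np

      # Linear contrast stretching: map [min, max] of the image onto
      # the full 8-bit range.

      def contrast_stretch(img: np.ndarray) -> np.ndarray:
          lo, hi = float(img.min()), float(img.max())
          if hi == lo:                       # flat image: nothing to stretch
              return img.copy()
          return ((img - lo) * 255.0 / (hi - lo)).astype(np.uint8)

      img = np.array([[60, 80], [100, 120]], dtype=np.uint8)
      print(contrast_stretch(img))           # [[0 85] [170 255]]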

  15. The Usefulness of Automatic Speech Recognition (ASR Eyespeak Software in Improving Iraqi EFL Students’ Pronunciation

    Directory of Open Access Journals (Sweden)

    Lina Fathi Sidig Sidgi

    2017-02-01

    The present study focuses on determining whether automatic speech recognition (ASR) technology is reliable for improving the English pronunciation of Iraqi EFL students. Non-native learners of English are generally concerned about improving their pronunciation skills, and Iraqi students face difficulties in pronouncing English sounds that are not found in their native language (Arabic). This study is concerned with ASR and its effectiveness in overcoming this difficulty. The data were obtained from twenty participants randomly selected from first-year college students at Al-Turath University College, from the Department of English in Baghdad, Iraq. The students participated in a two-month pronunciation instruction course using the ASR Eyespeak software. At the end of the course, the students completed a questionnaire to give their opinions about the usefulness of ASR Eyespeak in improving their pronunciation. The findings of the study revealed that the students found the ASR Eyespeak software very useful in improving their pronunciation and helping them realise their pronunciation mistakes. They also reported that learning pronunciation with ASR Eyespeak was enjoyable.

  16. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--high quality software product. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  17. Test process for the safety-critical embedded software

    International Nuclear Information System (INIS)

    Sung, Ahyoung; Choi, Byoungju; Lee, Jangsoo

    2004-01-01

    Digitalization of nuclear Instrumentation and Control (I and C) systems requires high reliability of not only hardware but also software. A Verification and Validation (V and V) process is recommended for software reliability, but a more quantitative method, such as software testing, is also necessary. Most of the software in nuclear I and C systems is safety-critical embedded software. Safety-critical embedded software is specified, verified and developed according to the V and V process. Hence, two types of software testing techniques are necessary for the developed code. First, code-based software testing is required to examine the developed code. Second, after code-based software testing, testing for hardware's influence on software is required to reveal interaction faults that may cause unexpected results. We call the testing of hardware's influence on software interaction testing. In the case of safety-critical embedded software, it is also important to consider the interaction between hardware and software. Even if no faults are detected when testing either hardware or software alone, combining these components may lead to unexpected results due to the interaction. In this paper, we propose a software test process that embraces test levels, test techniques, required test tasks and documents for safety-critical embedded software. We apply the proposed test process to safety-critical embedded software as a case study, and show its effectiveness. (author)

  18. Automatic processing of multimodal tomography datasets.

    Science.gov (United States)

    Parsons, Aaron D; Price, Stephen W T; Wadeson, Nicola; Basham, Mark; Beale, Andrew M; Ashton, Alun W; Mosselmans, J Frederick W; Quinn, Paul D

    2017-01-01

    With the development of fourth-generation high-brightness synchrotrons on the horizon, the already large volume of data collected on imaging and mapping beamlines is set to increase by orders of magnitude. As such, an easy and accessible way of dealing with such large datasets as quickly as possible is required in order to be able to address the core scientific problems during the experimental data collection. Savu is an accessible and flexible big data processing framework that is able to deal with both the variety and the volume of data of multimodal and multidimensional scientific datasets, such as those output from chemical tomography experiments on the I18 microfocus scanning beamline at Diamond Light Source.

  19. A software application for the processing of students results | Ukem ...

    African Journals Online (AJOL)

A software application for the processing of students' results. ... In this work, a computer software application was developed to facilitate the automated processing of the results. ...

  20. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  1. Software Process Improvement: Where Is the Evidence?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Konopka, Claudia; Nellemann, Peter

    2015-01-01

Software process improvement (SPI) has been around for decades: frameworks are proposed, success factors are studied, and experiences have been reported. However, the sheer mass of concepts, approaches, and standards published over the years overwhelms practitioners as well as researchers. What is out there? Are there new emerging approaches? What are open issues? Still, we struggle to answer the question of what the current state of SPI and related research is. In this paper, we present initial results from a systematic mapping study to shed light on the field of SPI and to draw conclusions for future research directions. An analysis of 635 publications draws a big picture of SPI-related research of the past 25 years. Our study shows a high number of solution proposals, experience reports, and secondary studies, but only few theories. In particular, standard SPI models like CMMI and ISO...

  2. Automatic process control in anaerobic digestion technology: A critical review.

    Science.gov (United States)

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

Anaerobic digestion (AD) is a mature technology that relies upon the synergistic effort of a diverse group of microbial communities to metabolize diverse organic substrates. However, AD is highly sensitive to process disturbances, and thus it is advantageous to use online monitoring and process control techniques to operate the AD process efficiently. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While the complexity of the control strategy ranges from simple feedback control to advanced control systems, there is some debate on the implementation of advanced instrumentation and advanced control strategies; centralized AD plants could be the answer for applications in the progressive automatic control field. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.
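To make the simplest end of this control spectrum concrete, the following is a minimal sketch of a feedback (PI) controller that throttles a digester's feed rate when the measured pH drops. The setpoint, gains, nominal feed rate and all names are illustrative assumptions, not values from the review.

```python
# Hedged sketch: PI feedback control of an anaerobic digester's feed rate,
# using on-line pH as the monitored variable. All constants are invented.

def pi_feed_controller(ph_setpoint, nominal_feed, kp, ki, dt):
    integral = 0.0

    def step(ph):
        nonlocal integral
        error = ph_setpoint - ph          # positive when the digester acidifies
        integral += error * dt
        # Reduce feeding when pH falls below the setpoint (acidification risk).
        feed = nominal_feed - (kp * error + ki * integral)
        return max(0.0, feed)             # feed rate cannot go negative

    return step

controller = pi_feed_controller(ph_setpoint=7.0, nominal_feed=1.0,
                                kp=0.8, ki=0.05, dt=1.0)
print(controller(6.7))  # one control step from a single on-line pH reading
```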

  3. A proposed metamodel for the implementation of object oriented software through the automatic generation of source code

    Directory of Open Access Journals (Sweden)

    CARVALHO, J. S. C.

    2008-12-01

During the development of software, one of the most visible risks and perhaps the biggest implementation obstacle relates to time management. All delivery deadlines for software versions must be met, but this is not always possible, sometimes due to delays in coding. This paper presents a metamodel for software implementation, which will give rise to a development tool for automatic generation of source code, in order to make any development pattern transparent to the programmer, significantly reducing the time spent coding the artifacts that make up the software.

  4. Decision Support for Software Process Management Teams: An Intelligent Software Agent Approach

    National Research Council Canada - National Science Library

    Church, Lori

    2000-01-01

    ... to market, eliminate redundancy, and ease job stress. This thesis proposes a conceptual model for software process management decision support in the form of an intelligent software agent network...

  5. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

The A.S.T.I. is an automatic system for fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels, stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping. [fr]
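The two quantities the abstract names for each isolated image, the area and the weighted area, are simple to state in code. Below is a minimal sketch assuming a numpy array of 16 grey levels and a crude threshold-based isolation step; the threshold and the function name are our own, not part of the A.S.T.I. system.

```python
import numpy as np

# Sketch: area (pixel count) and grey-level-weighted area of an isolated
# object in an image with 16 grey levels (0..15). A real system would
# first segment a single cell or chromosome; here a threshold stands in.

def area_and_weighted_area(image, threshold=1):
    mask = image >= threshold               # crude isolation of the object
    area = int(mask.sum())                  # number of object pixels
    weighted_area = int(image[mask].sum())  # sum of grey levels over object
    return area, weighted_area

img = np.random.randint(0, 16, size=(64, 64))
print(area_and_weighted_area(img))
```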

  6. A software environment to execute automatic operational sequences on the ITER-FEAT DTP facility

    International Nuclear Information System (INIS)

    Fermani, G.; Zarfino, M.

    2001-01-01

The divertor test platform (DTP) maintenance operations are carried out by means of remote handling equipment (RHE), each unit dedicated to performing a set of specialised remote actions. Each RHE is controlled by an RHE control system (RHE-CS) and can be locally operated by an RHE operator using the local control panel (LOP). To perform the maintenance activity, the DTP operator coordinates the remote operations of all RHEs using the supervisory system (DTP-SS). Because the remote maintenance activities demand a high degree of parallelism, automation and cooperation between the various RHEs, the development of a software environment (OSAExE) with these characteristics was necessary. The OSAExE environment is applicable to any distributed and cooperating system that is modelled as a set of autonomous subsystems. Each remote maintenance sequence is modelled as a modified Petri-net diagram and subsequently 'compiled' in order to be automatically executed in the OSAExE environment. The OSAExE architecture makes it possible both to program 'event-driven' automatic sequences and to keep all the existing DTP-SS features unchanged.
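A sequence 'compiled' from a Petri-net diagram can be executed by repeatedly firing enabled transitions. The sketch below shows that token-firing mechanism with invented place and transition names; it illustrates the general idea only and is not the OSAExE implementation.

```python
# Sketch of Petri-net execution: a transition fires when all its input
# places hold enough tokens; firing consumes inputs and produces outputs.

marking = {"ready": 1, "arm_free": 1, "moved": 0, "done": 0}

transitions = {
    "move_arm": ({"ready": 1, "arm_free": 1}, {"moved": 1}),
    "finish":   ({"moved": 1}, {"done": 1, "arm_free": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    assert enabled(name), f"{name} is not enabled"
    for p, n in pre.items():
        marking[p] -= n                          # consume input tokens
    for p, n in post.items():
        marking[p] = marking.get(p, 0) + n       # produce output tokens

for t in ("move_arm", "finish"):                 # one automatic run
    fire(t)
print(marking)
```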

  7. REALIZATION OF TRAINING PROGRAMME ON THE BASIS OF LINGUISTIC DATABASE FOR AUTOMATIC TEXTS PROCESSING SYSTEM

    Directory of Open Access Journals (Sweden)

    M. A. Makarych

    2016-01-01

Due to the constant increase of electronic textual information, modern society needs automatic processing of natural language (NL). The main purpose of NL automatic text processing systems is to analyze and create texts and represent their content. The purpose of the paper is the development of the linguistic and software bases of an automatic system for processing English publicistic texts. The article discusses examples of different approaches to the creation of linguistic databases for processing systems. The author gives a detailed description of the basic building blocks of a new linguistic processor: lexical-semantic, syntactical and semantic-syntactical. The main advantage of the processor is the use of special semantic codes in the alphabetical dictionary. The semantic codes have been developed in accordance with a lexical-semantic classification. This helps to precisely define the semantic functions of keywords situated in parsing groups and allows the automatic system to avoid typical mistakes. The author also presents the realization of the developed linguistic database in the form of a training computer program.
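To illustrate the idea of an alphabetical dictionary whose entries carry semantic codes, the sketch below tags tokens with invented part-of-speech and semantic-code labels. The lexicon entries and code names are hypothetical and are not taken from the paper's classification.

```python
# Illustrative sketch only: dictionary lookup attaching semantic codes to
# tokens. Words, POS tags and codes are invented placeholders.

LEXICON = {
    "government": {"pos": "noun", "sem": "ORG"},
    "announce":   {"pos": "verb", "sem": "COMM"},
    "reform":     {"pos": "noun", "sem": "ACT"},
}

def tag(tokens):
    # Attach POS and semantic code to each known token; "?" for unknowns.
    return [(t, LEXICON.get(t.lower(), {"pos": "?", "sem": "?"}))
            for t in tokens]

print(tag("Government announce reform".split()))
```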

  8. Summary of the International Conference on Software and System Processes

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; O'Connor, Rory V.; Perry, Dewayne E.

    2016-01-01

The International Conference on Software and Systems Process (ICSSP), continuing the success of the Software Process Workshop (SPW), the Software Process Modeling and Simulation Workshop (ProSim) and the International Conference on Software Process (ICSP) conference series, has become the established premier event in the field of software and systems engineering processes. It provides a leading forum for the exchange of research outcomes and industrial best practices in process development from the software and systems disciplines. ICSSP 2016 was held in Austin, Texas, from 14-15 May 2016, co-located with the 38th International Conference on Software Engineering (ICSE). The theme of ICSSP 2016 was studying "Process(es) in Action", recognizing that the as-planned and as-practiced processes can be quite different in many ways, including their flows, their complexity and the evolving needs of stakeholders...

  9. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    Science.gov (United States)

    Kumlander, Deniss

The globalization of companies' operations and the competition between software vendors demand improved quality of delivered software at decreased overall cost. At the same time, these trends introduce many problems into the software development process, as they produce distributed organizations that break the co-location rule of modern software development methodologies. Here we propose a reformulation of the ambassador position, increasing its productivity in order to bridge the communication and workflow gap by managing the entire communication process rather than concentrating purely on the communication result.

  10. FUZZY LOGIC BASED SOFTWARE PROCESS IMPROVIZATION FRAMEWORK FOR INDIAN SMALL SCALE SOFTWARE ORGANIZATIONS

    OpenAIRE

    A.M.Kalpana; Dr.A.Ebenezer Jeyakumar

    2010-01-01

In this paper, the authors elaborate the results obtained after analyzing and assessing the software process activities in five small to medium sized Indian software companies. This work demonstrates a cost-effective framework for software process appraisal, specifically targeted at Indian software Small-to-Medium-sized Enterprises (SMEs). Improvisation deals with the unforeseen. It involves continual experimentation with new possibilities to create innovative and improved solutions outside cu...

  11. MEASUREMENT PROCESS OF SOFTWARE DEVELOPMENT PROJECTS FOR SUPPORTING STRATEGIC BUSINESS OBJECTIVES IN SOFTWARE DEVELOPING COMPANIES

    Directory of Open Access Journals (Sweden)

    Sandra Lais Pedroso

    2013-08-01

Software developing companies work in a competitive market and are often challenged to make business decisions with impact on competitiveness. Models assessing the maturity of software development process quality, such as CMMI and MPS-BR, comprise process measurement systems (PMS). However, these models are not necessarily suitable for supporting business decisions, nor for achieving strategic goals. The objective of this work is to analyze how the PMS of software development projects could support business strategies in software developing companies. Results from this work show that PMS results from maturity models for software processes can be suited to help evaluate operating capabilities and support strategic business decisions.

  12. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

In this presentation the automated material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; the tools of integral material and radioactivity flow are among the basic tools for computer optimisation of decommissioning waste processing; all the calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega represents an open modular system, which can be improved; and improvement of the module for optimisation of decommissioning waste processing will be performed in the frame of improvement of material procedures and scenarios.

  13. Automatic and Systematic Atomistic Simulations in the MedeA® Software Environment: Application to EU-REACH

    Directory of Open Access Journals (Sweden)

    Rozanska Xavier

    2015-03-01

This work demonstrates the systematic prediction of thermodynamic properties for batches of thousands of molecules using automated procedures. This is accomplished with newly developed tools and functions within the Material Exploration and Design Analysis (MedeA®) software environment, which handle the automatic execution of sequences of tasks for large numbers of molecules, including the creation of 3D molecular models from 1D representations, systematic exploration of possible conformers for each molecule, the creation and submission of computational tasks for property calculations on parallel computers, and the post-processing for comparison with available experimental properties. After describing the different MedeA® functionalities and methods that make it easy to perform such large numbers of computations, we illustrate the strength and power of the approach with selected examples from molecular mechanics and quantum chemical simulations. Specifically, comparisons of thermochemical data with quantum-based heat capacities and standard energies of formation have been obtained for more than 2,000 compounds, yielding average deviations from experiment of less than 4% with respect to the Design Institute for Physical PRoperties (DIPPR) database. The automatic calculation of the density of molecular fluids is demonstrated for 192 systems. The relaxation to minimum-energy structures and the calculation of vibrational frequencies of 5,869 molecules are performed automatically using a semi-empirical quantum mechanical approach with a success rate of 99.9%. The present approach scales to large numbers of molecules, thus opening exciting possibilities with the advent of exascale computing.
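The batch workflow described (1D representation to 3D model, conformer search, property calculation, comparison with experiment) can be pictured as a simple loop. In the sketch below every helper is a stub standing in for a real computational engine; none of the names belong to the MedeA® API.

```python
# Hedged sketch of an automated batch property pipeline. All helpers are
# stubs; real code would call structure builders and QM/forcefield engines.

def embed_3d(smiles):                  # stub: build 3-D coordinates from 1-D
    return {"smiles": smiles}

def lowest_energy_conformer(mol):      # stub: systematic conformer search
    return mol

def compute_heat_capacity(conf):       # stub: property job on a cluster
    return 100.0 + 0.5 * len(conf["smiles"])

def run_batch(smiles_list, reference):
    results = {}
    for smi in smiles_list:
        prop = compute_heat_capacity(lowest_energy_conformer(embed_3d(smi)))
        ref = reference.get(smi)
        dev = abs(prop - ref) / ref if ref else None   # deviation vs. experiment
        results[smi] = (prop, dev)
    return results

print(run_batch(["CCO", "CCCC"], {"CCO": 112.3}))
```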

  14. Software process improvement: controlling developers, managers or users?

    DEFF Research Database (Denmark)

    Nørbjerg, Jacob

    1999-01-01

The paper discusses how the latest trend in the management of software development, software process improvement (SPI), may affect user-developer relations. At the outset, SPI concerns the "internal workings" of software organisations, but it may also be interpreted as one way to give the developer organisation more control over the development process and the relations with the user organization.

  15. Toward objective software process information : experiences from a case study

    NARCIS (Netherlands)

    Samalikova, J.; Kusters, R.J.; Trienekens, J.J.M.; Weijters, A.J.M.M.; Siemons, P.

    2011-01-01

    A critical problem in software development is the monitoring, control and improvement in the processes of software developers. Software processes are often not explicitly modeled, and manuals to support the development work contain abstract guidelines and procedures. Consequently, there are huge

  16. Software and hardware platform for testing of Automatic Generation Control algorithms

    Directory of Open Access Journals (Sweden)

    Vasiliev Alexey

    2017-01-01

Development and implementation of new Automatic Generation Control (AGC) algorithms requires testing them on a model that adequately simulates the primary energetic, information and control processes. In this article an implementation of a test platform based on HRTSim (Hybrid Real Time Simulator) and SCADA CK-2007 (which is widely used by the System Operator of Russia) is proposed. Testing AGC algorithms on a test platform based on the same SCADA system that is used in operation makes it possible to exclude errors associated with the transfer of AGC algorithms and settings from the test platform to a real power system. A power system including relay protection, automatic control systems and emergency control automatics can be accurately simulated on HRTSim. Besides the information commonly used by conventional AGC systems, HRTSim is able to provide a resemblance of Phasor Measurement Unit (PMU) measurements (information about rotor angles, magnitudes and phase angles of currents and voltages, etc.). The additional information significantly expands the number of possible AGC algorithms, so the test platform is useful in modern AGC system development. The obtained test results confirm that the proposed system is applicable for the tasks mentioned above.

  17. Data processing software for purex plant process control laboratory

    International Nuclear Information System (INIS)

    Kansara, V.P.; Achuthan, P.V.; Sridhar, S.; Ramanujam, A.; Dhumwad, R.K.

    1990-01-01

A software package has been developed at the Fuel Reprocessing Division, Trombay to meet the data processing needs of the control laboratory of a reprocessing plant. During normal plant operations the contents of over one hundred process tanks have to be sampled and analysed for regular monitoring. The software was developed in order to speed up the computation and reporting of results, as well as to obtain process performance data over a period of time. The package has been successfully demonstrated and implemented at the Plutonium Plant, Trombay, and has been in continuous use since May 1987 with highly satisfactory performance. The software is a totally menu-driven package which can be used by the laboratory analysts after a few hours of training. Its features include data validation involving source tank identification, the nature of the sample, the range of expected results, any duplication in sample numbering, etc. Audible indication of deviations from the expected input or output values is given, with an option to override in the case of abnormal samples. The progress of analysis can be obtained for any given sample at any time. Incorporated in the software is a help menu for quick reference to the analytical protocol to be followed for a given tank/method. The computations for the determinations are carried out after obtaining input values on a screen form. The results can be displayed on the monitor or obtained as a hard copy in any desired format. (author). 17 figs., 2 refs
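The validation features described (source tank identification, expected result ranges, duplicate sample numbers) can be sketched as a single check function. The tank names, analysis types and ranges below are invented placeholders, not the plant's actual protocol.

```python
# Sketch of laboratory data validation before a result is accepted.
# Tanks, analyses and ranges are illustrative placeholders.

EXPECTED = {
    ("TK-101", "uranium_gpl"): (0.0, 50.0),
    ("TK-102", "acidity_M"):   (0.5, 4.0),
}

def validate(tank, analysis, value, seen_ids, sample_id):
    problems = []
    if sample_id in seen_ids:
        problems.append("duplicate sample number")
    rng = EXPECTED.get((tank, analysis))
    if rng is None:
        problems.append("unknown tank/analysis combination")
    elif not (rng[0] <= value <= rng[1]):
        # The analyst may override for genuinely abnormal samples.
        problems.append(f"value outside expected range {rng}")
    return problems

print(validate("TK-101", "uranium_gpl", 75.0, {"S-001"}, "S-002"))
```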

  18. Image quality dependence on image processing software in ...

    African Journals Online (AJOL)

Image quality dependence on image processing software in computed radiography. ... Agfa CR readers use MUSICA software, and an upgrade with significantly different image ...

  19. Managing Change in Software Process Improvement

    DEFF Research Database (Denmark)

    Mathiassen, Lars; Ngwenyama, Ojelanki K.; Aaen, Ivan

    2005-01-01

    When software managers initiate SPI, most are ill prepared for the scale and complexity of the organizational change involved. Although they typically know how to deal with large software projects, few managers have sufficient experience with projects that transform organizations. To succeed with...

  20. Software engineering technology transfer: Understanding the process

    Science.gov (United States)

    Zelkowitz, Marvin V.

    1993-01-01

Technology transfer is of crucial concern to both government and industry today. In this report, the mechanisms developed by NASA to transfer technology are explored and the actual mechanisms used to transfer software development technologies are investigated. The time, cost, and effectiveness of software engineering technology transfer are reported.

  1. Software process improvement in a research environment

    NARCIS (Netherlands)

    Velden, van der M.J.; Hendriks, P.R.H.; Udink ten Cate, A.J.

    1995-01-01

    Research organizations pay much attention to the quality of their work, but not always to the quality of the software they produce within research projects. This is not a healthy situation since research organizations are becoming more and more dependent on software development. This paper describes

  2. Aligning Requirements-Driven Software Processes with IT Governance

    OpenAIRE

    Nguyen Huynh Anh, Vu; Kolp, Manuel; Heng, Samedi; Wautelet, Yves

    2017-01-01

Requirements Engineering is closely intertwined with Information Technology (IT) Governance. Aligning IT Governance principles with requirements-driven software processes allows them to propose governance and management rules for software development to cope with stakeholders' requirements and expectations. Typically, the goal of IT Governance in software engineering is to ensure that the results of a software organization's business processes meet the strategic requirements of the organization...

  3. SOFTWARE PROCESS IMPROVEMENT: AWARENESS, USE, AND BENEFITS IN CANADIAN SOFTWARE DEVELOPMENT FIRMS

    OpenAIRE

    CHEVERS, DELROY

    2017-01-01

Since 1982, the software development community has been concerned with the delivery of quality systems. Software process improvement (SPI) is an initiative to avoid the delivery of low-quality systems. However, the awareness and adoption of SPI is low. Thus, this study examines the rate of awareness, use, and benefits of SPI initiatives in Canadian software development firms. Using SPSS as the analytical tool, this study found that 59% of Canadian software development firms are aware...

  4. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  5. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

This paper describes the Verification and Validation (V and V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of the DRPS V and V process are the preparation of software planning documentation, verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and code, and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. The SRS V and V activities for DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. Likewise, the SDS V and V activities for DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated software test plan, software safety analysis, and software configuration management. The code V and V activities for DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis uses the Hazard and Operability (HAZOP) method at the SRS phase, HAZOP and Fault Tree Analysis (FTA) at the SDS phase, and FTA at the implementation phase. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V...

  6. Experimental Study for Automatic Colony Counting System Based on Image Processing

    Science.gov (United States)

    Fang, Junlong; Li, Wenzhe; Wang, Guoxin

At present, colony counting in many experiments is performed manually, and it is difficult to execute the method quickly and accurately. A new automatic colony counting system was therefore developed. Making use of image-processing technology, a study was made of the feasibility of objectively distinguishing white bacterial colonies from clear plates according to RGB color theory. An optimal chromatic value was obtained from extensive experiments on the distribution of chromatic values. It has been shown that the method greatly improves the accuracy and efficiency of colony counting and that the counting result is not affected by the inoculation method or by the shape or size of the colonies. This reveals that automatic detection of colony quantity using image-processing technology can be an effective approach.
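A minimal sketch of this counting approach: threshold a chromatic value to separate white colonies from the clear plate, then count connected components. The threshold and minimum-size values are illustrative, not the optimum reported in the study.

```python
import numpy as np
from scipy import ndimage

# Sketch: colony counting by chromatic thresholding plus connected-
# component labelling. Threshold and speck-size cut-off are invented.

def count_colonies(rgb, threshold=180, min_pixels=20):
    """rgb: H x W x 3 uint8 image of a plate with white colonies."""
    brightness = rgb.mean(axis=2)            # simple per-pixel chromatic value
    mask = brightness > threshold            # white colonies vs. clear plate
    labels, n = ndimage.label(mask)          # connected components
    sizes = np.asarray(ndimage.sum(mask, labels, range(1, n + 1)))
    return int((sizes >= min_pixels).sum())  # drop specks below the cut-off

plate = (np.random.rand(200, 200, 3) * 255).astype(np.uint8)
print(count_colonies(plate))
```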

  7. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the number of central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two ... Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions, to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were...

  8. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Schramm, Joachim; Dohrmann, Patrick; Kuhrmann, Marco

    2015-01-01

Context: Software processes evolve over time, and several approaches have been proposed to support the required flexibility. Yet, little is known about whether these approaches sufficiently support the development of large software processes. A software process line helps to systematically develop and manage families of processes and, as part of this, variability operations provide means to modify and reuse pre-defined process assets. Objective: Our goal is to evaluate the feasibility of variability operations to support the development of flexible software process lines. Method: We conducted a longitudinal...

  9. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    International Nuclear Information System (INIS)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-01-01

The aim was to validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the head and neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined all the volumes of interest (VOIs) on the pCT and rCT. We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contours on the repeated CT. All the VOIs obtained with automatic contouring (AC) were subsequently corrected manually. We recorded the time needed for: 1) ex novo ROI definition on the rCT; 2) generation of the AC by the three software solutions; 3) manual correction of the AC. To compare the quality of the volumes obtained automatically by the software and corrected manually with those drawn from scratch on the rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to manual contouring from scratch, is statistically significant and similar for all three software solutions. The time saved for each site is as follows: about an hour for head and neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index after manual correction was obtained with: A (contours for prostate), A and M (contours for head and neck), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.
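Two of the overlap indexes used in the study, the DICE coefficient and sensitivity, are straightforward to compute from binary masks. A minimal numpy sketch with illustrative masks:

```python
import numpy as np

# Sketch: DICE overlap and sensitivity between an automatic contour and
# its manually corrected reference, both given as boolean masks.

def dice(auto, ref):
    inter = np.logical_and(auto, ref).sum()
    return 2.0 * inter / (auto.sum() + ref.sum())

def sensitivity(auto, ref):
    inter = np.logical_and(auto, ref).sum()
    return inter / ref.sum()        # fraction of the reference covered

a = np.zeros((64, 64), bool); a[10:40, 10:40] = True   # automatic contour
r = np.zeros((64, 64), bool); r[15:45, 12:42] = True   # reference contour
print(round(dice(a, r), 3), round(sensitivity(a, r), 3))
```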

  10. Active Learning for Automatic Audio Processing of Unwritten Languages (ALAPUL)

    Science.gov (United States)

    2016-07-01

AFRL-RH-WP-TR-2016-0074: Active Learning for Automatic Audio Processing of Unwritten Languages (ALAPUL). Authors: Dimitra Vergyri, Andreas Kathol, Wen Wang, Chris Bartels, Julian VanHout; contract FA8650-15-C-9101. The work develops a feature transform through deep auto-encoders for better phone recognition performance and targets iterative learning to improve the system.

  11. Costs and Benefits of Software Process Improvement

    National Research Council Canada - National Science Library

    Prenger, Karen

    1997-01-01

    There are numerous problems in DoD software development projects. The ad hoc practices used in the military services and in industry have resulted in unpredictable costs and schedules and low-quality products...

  12. Software Process Improvement: Supporting the Linking of the Software and the Business Strategies

    Science.gov (United States)

    Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti

The market is becoming more and more competitive, many products and services depend on software, and software is one of the most important assets influencing organizations' businesses. In this context, companies must deal carefully with software, whether developing or acquiring it. One perspective that can help organizations take advantage of software, effectively supporting the business, is to invest in the organization's software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. The approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of its use and the results.

  13. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users' opinions regarding the lack of usability in their software systems. The 'cost obstacle' refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some...

  14. Assessing the Army’s Software Patch Management Process

    Science.gov (United States)

    2016-03-04

Assessing the Army's Software Patch Management Process, Benjamin Alan Pryor, March 4, 2016. Vulnerabilities may be disclosed to the software maker or to antivirus vendors (Zetter, 2014); fixing such a vulnerability within the zero-day period requires teamwork across multiple... Topics include Commercial-Off-the-Shelf Software.

  15. Experience in automatic processing of 340.000 images from ITEF 3-m magnetic spectrometer

    International Nuclear Information System (INIS)

    Dzhelyadin, R.I.; Dukhovskoj, I.A.; Ivanov, L.V.; Kishkurno, V.V.; Krutenkova, A.P.; Kulikov, V.V.; Lyulevich, V.I.; Polikarpov, V.M.; Radkevich, I.A.; Fedorets, V.S.; Fedotov, O.P.

    1974-01-01

A number of conclusions were made regarding the automatic processing of 340,000 pictures (1,020,000 frames) taken with a three-meter magnetic spectrometer with spark chambers. Possibilities for time optimization of automatic processing programs are discussed. The results of processing a series of photographs were analysed to compare the parameters of automatic and semi-automatic processing. Some problems relating to the organization and technology of picture processing are also outlined. [ru]

  16. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  17. Software Defined Common Processing System (SDCPS), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Coherent Logix, Incorporated (CLX) proposes the development of a Software Defined Common Processing System (SDCPS) that leverages the inherent advantages of an...

  18. SOFTWARE PROCESS ASSESSMENT AND IMPROVEMENT USING MULTICRITERIA DECISION AIDING - CONSTRUCTIVIST

    Directory of Open Access Journals (Sweden)

    Leonardo Ensslin

    2012-12-01

Software process improvement and software process assessment have received special attention since the 1980s. Some models have been created, but these models rest on a normative approach, where the decision-maker's participation in a software organization is limited to understanding which process is more relevant to each organization. The proposal of this work is to present MCDA-C as a constructivist methodology for software process improvement and assessment. The methodology makes it possible to visualize the criteria that must be taken into account, according to the decision-makers' values, in the process improvement actions, making it possible to rank actions in the light of specific organizational needs. This process helped the manager of the company studied to focus on and prioritize process improvement actions. This paper offers an empirical understanding of the application of performance evaluation to software process improvement and identifies tools complementary to the normative models presented today.

  19. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper covers methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  20. Agile development with software process mining

    NARCIS (Netherlands)

    Rubin, V.A.; Lomazova, I.A.; Aalst, van der W.M.P.

    2014-01-01

    Modern companies continue investing more and more in the creation, maintenance and change of software systems, but the proper specification and design of such systems continues to be a challenge. The majority of current approaches either ignore real user and system runtime behavior or consider it

  1. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

Functional MRI resting-state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open-source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.
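One step such a pipeline must perform is detecting cardiac peaks and converting each acquisition time to a cardiac phase usable as a noise regressor. The sketch below illustrates the general idea with invented thresholds and function names; it is not PhysioNoise's actual algorithm.

```python
import numpy as np

# Sketch: peak detection on a pulse waveform and conversion of a time
# point to cardiac phase. Thresholds and names are illustrative.

def detect_peaks(signal, fs, min_gap_s=0.4):
    thr = signal.mean() + signal.std()
    gap, peaks, last = int(min_gap_s * fs), [], -10**9
    for i in range(1, len(signal) - 1):
        if (signal[i] > thr and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1] and i - last >= gap):
            peaks.append(i)
            last = i
    return np.array(peaks) / fs                    # peak times in seconds

def cardiac_phase(t, peak_times):
    k = np.searchsorted(peak_times, t) - 1         # beat containing time t
    t0, t1 = peak_times[k], peak_times[k + 1]
    return 2 * np.pi * (t - t0) / (t1 - t0)        # phase in [0, 2*pi)

fs = 100.0
time = np.arange(0, 10, 1 / fs)
pulse = np.sin(2 * np.pi * 1.2 * time) ** 21       # sharp ~1.2 Hz beats
print(cardiac_phase(5.0, detect_peaks(pulse, fs))) # phase of a scan at t = 5 s
```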

  2. Secure Software Configuration Management Processes for nuclear safety software development environment

    International Nuclear Information System (INIS)

    Chou, I.-Hsin

    2011-01-01

Highlights: → The proposed method emphasizes platform-independent security processes. → A hybrid process based on nuclear SCM and security regulations is proposed. → Detailed descriptions and a Process Flow Diagram are useful for software developers. - Abstract: The main difference between nuclear and generic software is that the risk factor is infinitely greater in nuclear software: if there is a malfunction in the safety system, it can result in significant economic loss, physical damage or threat to human life. However, secure software development environments have often been ignored in the nuclear industry. In response to the terrorist attacks on September 11, 2001, the US Nuclear Regulatory Commission (USNRC) revised the Regulatory Guide (RG 1.152-2006) 'Criteria for use of computers in safety systems of nuclear power plants' to provide specific security guidance throughout the software development life cycle. Software Configuration Management (SCM) is an essential discipline in the software development environment. SCM involves identifying configuration items, controlling changes to those items, and maintaining their integrity and traceability. For securing nuclear safety software, this paper proposes Secure SCM Processes (S2CMP), which infuse regulatory security requirements into the proposed SCM processes. Furthermore, a Process Flow Diagram (PFD) is adopted to describe S2CMP, which is intended to enhance communication between regulators and developers.

  3. 4th International Conference on Software Process Improvement

    CERN Document Server

    Muñoz, Mirna; Rocha, Álvaro; Calvo-Manzano, Jose

    2016-01-01

This book contains a selection of papers from the 2015 International Conference on Software Process Improvement (CIMPS'15), held between the 28th and 30th of October in Mazatlán, Sinaloa, México. CIMPS'15 is a global forum for researchers and practitioners to present and discuss the most recent innovations, trends, results, experiences and concerns in the several perspectives of software engineering, with a clear relationship to, but not limited to, software processes, security in information and communication technology, and the big data field. The main topics covered are: organizational models, standards and methodologies, knowledge management, software systems, applications and tools, information and communication technologies, and processes in non-software domains (mining, automotive, aerospace, business, health care, manufacturing, etc.) with a demonstrated relationship to software process challenges.

  4. Compiling software for a hierarchical distributed processing system

    Science.gov (United States)

    Archer, Charles J; Blocksome, Michael A; Ratterman, Joseph D; Smith, Brian E

    2013-12-31

    Compiling software for a hierarchical distributed processing system including providing to one or more compiling nodes software to be compiled, wherein at least a portion of the software to be compiled is to be executed by one or more nodes; compiling, by the compiling node, the software; maintaining, by the compiling node, any compiled software to be executed on the compiling node; selecting, by the compiling node, one or more nodes in a next tier of the hierarchy of the distributed processing system in dependence upon whether any compiled software is for the selected node or the selected node's descendents; sending to the selected node only the compiled software to be executed by the selected node or selected node's descendent.

  5. Improving design processes through structured reflection : a prototype software tool

    OpenAIRE

    Reymen, I.M.M.J.; Melby, E.

    2001-01-01

    A prototype software tool facilitating the use of a design method supporting structured reflection on design processes is presented. The prototype, called Echo, has been developed to explore the benefits of using a software system to facilitate the use of the design method. Both the prototype software tool and the design method are developed as part of the Ph.D. project of Isabelle Reymen. The goal of the design method is supporting designers with reflection on design processes in a systemati...

  6. Practical support for Lean Six Sigma software process definition using IEEE software engineering standards

    CERN Document Server

    Land, Susan K; Walz, John W

    2012-01-01

    Practical Support for Lean Six Sigma Software Process Definition: Using IEEE Software Engineering Standards addresses the task of meeting the specific documentation requirements in support of Lean Six Sigma. This book provides a set of templates supporting the documentation required for basic software project control and management and covers the integration of these templates for their entire product development life cycle. Find detailed documentation guidance in the form of organizational policy descriptions, integrated set of deployable document templates, artifacts required in suppo

  7. Integrating interface slicing into software engineering processes

    Science.gov (United States)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  8. On the Process of Software Design

    DEFF Research Database (Denmark)

    Hertzum, Morten

    2008-01-01

Software design is a complex undertaking. This study delineates and analyses three major constituents of this complexity: the formative element entailed in articulating and reaching closure on a design, the progress imperative entailed in making estimates and tracking status, and the collaboration... disintegrate, but the activities required to avoid this may be costly in terms of scarce resources such as the time of key designers...

  9. Software Development Process Improvement in Datacom Platform

    OpenAIRE

    Trabelsi, Walid

    2008-01-01

Master's thesis in Information and Communication Technology 2008, University of Agder, Grimstad. Ericsson Mobile Platform (EMP) is responsible for the development of a software platform and also, to some extent, responsible for related hardware parts. EMP is developing the data communication parts of the platform, which are used by EMP customers. The platform development is done in large development programs, and each program spans quite a long time period. However, as we see eve...

  10. Software Process Improvement Journey: IBM Australia Application Management Services

    Science.gov (United States)

    2005-03-01

Carnegie Mellon Software Engineering Institute: Software Process Improvement Journey: IBM Australia Application Management Services, Robyn Nichols. Topics include Client Relationship Management (CRM) processes, specifically Solution Design and Solution Delivery, and Worldwide Project Management, covering complex systems life-cycle management, rapid solutions development, custom development, package selection and implementation, and maintenance.

  11. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, B.

    1998-01-01

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability of reasoning about the quality of processes and

  12. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Demeyer, S.; Bosch, H.G.P.; Tekinerdogan, B.

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability of reasoning about the quality of processes and

  13. Digital image processing software system using an array processor

    International Nuclear Information System (INIS)

    Sherwood, R.J.; Portnoff, M.R.; Journeay, C.H.; Twogood, R.E.

    1981-01-01

    A versatile array processor-based system for general-purpose image processing was developed. At the heart of this system is an extensive, flexible software package that incorporates the array processor for effective interactive image processing. The software system is described in detail, and its application to a diverse set of applications at LLNL is briefly discussed. 4 figures, 1 table

  14. Software for Demonstration of Features of Chain Polymerization Processes

    Science.gov (United States)

    Sosnowski, Stanislaw

    2013-01-01

Free software for the demonstration of the features of homo- and copolymerization processes (free radical, controlled radical, and living) is described. The software is based on Monte Carlo algorithms and offers insight into the kinetics, molecular weight distribution, and microstructure of the macromolecules formed in those processes. It also…
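The core of such a Monte Carlo treatment fits in a few lines: at each event a growing chain either propagates or terminates, and repeating this yields a chain-length distribution. A hedged sketch with an illustrative propagation probability, not the software's actual kinetic model:

```python
import random

# Sketch: Monte Carlo chain growth. Each step the radical propagates with
# probability p and terminates otherwise; p is an invented constant.

def grow_chain(p_propagate=0.995):
    length = 1
    while random.random() < p_propagate:
        length += 1
    return length

lengths = [grow_chain() for _ in range(10_000)]
print(sum(lengths) / len(lengths))   # number-average length, about 1/(1 - p) = 200
```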

  15. OPERA, an automatic PSF reconstruction software for Shack-Hartmann AO systems: application to Altair

    Science.gov (United States)

    Jolissaint, Laurent; Veran, Jean-Pierre; Marino, Jose

    2004-10-01

When doing high angular resolution imaging with adaptive optics (AO), it is of crucial importance to have an accurate knowledge of the point spread function (PSF) associated with each observation. Applications are numerous: image contrast enhancement by deconvolution, improved photometry and astrometry, as well as real-time AO performance evaluation. In this paper, we present our work on automatic PSF reconstruction based on control loop data acquired simultaneously with the observation. This problem has already been solved for curvature AO systems. To adapt the method to another type of WFS, a specific analytical noise propagation model must be established. For the Shack-Hartmann WFS, we are able to derive a very accurate estimate of the noise on each slope measurement, based on the covariances of the WFS CCD pixel values in the corresponding sub-aperture. These covariances can be either derived off-line from telemetry data or calculated by the AO computer during the acquisition. We present improved methods to determine: 1) r0 from the DM drive commands, including an estimation of the outer scale L0; and 2) the contribution of the high spatial frequency component of the turbulent phase, which is not corrected by the AO system and is scaled by r0. This new method has been implemented in an IDL-based software package called OPERA (Performance of Adaptive Optics). We have tested OPERA on Altair, the recently commissioned Gemini North AO system, and present our preliminary results. We also summarize the AO data required to run OPERA on any other AO system.

  16. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences or seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  17. On the Role of Software Quality Management in Software Process Improvement

    DEFF Research Database (Denmark)

    Wiedemann Jacobsen, Jan; Kuhrmann, Marco; Münch, Jürgen

    2016-01-01

Software Process Improvement (SPI) programs have been implemented, inter alia, to improve the quality and speed of software development. SPI addresses many aspects, ranging from individual developer skills to entire organizations. It comprises, for instance, the optimization of specific activities... and a strong focus on custom review, testing, and documentation techniques, whereas a set of five selected improvement measures is almost equally addressed.

  18. Rotor assembly and method for automatically processing liquids

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1992-12-22

A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body, rotating about an axis, that contains a network of chambers within which various processing steps are performed upon the sample and diluent, and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body under the influence of a magnetic field generated adjacent to the transfer mechanism and movable along the rotor body. The assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps, such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.

  19. Automatic attenuator upgrade for a Siemens D500 diffractometer via a generic software library to overcome hardware limitations

    International Nuclear Information System (INIS)

    Mayr, Sina; Randau, Christian; Kreuzpaintner, Wolfgang

    2017-01-01

    A proxy software was developed which allows the Siemens D500 x-ray diffractometer to be upgraded with add-ons that have never been officially available for it. For demonstration, we designed and integrated an automatic attenuator option and demonstrated the feasibility of our upgrade path by typical comparative x-ray measurements, which would usually saturate the x-ray detector, if no attenuator is used.

  20. Automatic attenuator upgrade for a Siemens D500 diffractometer via a generic software library to overcome hardware limitations

    Energy Technology Data Exchange (ETDEWEB)

    Mayr, Sina, E-mail: sina.mayr@frm2.tum.de [Technische Universität München, Physik-Department E21, James-Franck-Str. 1, 85748 Garching (Germany); Randau, Christian [Georg-August-Universität Göttingen, Fakultät für Geowissenschaften und Geologie, Abteilung Isotopengeologie Außenstelle MLZ (FRM II), Lichtenbergstr. 1, 85748 Garching (Germany); Kreuzpaintner, Wolfgang [Technische Universität München, Physik-Department E21, James-Franck-Str. 1, 85748 Garching (Germany)

    2017-05-21

    A proxy software was developed which allows the Siemens D500 x-ray diffractometer to be upgraded with add-ons that have never been officially available for it. For demonstration, we designed and integrated an automatic attenuator option and demonstrated the feasibility of our upgrade path by typical comparative x-ray measurements, which would usually saturate the x-ray detector, if no attenuator is used.

  1. Automatic and controlled processing and the Broad Autism Phenotype.

    Science.gov (United States)

    Camodeca, Amy; Voelker, Sylvia

    2016-01-30

    Research related to verbal fluency in the Broad Autism Phenotype (BAP) is limited and dated, but generally suggests intact abilities in the context of weaknesses in other areas of executive function (Hughes et al., 1999; Wong et al., 2006; Delorme et al., 2007). Controlled processing, the generation of search strategies after initial, automated responses are exhausted (Spat, 2013), has yet to be investigated in the BAP, and may be evidenced in verbal fluency tasks. One hundred twenty-nine participants completed the Delis-Kaplan Executive Function System Verbal Fluency test (D-KEFS; Delis et al., 2001) and the Broad Autism Phenotype Questionnaire (BAPQ; Hurley et al., 2007). The BAP group (n=53) produced significantly fewer total words during the 2nd 15" interval compared to the Non-BAP (n=76) group. Partial correlations indicated similar relations between verbal fluency variables for each group. Regression analyses predicting 2nd 15" interval scores suggested differentiation between controlled and automatic processing skills in both groups. Results suggest adequate automatic processing, but slowed development of controlled processing strategies in the BAP, and provide evidence for similar underlying cognitive constructs for both groups. Controlled processing was predictive of Block Design score for Non-BAP participants, and was predictive of Pragmatic Language score on the BAPQ for BAP participants. These results are similar to past research related to strengths and weaknesses in the BAP, respectively, and suggest that controlled processing strategy use may be required in instances of weak lower-level skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Advanced information processing system: Input/output network management software

    Science.gov (United States)

    Nagle, Gail; Alger, Linda; Kemp, Alexander

    1988-01-01

    The purpose of this document is to provide the software requirements and specifications for the Input/Output Network Management Services for the Advanced Information Processing System. This introduction and overview section is provided to briefly outline the overall architecture and software requirements of the AIPS system before discussing the details of the design requirements and specifications of the AIPS I/O Network Management software. A brief overview of the AIPS architecture is followed by a more detailed description of the network architecture.

  3. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  4. Unit Testing Using Design by Contract and Equivalence Partitions, Extreme Programming and Agile Processes in Software Engineering

    DEFF Research Database (Denmark)

    Madsen, Per

    2003-01-01

    Extreme Programming [1] and in particular the idea of Unit Testing can improve the quality of the testing process. But still programmers need to do a lot of tiresome manual work writing test cases. If the programmers could get some automatic tool support enforcing the quality of test cases, then the overall quality of the software would improve significantly.
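
    To illustrate the combination of unit testing, design by contract and equivalence partitions named in the title (a toy example, not taken from the paper), the contract is written as pre- and postcondition assertions and one test case is drawn from each input partition, plus the partition boundaries:

        def percent_discount(age: int) -> int:
            """Return a ticket discount in percent for a given age."""
            assert 0 <= age <= 130, "precondition: age must be plausible"
            if age < 18:
                result = 50
            elif age >= 65:
                result = 30
            else:
                result = 0
            assert result in (0, 30, 50), "postcondition: known discount values only"
            return result

        # Equivalence partitions: child [0, 18), adult [18, 65), senior [65, 130].
        # One representative per partition, plus the boundary values.
        for age, expected in [(10, 50), (17, 50), (18, 0), (64, 0), (65, 30), (90, 30)]:
            assert percent_discount(age) == expected, (age, expected)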

  5. Process based unification for multi-model software process improvement

    NARCIS (Netherlands)

    Kelemen, Z.D.

    2013-01-01

    Many different quality approaches are available in the software industry. Some of the approaches, such as ISO 9001, are not software specific, i.e. they define general requirements for an organization and they can be used at any company. Others, such as Automotive SPICE, have been derived from a

  6. Improving design processes through structured reflection : a prototype software tool

    NARCIS (Netherlands)

    Reymen, I.M.M.J.; Melby, E.

    2001-01-01

    A prototype software tool facilitating the use of a design method supporting structured reflection on design processes is presented. The prototype, called Echo, has been developed to explore the benefits of using a software system to facilitate the use of the design method. Both the prototype

  7. Software Process Automation: Experiences from the Trenches.

    Science.gov (United States)

    1996-07-01

    Only fragments of the report's case-study tables survive in this record: organizations used the Weaver framework to integrate a problem database and other tools, with processes built around WordPerfect, All-in-One, Oracle, FrameMaker and configuration management (CM) systems to handle change requests and problem reports. Tools mentioned include Autoplan (a project management tool), FrameMaker (a document processing system), Worldview (a document viewer), Cadre TeamWork, a requirements traceability tool, a homegrown scheduling tool, and a homegrown tool integrator.

  8. Improving the Software Development Process Using Testability Research

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.

    1991-01-01

    Software testability is the tendency of code to reveal existing faults during random testing. This paper proposes to take software testability predictions into account throughout the development process. These predictions can be made from formal specifications, design documents, and the code itself. The insight provided by software testability is valuable during design, coding, testing, and quality assurance. We further believe that software testability analysis can play a crucial role in quantifying the likelihood that faults are not hiding when testing of the current version produces no failures.

  9. Softwareland Chronicles: A Software Development Meta-Process Proposal

    Directory of Open Access Journals (Sweden)

    Bolanos Sandro

    2016-05-01

    Full Text Available This paper presents the software development meta-process (SD-MP) as a proposal to set up software projects. Within this proposal we offer conceptual elements that help solve the war of methodologies and processes in favor of an integrating viewpoint, where the main flaws associated with conventional and agile approaches are removed. Our newly developed software platform to support the meta-process is also presented, together with three case studies involving projects currently in progress, where the framework proposed in SD-MP has been applied.

  10. Development of a visualized software for tokamak experiment data processing

    International Nuclear Information System (INIS)

    Cao Jianyong; Ding Xuantong; Luo Cuiwen

    2004-01-01

    Using VBA programming in Microsoft Excel, the authors have developed post-processing software for tokamak experimental data. Standard-format data from the HL-1M and HL-2A tokamaks can be read, displayed in Excel, and transmitted directly into the MATLAB workspace for displaying pictures in MATLAB with the software. The authors have also developed data post-processing software in the MATLAB environment, which can read standard-format data, display pictures, supply a visual graphical user interface, and provide some advanced signal-processing capabilities.

  11. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in catchments, especially in the modeling of peri-urban catchments. Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow the representation of these elements in an explicit way, preserving natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of an HRU and the river network, or the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing implemented in the open-source GRASS-GIS software, for which several Python scripts were written and some already available algorithms, such as the Triangle software, were used. First, some scripts were developed to improve the topology of the various elements, such as snapping the river network to the closest contours. Where data are derived by remote sensing, such as vegetation areas, their perimeters have many right angles, which were smoothed. Second, the algorithms more specifically address badly shaped elements of the model mesh, such as polygons with narrow shapes, markedly irregular contours and/or a centroid outside the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
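
    The convexity index mentioned above is commonly defined as the ratio of a polygon's area to the area of its convex hull. A minimal sketch, assuming the shapely package and an illustrative threshold (the study's actual threshold is not given in this record):

        from shapely.geometry import Polygon

        def convexity_index(poly: Polygon) -> float:
            """Ratio of polygon area to convex-hull area; 1.0 for convex shapes."""
            return poly.area / poly.convex_hull.area

        THRESHOLD = 0.8  # hypothetical cut-off for flagging badly shaped HRUs

        hru = Polygon([(0, 0), (4, 0), (4, 1), (1, 1), (1, 4), (0, 4)])  # L-shape
        if convexity_index(hru) < THRESHOLD:
            print("badly shaped HRU, candidate for re-meshing:",
                  round(convexity_index(hru), 2))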

  12. Automatic Generation of Just-in-Time Online Assessments from Software Design Models

    Science.gov (United States)

    Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.

    2009-01-01

    Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…

  13. A Reconfigurable Simulation-Based Test System for Automatically Assessing Software Operating Skills

    Science.gov (United States)

    Su, Jun-Ming; Lin, Huan-Yu

    2015-01-01

    In recent years, software operating skills, the ability in computer literacy to solve problems using specific software, has become much more important. A great deal of research has also proven that students' software operating skills can be efficiently improved by practicing customized virtual and simulated examinations. However, constructing…

  14. System and Component Software Specification, Run-time Verification and Automatic Test Generation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The following background technology is described in Part 5: Run-time Verification (RV), White Box Automatic Test Generation (WBATG). Part 5 also describes how WBATG...

  15. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    Full Text Available In the field of noninvasive sensing techniques for civil infrastructures monitoring, this paper addresses the problem of crack detection, in the surface of the French national roads, by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution is about fine-defect detection in pavement surface. The approach is based on a multi-scale extraction and a Markovian segmentation. Third, an evaluation and comparison protocol which has been designed for evaluating this difficult task—the road pavement crack detection—is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.
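
    As a toy illustration of enhancing dark, thin defects at several scales (not the authors' pipeline: an OpenCV black-hat filter stands in for their multi-scale extraction step, and a global threshold replaces the Markovian segmentation):

        import cv2
        import numpy as np

        def crack_map(gray: np.ndarray, scales=(3, 7, 15)) -> np.ndarray:
            """Accumulate black-hat responses over several kernel sizes."""
            acc = np.zeros(gray.shape, dtype=np.float32)
            for k in scales:
                kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (k, k))
                # Black-hat highlights dark details thinner than the kernel.
                acc += cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT,
                                        kernel).astype(np.float32)
            acc /= acc.max() if acc.max() > 0 else 1.0
            return (acc > 0.3).astype(np.uint8) * 255  # crude stand-in threshold

        img = cv2.imread("pavement.png", cv2.IMREAD_GRAYSCALE)  # hypothetical input
        if img is not None:
            cv2.imwrite("cracks.png", crack_map(img))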

  16. Practical Software Measurement: Measuring for Process Management and Improvement

    National Research Council Canada - National Science Library

    Florac, William

    1997-01-01

    This guidebook shows how well-established principles and methods for evaluating and controlling process performance can be applied in software settings to help achieve an organization's business and technical goals...

  17. Development methodology for the software life cycle process of the safety software

    International Nuclear Information System (INIS)

    Kim, D. H.; Lee, S. S.; Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B.

    2002-01-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model is selected as a hybrid model mixing waterfall, prototyping, and spiral models, and is composed of two stages: the development of the ESF-CCS prototype and of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available Organizational Process Assets (OPAs) are applied to the SLC Activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities expected by the Regulatory Authority

  18. Development methodology for the software life cycle process of the safety software

    Energy Technology Data Exchange (ETDEWEB)

    Kim, D. H.; Lee, S. S. [BNF Technology, Taejon (Korea, Republic of); Cha, K. H.; Lee, C. S.; Kwon, K. C.; Han, H. B. [KAERI, Taejon (Korea, Republic of)

    2002-05-01

    A methodology for developing software life cycle processes (SLCP) is proposed to successfully develop the digital safety-critical Engineered Safety Features - Component Control System (ESF-CCS). The software life cycle model is selected as a hybrid model mixing waterfall, prototyping, and spiral models, and is composed of two stages: the development of the ESF-CCS prototype and of the ESF-CCS itself. To produce the software life cycle (SLC) for the Development of the Digital Reactor Safety System, the Activities referenced in IEEE Std. 1074-1997 are mapped onto the hybrid model. The SLCP is established after the available Organizational Process Assets (OPAs) are applied to the SLC Activities and the known constraints are reconciled. The established SLCP describes well the software life cycle activities expected by the Regulatory Authority.

  19. Developing free software for automatic registration during the quality control of IMRT with films; Desarrollo de un software gratuito para el registro automatico durante el control de calidad de la IMRT con peliculas

    Energy Technology Data Exchange (ETDEWEB)

    Moral, F. del; Meilan, E.; Pereira, L.; Salvador, F.; Munoz, V.; Salgado, M.

    2011-07-01

    In this work, as part of the commissioning of e-JMRT, a Monte Carlo calculation network for IMRT planning, software has been developed for the automatic registration of the film image with the results of the planning system.

  20. Green Software Engineering Adaption In Requirement Elicitation Process

    Directory of Open Access Journals (Sweden)

    Umma Khatuna Jannat

    2015-08-01

    Full Text Available Green software engineering is a recent approach that investigates environmental concerns in software, that is, green software systems. It is now widely accepted that green software can fit all processes of software development, including the requirements elicitation process. Nowadays software companies use requirements elicitation techniques extensively, because this process plays an increasingly important role in software development, and at present most requirements elicitation processes are improved by using various techniques and tools. The intention of this research is therefore to suggest adapting green software engineering to existing elicitation techniques and to recommend suitable actions for improvement. This research involved qualitative data. Using a few keywords, IEEE, ACM, Springer, Elsevier, Google Scholar, Scopus and Wiley were searched for articles published from 2010 until 2016. From the literature review, 15 traditional requirements elicitation factors and 23 improvement techniques for conversion to green engineering were identified. Lastly, the paper includes a short review of the literature, a description of the grounded theory, and some identified issues related to the finding that requirements elicitation improvement techniques are needed.

  1. Seven Processes that Enable NASA Software Engineering Technologies

    Science.gov (United States)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software is appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  2. Ten steps to successful software process improvement

    Science.gov (United States)

    Kandt, R. K.

    2003-01-01

    This paper identifies ten steps for managing change that address organizational and cultural issues. Four of these steps are critical: if they are not done, failure is almost guaranteed. This ten-step program emphasizes the alignment of business goals, change process goals, and the work performed by the employees of an organization.

  3. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer.

    Science.gov (United States)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-09-18

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  4. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    Directory of Open Access Journals (Sweden)

    La Macchia Mariangela

    2012-09-01

    Full Text Available Abstract Purpose To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Methods and materials Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. Results The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all the three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). Conclusions From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.
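
    The agreement indexes used in both versions of this study are straightforward to compute from binary masks. A minimal sketch with numpy; the definitions of sensitivity and inclusiveness assumed here (share of the reference covered, and share of the automatic contour falling inside the reference) are common ones and may differ in detail from the paper's:

        import numpy as np

        def overlap_indexes(auto: np.ndarray, ref: np.ndarray) -> dict:
            """Agreement between an auto-contoured and a reference binary mask."""
            a, r = auto.astype(bool), ref.astype(bool)
            inter = np.logical_and(a, r).sum()
            return {
                "dice": 2 * inter / (a.sum() + r.sum()),     # overlap coefficient
                "sensitivity": inter / r.sum(),              # share of ref covered
                "inclusiveness": inter / a.sum(),            # share of auto inside ref
                "volume_diff": int(a.sum()) - int(r.sum()),  # voxel count difference
            }

        auto = np.zeros((10, 10), bool); auto[2:7, 2:7] = True
        ref = np.zeros((10, 10), bool); ref[3:8, 3:8] = True
        print(overlap_indexes(auto, ref))  # dice = 0.64 for these toy masks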

  5. Earth Observation Services (Image Processing Software)

    Science.gov (United States)

    1992-01-01

    San Diego State University and Environmental Systems Research Institute, with other agencies, have applied satellite imaging and image processing techniques to geographic information systems (GIS) updating. The resulting images display land use and are used by a regional planning agency for applications like mapping vegetation distribution and preserving wildlife habitats. The EOCAP program provides government co-funding to encourage private investment in, and to broaden the use of NASA-developed technology for analyzing information about Earth and ocean resources.

  6. Childhood trauma exposure disrupts the automatic regulation of emotional processing.

    Science.gov (United States)

    Marusak, Hilary A; Martin, Kayla R; Etkin, Amit; Thomason, Moriah E

    2015-03-13

    Early-life trauma is one of the strongest risk factors for later emotional psychopathology. Although research in adults highlights that childhood trauma predicts deficits in emotion regulation that persist decades later, it is unknown whether neural and behavioral changes that may precipitate illness are evident during formative, developmental years. This study examined whether automatic regulation of emotional conflict is perturbed in a high-risk urban sample of trauma-exposed children and adolescents. A total of 14 trauma-exposed and 16 age-, sex-, and IQ-matched comparison youth underwent functional MRI while performing an emotional conflict task that involved categorizing facial affect while ignoring an overlying emotion word. Engagement of the conflict regulation system was evaluated at neural and behavioral levels. Results showed that trauma-exposed youth failed to dampen dorsolateral prefrontal cortex activity and engage amygdala-pregenual cingulate inhibitory circuitry during the regulation of emotional conflict, and were less able to regulate emotional conflict. In addition, trauma-exposed youth showed greater conflict-related amygdala reactivity that was associated with diminished levels of trait reward sensitivity. These data point to a trauma-related deficit in automatic regulation of emotional processing, and increase in sensitivity to emotional conflict in neural systems implicated in threat detection. Aberrant amygdala response to emotional conflict was related to diminished reward sensitivity that is emerging as a critical stress-susceptibility trait that may contribute to the emergence of mental illness during adolescence. These results suggest that deficits in conflict regulation for emotional material may underlie heightened risk for psychopathology in individuals that endure early-life trauma.

  7. CATS, continuous automated testing of seismological, hydroacoustic, and infrasound (SHI) processing software.

    Science.gov (United States)

    Brouwer, Albert; Brown, David; Tomuta, Elena

    2017-04-01

    To detect nuclear explosions, waveform data from over 240 SHI stations world-wide flows into the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), located in Vienna, Austria. A complex pipeline of software applications processes this data in numerous ways to form event hypotheses. The software codebase comprises over 2 million lines of code, reflects decades of development, and is subject to frequent enhancement and revision. Since processing must run continuously and reliably, software changes are subjected to thorough testing before being put into production. To overcome the limitations and cost of manual testing, the Continuous Automated Testing System (CATS) has been created. CATS provides an isolated replica of the IDC processing environment, and is able to build and test different versions of the pipeline software directly from code repositories that are placed under strict configuration control. Test jobs are scheduled automatically when code repository commits are made. Regressions are reported. We present the CATS design choices and test methods. Particular attention is paid to how the system accommodates the individual testing of strongly interacting software components that lack test instrumentation.

  8. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  9. Automatic processing of unattended object features by functional connectivity

    Directory of Open Access Journals (Sweden)

    Katja Martina Mayer

    2013-05-01

    Full Text Available Observers can selectively attend to object features that are relevant for a task. However, unattended task-irrelevant features may still be processed and possibly integrated with the attended features. This study investigated the neural mechanisms for processing both task-relevant (attended) and task-irrelevant (unattended) object features. The Garner paradigm was adapted for functional magnetic resonance imaging (fMRI) to test whether specific brain areas process the conjunction of features or whether multiple interacting areas are involved in this form of feature integration. Observers attended to shape, colour, or non-rigid motion of novel objects while unattended features changed from trial to trial (change blocks) or remained constant (no-change blocks) during a given block. This block manipulation allowed us to measure the extent to which unattended features affected neural responses, which would reflect the extent to which multiple object features are automatically processed. We did not find Garner interference at the behavioural level. However, we designed the experiment to equate performance across block types so that any fMRI results could not be due solely to differences in task difficulty between change and no-change blocks. Attention to specific features localised several areas known to be involved in object processing. No area showed larger responses on change blocks compared to no-change blocks. However, psychophysiological interaction analyses revealed that several functionally-localised areas showed significant positive interactions with areas in occipito-temporal and frontal areas that depended on block type. Overall, these findings suggest that both regional responses and functional connectivity are crucial for processing multi-featured objects.

  10. METODE ANALYTICAL HIERARCHY PROCESS: SISTEM REKOMENDER DATABASE SOFTWARE

    Directory of Open Access Journals (Sweden)

    Doni Purnama Alam Syah

    2014-09-01

    Full Text Available Abstract - The rekomender database software selection system is an application that can be used to explore alternative database software selection strategies using the analytical hierarchy process (AHP) method. Such a rekomender system is needed by organizations with large data processing requirements, such as the IT Bureau of Bina Sarana Informatika: the expensive investment in providing Information Technology (IT) makes the bureau careful when selecting database software. This study focuses on a database software selection system based on the analytical hierarchy process (AHP), with the IT Bureau of Bina Sarana Informatika as a case study and its administrators as the observation unit. The study identified two (2) main criteria, technology and user, with MySQL, Oracle and SQL Server as alternative strategies. Testing the rekomender system showed MySQL to be the top priority in the selection of database software with a 41% weighting, followed by SQL Server with 39% and Oracle with 21%. The end result is that the IT Bureau of Bina Sarana Informatika can define alternative strategies before determining which database software to use, more effectively and efficiently.
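
    For illustration, the core of AHP is deriving a priority vector from a pairwise-comparison matrix, classically as its normalized principal eigenvector. A sketch with numpy; the Saaty-scale judgment values below are invented, not the study's data:

        import numpy as np

        def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
            """Priority vector = normalized principal eigenvector of the matrix."""
            vals, vecs = np.linalg.eig(pairwise)
            principal = np.real(vecs[:, np.argmax(np.real(vals))])
            return principal / principal.sum()

        # Illustrative pairwise comparisons of three alternatives under one
        # criterion (rows/columns: MySQL, SQL Server, Oracle).
        alternatives = np.array([
            [1.0, 3.0, 2.0],
            [1/3, 1.0, 2.0],
            [1/2, 1/2, 1.0],
        ])
        print(ahp_weights(alternatives).round(2))  # weights sum to 1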

  11. Automated pre-processing and multivariate vibrational spectra analysis software for rapid results in clinical settings

    Science.gov (United States)

    Bhattacharjee, T.; Kumar, P.; Fillipe, L.

    2018-02-01

    Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. Their potential for detecting varied pathological conditions is regularly reported. However, to prove their applicability in clinics, large multi-center multi-national studies need to be undertaken, and these will result in enormous amounts of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to the required multivariate analysis, is warranted in order to obtain results in real time. This study reports a MATLAB-based script that can automatically import data, preprocess spectra (interpolation, derivatives, normalization), and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs, all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to the Minitab 16 software. The software can be used to import a variety of file extensions, .asc, .txt, .xls, and many others. Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data, but also for obtaining rapid results when these tools get translated into clinics.
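
    A minimal sketch of the pipeline the abstract describes (pre-processing followed by PCA and LDA on the first 10 principal components), using scipy and scikit-learn on synthetic stand-in data rather than the authors' MATLAB code:

        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def preprocess(spectra: np.ndarray) -> np.ndarray:
            """First derivative (Savitzky-Golay) plus vector normalization."""
            deriv = savgol_filter(spectra, window_length=9, polyorder=3,
                                  deriv=1, axis=1)
            return deriv / np.linalg.norm(deriv, axis=1, keepdims=True)

        # Synthetic stand-in data: 60 spectra x 400 wavenumbers, two classes.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 400)); y = np.repeat([0, 1], 30)
        X[y == 1] += np.linspace(0.0, 0.5, 400)  # small class difference

        scores = PCA(n_components=10).fit_transform(preprocess(X))
        lda = LinearDiscriminantAnalysis().fit(scores, y)
        print("training accuracy:", lda.score(scores, y))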

  12. Design of automatic control and measurement software for radioactive aerosol continuity monitor

    International Nuclear Information System (INIS)

    Mao Yong; Li Aiwu

    1997-01-01

    Continuous measurement of radioactive aerosols is very important for the development of the nuclear industry, and it is the major method for detecting leakage of radioactive material. The radioactive aerosol continuity monitor is an advanced instrument for such continuous measurement. With the development of the nuclear industry and nuclear power stations, it is necessary to design an automatic continuous measurement device. For this reason, the authors developed the first radioactive aerosol continuity monitor, which passed the ministry appraisal. The design concept and methods of automatic control and measurement for the radioactive aerosol continuity monitor are discussed

  13. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA

  14. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an
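
    Lane segmentation, the first step of the workflow described in both records above, can be illustrated by thresholding the column intensity profile of the gel image (a simplistic stand-in for GELect's method, which also handles lane distortion):

        import numpy as np

        def find_lanes(gel: np.ndarray, min_gap: int = 5) -> list:
            """Split a gel image into lanes using the column intensity profile."""
            profile = gel.sum(axis=0).astype(float)
            active = profile > 0.5 * profile.mean()  # columns containing a lane
            lanes, start = [], None
            for x, on in enumerate(active):
                if on and start is None:
                    start = x
                elif not on and start is not None:
                    if x - start >= min_gap:
                        lanes.append((start, x))
                    start = None
            if start is not None:
                lanes.append((start, len(active)))
            return lanes

        # Synthetic gel: two bright lanes on a dark background.
        gel = np.zeros((100, 60)); gel[:, 10:20] = 200; gel[:, 35:45] = 180
        print(find_lanes(gel))  # -> [(10, 20), (35, 45)]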

  15. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth surface it provides and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches strive towards automatic detection of landslides to speed up the generation of landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good
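
    Step (1) of that workflow, the derivation of land surface parameters, can be sketched with numpy and scipy directly on a DTM array (illustrative definitions; the actual approach would call GRASS or SAGA modules):

        import numpy as np
        from scipy.ndimage import uniform_filter

        def slope_deg(dtm: np.ndarray, cellsize: float = 1.0) -> np.ndarray:
            """Slope in degrees via finite differences of the elevation grid."""
            dzdy, dzdx = np.gradient(dtm, cellsize)
            return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

        def roughness(dtm: np.ndarray) -> np.ndarray:
            """Local standard deviation of elevation in a 3x3 window."""
            mean = uniform_filter(dtm, size=3)
            mean_sq = uniform_filter(dtm ** 2, size=3)
            return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

        dtm = np.outer(np.linspace(0.0, 10.0, 50), np.ones(50))  # synthetic tilted plane
        print(round(float(slope_deg(dtm).mean()), 2), float(roughness(dtm).max()))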

  16. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze the cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall, as new processes are becoming more widely used to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  17. Automatic insertion of simulated microcalcification clusters in a software breast phantom

    Science.gov (United States)

    Shankla, Varsha; Pokrajac, David D.; Weinstein, Susan P.; DeLeo, Michael; Tuite, Catherine; Roth, Robyn; Conant, Emily F.; Maidment, Andrew D.; Bakic, Predrag R.

    2014-03-01

    An automated method has been developed to insert realistic clusters of simulated microcalcifications (MCs) into computer models of breast anatomy. This algorithm has been developed as part of a virtual clinical trial (VCT) software pipeline, which includes the simulation of breast anatomy, mechanical compression, image acquisition, image processing, display and interpretation. An automated insertion method has value in VCTs involving large numbers of images. The insertion method was designed to support various insertion placement strategies, governed by probability distribution functions (pdf). The pdf can be predicated on histological or biological models of tumor growth, or estimated from the locations of actual calcification clusters. To validate the automated insertion method, a 2-AFC observer study was designed to compare two placement strategies, undirected and directed. The undirected strategy could place a MC cluster anywhere within the phantom volume. The directed strategy placed MC clusters within fibroglandular tissue on the assumption that calcifications originate from epithelial breast tissue. Three radiologists were asked to select between two simulated phantom images, one from each placement strategy. Furthermore, questions were posed to probe the rationale behind the observer's selection. The radiologists found the resulting cluster placement to be realistic in 92% of cases, validating the automated insertion method. There was a significant preference for the cluster to be positioned on a background of adipose or mixed adipose/fibroglandular tissues. Based upon these results, this automated lesion placement method will be included in our VCT simulation pipeline.
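
    The directed placement strategy can be illustrated by drawing a cluster location from a probability distribution function that is non-zero only over fibroglandular voxels (the tissue labels and the uniform pdf here are assumptions for the sketch, not the published implementation):

        import numpy as np

        FIBROGLANDULAR = 2  # hypothetical tissue label in the phantom volume

        def sample_insertion_site(tissue: np.ndarray, rng=None) -> tuple:
            """Draw one cluster location from a pdf over fibroglandular voxels."""
            rng = rng or np.random.default_rng()
            # Directed strategy: uniform pdf over fibroglandular tissue only;
            # an undirected strategy would use a uniform pdf over the volume.
            pdf = (tissue == FIBROGLANDULAR).astype(float)
            pdf /= pdf.sum()
            flat_index = rng.choice(pdf.size, p=pdf.ravel())
            return np.unravel_index(flat_index, tissue.shape)

        phantom = np.random.default_rng(1).integers(0, 3, size=(40, 40, 40))
        print(sample_insertion_site(phantom))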

  18. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled, safety analysis codes, recognized for DOE-broad, safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: Establish SQA criteria for the safety analysis 'toolbox' codes. Implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plan/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  19. A Development Process for Enterprise Information Systems Based on Automatic Generation of the Components

    Directory of Open Access Journals (Sweden)

    Adrian ALEXANDRESCU

    2008-01-01

    Full Text Available This paper contains some ideas concerning the development of Enterprise Information Systems (EIS). It combines known elements from the software engineering domain with original elements that the author has conceived and experimented with. The author has pursued two major objectives: to use a simple description for the concepts of an EIS, and to achieve a rapid and reliable EIS development process with minimal cost. The first goal was achieved by defining some models that describe the conceptual elements of the EIS domain: entities, events, actions, states and attribute-domains. The second goal is based on a predefined architectural model for the EIS, on predefined analysis and design models for the elements of the domain, and finally on the automatic generation of the system components. The proposed methods do not depend on a particular programming language or database management system; they are general and may be applied to any combination of such technologies.
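
    As a toy illustration of generating components from a simple domain description (the description format and templates below are invented for the sketch; the paper's models and generator are not specified in this record):

        from string import Template

        ENTITY_TEMPLATE = Template(
            "class $name:\n"
            "    def __init__(self, $args):\n"
            "$assigns"
        )

        def generate_entity(name: str, attributes: list) -> str:
            """Emit a Python class for one domain entity."""
            args = ", ".join(attributes)
            assigns = "".join(f"        self.{a} = {a}\n" for a in attributes)
            return ENTITY_TEMPLATE.substitute(name=name, args=args, assigns=assigns)

        # Hypothetical domain description: entity name -> attribute names.
        domain = {"Customer": ["name", "address"], "Order": ["number", "date"]}
        print("\n".join(generate_entity(n, attrs) for n, attrs in domain.items()))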

  20. Measuring health care process quality with software quality measures.

    Science.gov (United States)

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model that measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally and externally. To address these problems, a new model was developed from software quality measures. We adopted the ISO/IEC 9126 software quality standard for health care processes. Then, JCIAS (Joint Commission International Accreditation Standards for Hospitals) measurable elements were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  1. Agile Processes in Software Engineering and Extreme Programming

    DEFF Research Database (Denmark)

    The volume constitutes the proceedings of the 18th International Conference on Agile Software Development, XP 2017, held in Cologne, Germany, in May 2017. The 14 full and 6 short papers presented in this volume were carefully reviewed and selected from 46 submissions. They were organized in topical sections named: improving agile processes; agile in organization; and safety critical software. In addition, the volume contains 3 doctoral symposium papers (from 4 papers submitted).

  2. Human Factors in Software Development Processes: Measuring System Quality

    DEFF Research Database (Denmark)

    Abrahão, Silvia; Baldassarre, Maria Teresa; Caivano, Danilo

    2016-01-01

    Software Engineering and Human-Computer Interaction look at the development process from different perspectives. They apparently use very different approaches, are inspired by different principles and address different needs. But they definitely have the same goal: to develop high quality software in the most effective way. The second edition of the workshop puts particular attention on the efforts of the two communities in enhancing system quality. The research question discussed is: who, what, where, when, why, and how should we evaluate?

  3. Evaluation of a Surveillance Review Software based on Automatic Image Summaries

    International Nuclear Information System (INIS)

    Rocchi, S.; Hadfi, G.; John, M.; Moeslinger, M.; Murray, J.; Juengling, K.; Sequeira, V.; Versino, C.; )

    2015-01-01

    Surveillance streams from safeguards instruments contain thousands of images. Inspectors review them in order to find safeguards-relevant events. Statistically a very small fraction of the images is expected to be safeguards-relevant. For this reason inspectors need a tool which helps them to focus their attention directly to the relevant parts of the surveillance stream. The current approach for surveillance review makes use of scene change detection within areas of interest (AOIs). The data reduction provided can be effective for the review of regular processes, and requires specific knowledge of the process/environment under review for the proper setting of the AOIs. The VideoZoom approach, developed by the European Commission Joint Research Centre-Institute for Transuranium Elements (JRC-ITU), detects scene changes on the whole image plane. Changes are then summarized and rendered at different levels of abstraction in four layers of summaries, each one revealing more information about the image changes. By means of a zooming interface, the reviewer is able to navigate the summary layers and decide which are to be examined with full photographic detail or skipped because they are clearly not safeguards-relevant. In this way reviewers can make best use of their time by investigating what really requires their attention. VideoZoom was evaluated by a group of IAEA inspectors on a benchmark of image reviews, with promising results in terms of identification of safeguards-relevant events, efficiency and usability. Following the positive results collected during the preliminary benchmark, the IAEA initiated a task under the European Commission Support Programme (EC SP), aimed at the research, development, and evaluation of surveillance review software based on VideoZoom and compatible with surveillance streams produced by NGSS cameras, the current safeguards surveillance technology deployed by the IAEA. This paper provides a description of the VideoZoom approach to

  4. Image processing in biodosimetry: A proposal of a generic free software platform.

    Science.gov (United States)

    Dumpelmann, Matthias; Cadena da Matta, Mariel; Pereira de Lemos Pinto, Marcela Maria; de Salazar E Fernandes, Thiago; Borges da Silva, Edvane; Amaral, Ademir

    2015-08-01

    The scoring of chromosome aberrations is the most reliable biological method for evaluating individual exposure to ionizing radiation. However, the microscopic analysis of human chromosome metaphases, generally employed to identify aberrations, mainly dicentrics (chromosomes with two centromeres), is a laborious task. The method is time consuming, and its application in biological dosimetry would be almost impossible in the case of a large-scale radiation incident. In this project, generic software for automatic chromosome image processing was enhanced from a framework originally developed for the Framework V project Simbio of the European Union, for applications in the area of source localization from electroencephalographic signals. The platform's capability is demonstrated by a study comparing automatic segmentation strategies for chromosomes from microscopic images.

  5. Improving the Product Documentation Process of a Small Software Company

    Science.gov (United States)

    Valtanen, Anu; Ahonen, Jarmo J.; Savolainen, Paula

    Documentation is an important part of the software process, even though it is often neglected in software companies. The eternal question is how much documentation is enough. In this article, we present a practical implementation of a lightweight product documentation process resulting from SPI efforts in a small company. Small companies' financial and human resources are often limited. The documentation process described here offers a template for creating adequate documentation while consuming a minimal amount of resources. The key element of the documentation process is an open source web-based bug-tracking system that was customized to be used as a documentation tool. The use of the tool enables iterative and well-structured documentation. The solution best serves the needs of a small company with off-the-shelf software products and striving for SPI.

  6. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning of an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you may be able to collect. One of the ways you can receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during testing of the software, during integration of the different components, or, even worse, problems occurring during production. Usually little time is spent on analyzing them, but with some multidimensional processing you can extract valuable information that may help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe here the extraction, transformation and load process and how the data were processed. The main goal is to assess the software process and get insights from this information.

  7. Software process improvement in CMS-are we different?

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2001-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise in our context means to evaluate and apply new technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards, while ensuring reproducibility and quality of results. The CMS process improvement effort is two-pronged. It aims at continuous improvement of the ways we do Object Oriented software, as well as continuous improvement in the efficiency of the working environment. In particular, the use and creation of de-facto software process standards within CMS has proven to be key to a successful software process improvement programme. The authors describe the successful CMS implementation of a software process improvement strategy that has followed ISO 15504 for three years. The authors give the current status of the most important process families formally established in CMS, and provide the guidelines followed both for tool development and for methodology establishment

  8. Automatic Control of Arc Process for Making Carbon Nanotubes

    Science.gov (United States)

    Scott, Carl D.; Pulumbarit, Robert B.; Victor, Joe

    2004-01-01

    An automatic-control system has been devised for a process in which carbon nanotubes are produced in an arc between a catalyst-filled carbon anode and a graphite cathode. The control system includes a motor-driven screw that adjusts the distance between the electrodes. The system also includes a bridge circuit that puts out a voltage proportional to the difference between (1) the actual value of potential drop across the arc and (2) a reference value between 38 and 40 V (corresponding to a current of about 100 A) at which the yield of carbon nanotubes is maximized. Utilizing the fact that the potential drop across the arc increases with the interelectrode gap, the output of the bridge circuit is fed to a motor-control circuit that causes the motor to move the anode toward or away from the cathode if the actual potential drop is more or less, respectively, than the reference potential. Thus, the system regulates the interelectrode gap to maintain the optimum potential drop. The system also includes circuitry that records the potential drop across the arc and the relative position of the anode holder as function of time.
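
    The control law amounts to keeping the arc voltage inside a band around the reference by moving the anode, since the potential drop grows with the gap. A software analogue of the bridge-and-motor circuit (the callables, deadband and step size are hypothetical; the actual system is analog hardware):

        import time

        SETPOINT_V = 39.0  # reference arc voltage, within the 38-40 V band
        DEADBAND_V = 0.5   # hypothetical: ignore small deviations
        STEP_MM = 0.05     # hypothetical screw increment per correction

        def regulate_gap(read_arc_voltage, move_anode_mm):
            """Keep the arc voltage near the setpoint by adjusting the gap.

            read_arc_voltage: callable returning the measured arc voltage (V).
            move_anode_mm: callable moving the anode; positive = toward cathode.
            Runs until interrupted, like the hardware control loop it mimics.
            """
            while True:
                error = read_arc_voltage() - SETPOINT_V
                if error > DEADBAND_V:     # voltage high -> gap too wide -> close it
                    move_anode_mm(+STEP_MM)
                elif error < -DEADBAND_V:  # voltage low -> gap too narrow -> open it
                    move_anode_mm(-STEP_MM)
                time.sleep(0.1)            # control period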

  9. Automatic processing of CERN video, audio and photo archives

    Energy Technology Data Exchange (ETDEWEB)

    Kwiatek, M [CERN, Geneva (Switzerland)], E-mail: Michal.Kwiatek@cem.ch

    2008-07-15

    The digitalization of CERN audio-visual archives, a major task currently in progress, will generate over 40 TB of video, audio and photo files. Storing these files is one issue, but a far more important challenge is to provide long-time coherence of the archive and to make these files available on-line with minimum manpower investment. An infrastructure, based on standard CERN services, has been implemented, whereby master files, stored in the CERN Distributed File System (DFS), are discovered and scheduled for encoding into lightweight web formats based on predefined profiles. Changes in master files, conversion profiles or in the metadata database (read from CDS, the CERN Document Server) are automatically detected and the media re-encoded whenever necessary. The encoding processes are run on virtual servers provided on-demand by the CERN Server Self Service Centre, so that new servers can be easily configured to adapt to higher load. Finally, the generated files are made available from the CERN standard web servers with streaming implemented using Windows Media Services.

  10. Automatic processing of CERN video, audio and photo archives

    International Nuclear Information System (INIS)

    Kwiatek, M

    2008-01-01

    The digitisation of CERN's audio-visual archives, a major task currently in progress, will generate over 40 TB of video, audio and photo files. Storing these files is one issue, but a far more important challenge is to ensure the long-term coherence of the archive and to make these files available on-line with minimum manpower investment. An infrastructure, based on standard CERN services, has been implemented, whereby master files, stored in the CERN Distributed File System (DFS), are discovered and scheduled for encoding into lightweight web formats based on predefined profiles. Changes in master files, conversion profiles or in the metadata database (read from CDS, the CERN Document Server) are automatically detected and the media re-encoded whenever necessary. The encoding processes are run on virtual servers provided on-demand by the CERN Server Self Service Centre, so that new servers can easily be configured to adapt to higher load. Finally, the generated files are made available from the CERN standard web servers, with streaming implemented using Windows Media Services.

  11. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low-cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than a year and a half by users with different applications. It proved to be an excellent tool for helping people adapt to the system, and for standardizing and exchanging software, while preserving the flexibility to allow for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  12. Adapting the unified software development process for user interface development

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2006-01-01

    In this paper we describe how existing software developing processes, such as Rational Unified Process, can be adapted in order to allow disciplined and more efficient development of user interfaces. The main objective of this paper is to demonstrate that standard modeling environments, based on the

  13. PROCESS SIMULATION IN SUPPLY CHAIN USING LOGWARE SOFTWARE

    OpenAIRE

    Sebastian Kot; Beata Ślusarczyk

    2009-01-01

    The authors present the basis of simulation usage in managerial decision support, focusing on supply chain processes. First the need for simulation is presented, then the advantages and disadvantages of simulation experiments and a juxtaposition of simulation tools. Finally, the possibilities of supply chain process simulation using Logware software are presented.

  14. Development of Flexible Software Process Lines with Variability Operations

    DEFF Research Database (Denmark)

    Dohrmann, Patrick; Schramm, Joachim; Kuhrmann, Marco

    2016-01-01

    the development of flexible software process lines. Method: We conducted a longitudinal study in which we studied 5 variants of the V-Modell XT process line for 2 years. Results: Our results show that the variability operation instrument is feasible in practice. We analyzed 616 operation exemplars addressing various...

  15. Software for event oriented processing on multiprocessor systems

    International Nuclear Information System (INIS)

    Fischler, M.; Areti, H.; Biel, J.; Bracker, S.; Case, G.; Gaines, I.; Husby, D.; Nash, T.

    1984-08-01

    Computing-intensive problems that require the processing of numerous essentially independent events are natural customers for large-scale multi-microprocessor systems. This paper describes the software required to support users with such problems in a multiprocessor environment. It is based on experience with, and development work aimed at, processing very large amounts of high energy physics data.
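
    Because the events are essentially independent, the workload is embarrassingly parallel. Below is a minimal sketch in Python, using a shared-memory process pool as a stand-in for the multi-microprocessor farm described in the record; the per-event reconstruct function is a hypothetical placeholder.

```python
# Minimal sketch of event-oriented parallel processing: each event is
# handled independently, so a pool of workers can map over the event list.
from multiprocessing import Pool

def reconstruct(event):
    # Stand-in for per-event physics processing.
    return sum(event)

def process_events(events, workers=4):
    with Pool(workers) as pool:
        return pool.map(reconstruct, events)

if __name__ == "__main__":
    print(process_events([[1, 2], [3, 4], [5, 6]]))
```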

  16. Algorithm of automatic generation of technology process and process relations of automotive wiring harnesses

    Institute of Scientific and Technical Information of China (English)

    XU Benzhu; ZHU Jiman; LIU Xiaoping

    2012-01-01

    Identifying each process and the constraint relations among them from complex wiring harness drawings quickly and accurately is the basis for formulating process routes. Based on knowledge of automotive wiring harnesses and the characteristics of wiring harness components, we established a graph model of the wiring harness. We then investigate an algorithm for identifying technology processes automatically. Finally, we describe the relationships between processes by introducing a constraint matrix, in order to lay a good foundation for harness process planning and production scheduling.
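
    The constraint matrix can be read as a precedence relation between processes, from which a feasible process route follows by topological sorting. Below is a minimal sketch with hypothetical process names, assuming the encoding C[i][j] == 1 means "process i must precede process j"; the paper's exact matrix semantics may differ.

```python
# Minimal sketch of deriving a feasible process order from a constraint
# matrix. Process names and matrix entries are illustrative assumptions.
processes = ["cut wire", "strip ends", "crimp terminal", "insert connector"]
C = [[0, 1, 1, 1],
     [0, 0, 1, 1],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]   # C[i][j] == 1: process i must precede process j

def feasible_order(C):
    """Kahn's topological sort over the precedence matrix."""
    n = len(C)
    indegree = [sum(C[i][j] for i in range(n)) for j in range(n)]
    ready = [j for j in range(n) if indegree[j] == 0]
    order = []
    while ready:
        i = ready.pop()
        order.append(i)
        for j in range(n):
            if C[i][j]:
                indegree[j] -= 1
                if indegree[j] == 0:
                    ready.append(j)
    return order  # process indices in a constraint-respecting order

print([processes[i] for i in feasible_order(C)])
```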

  17. Automatic Detection of Mitosis and Nuclei From Cytogenetic Images by CellProfiler Software for Mitotic Index Estimation.

    Science.gov (United States)

    González, Jorge Ernesto; Radl, Analía; Romero, Ivonne; Barquinero, Joan Francesc; García, Omar; Di Giorgio, Marina

    2016-12-01

    Mitotic Index (MI) estimation, expressed as the percentage of cells in mitosis, plays an important role as a quality control endpoint. To this end, MI is applied to check the lot of media and reagents to be used throughout the assay and also to check cellular viability after blood sample shipping, indicating satisfactory/unsatisfactory conditions for the progression of cell culture. The objective of this paper was to apply the CellProfiler open-source software for automatic detection of mitotic and nuclei figures from digitized images of cultured human lymphocytes for MI assessment, and to compare its performance with that of semi-automatic and visual detection. Lymphocytes were irradiated and cultured for mitosis detection. Sets of images from cultures were analyzed visually and the findings were compared with those obtained using the CellProfiler software. The CellProfiler pipeline detects nuclei and mitotic figures with 80% sensitivity and more than 99% specificity. We conclude that CellProfiler is a reliable tool for counting mitoses and nuclei in cytogenetic images; it saves considerable time compared to manual operation and reduces the variability derived from the scoring criteria of different scorers. The CellProfiler automated pipeline achieves good agreement with the visual counting workflow, i.e. it allows fully automated mitotic and nuclei scoring in cytogenetic images, yielding reliable information with minimal user intervention.
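
    Once mitotic figures and nuclei have been counted, the index itself is a single ratio. A minimal sketch follows, assuming "nuclei" counts all scored cells and "mitoses" the mitotic figures among them.

```python
# Minimal sketch of the mitotic index computed from the pipeline's counts.
def mitotic_index(mitoses: int, nuclei: int) -> float:
    """Mitotic index as the percentage of scored cells that are in mitosis."""
    if nuclei == 0:
        raise ValueError("no cells scored")
    return 100.0 * mitoses / nuclei

print(mitotic_index(mitoses=42, nuclei=1000))  # 4.2
```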

  18. Automatic Detection of Mitosis and Nuclei from Cytogenetic Images by CellProfiler Software for Mitotic Index Estimation

    International Nuclear Information System (INIS)

    Gonzalez, Jorge Ernesto; Romero, Ivonne; Garcia, Omar; Radl, Analia; Di Giorgio, Marina; Barquinero, Joan Francesc

    2016-01-01

    Mitotic Index (MI) estimation, expressed as the percentage of cells in mitosis, plays an important role as a quality control endpoint. To this end, MI is applied to check the lot of media and reagents to be used throughout the assay and also to check cellular viability after blood sample shipping, indicating satisfactory/unsatisfactory conditions for the progression of cell culture. The objective of this paper was to apply the CellProfiler open-source software for automatic detection of mitotic and nuclei figures from digitized images of cultured human lymphocytes for MI assessment, and to compare its performance with that of semi-automatic and visual detection. Lymphocytes were irradiated and cultured for mitosis detection. Sets of images from cultures were analyzed visually and the findings were compared with those obtained using the CellProfiler software. The CellProfiler pipeline detects nuclei and mitotic figures with 80% sensitivity and more than 99% specificity. We conclude that CellProfiler is a reliable tool for counting mitoses and nuclei in cytogenetic images; it saves considerable time compared to manual operation and reduces the variability derived from the scoring criteria of different scorers. The CellProfiler automated pipeline achieves good agreement with the visual counting workflow, i.e. it allows fully automated mitotic and nuclei scoring in cytogenetic images, yielding reliable information with minimal user intervention. (authors)

  19. Cloud Detours: A Non-intrusive Approach for Automatic Software Adaptation to the Cloud

    OpenAIRE

    Maia, Paulo; Vasconcelos, Michel; Mendonça, Nabor

    2015-01-01

    A major challenge facing cloud migration is the need to change a legacy (on-premise) application's source code so that it can better benefit from inherent cloud computing characteristics, such as resource elasticity and high scalability. When performed manually, those changes are error-prone and may require a great effort from application developers. This paper presents a novel approach to support organizations in automatically adapting their...

  20. A process improvement model for software verification and validation

    Science.gov (United States)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  1. The application of image processing software: Photoshop in environmental design

    Science.gov (United States)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position: it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvement. Many types of specialized design software for rendering environmental drawings and for artistic post-processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay explores how computer technology can play a bigger role in environmental design.

  2. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    Science.gov (United States)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at the development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications, through a series of case studies, with the focus on SMEs that develop web applications as management information systems rather than e-commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  3. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    Energy Technology Data Exchange (ETDEWEB)

    Wu, C [Sutter Medical Foundation, Roseville, CA (United States)

    2016-06-15

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed an electronic treatment plan reporting tool to enable fully automatic PDF reporting from Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, with no external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf generates a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, the Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, a DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and the DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above, for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients’ plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  4. SU-F-T-94: Plan2pdf - a Software Tool for Automatic Plan Report for Philips Pinnacle TPS

    International Nuclear Information System (INIS)

    Wu, C

    2016-01-01

    Purpose: To implement an automatic electronic PDF plan reporting tool for the Philips Pinnacle treatment planning system (TPS). Methods: We developed an electronic treatment plan reporting tool to enable fully automatic PDF reporting from Pinnacle TPS to external EMR programs such as MOSAIQ. The tool is named “plan2pdf”. plan2pdf is implemented using Pinnacle scripts, Java and UNIX shell scripts, with no external program needed. plan2pdf supports full auto-mode and manual-mode reporting. In full auto-mode, with a single mouse click, plan2pdf generates a detailed Pinnacle plan report in PDF format, which includes a customizable cover page, the Pinnacle plan summary, orthogonal views through each plan POI and the maximum dose point, a DRR for each beam, serial transverse views captured throughout the dose grid at a user-specified interval, and the DVH and scorecard windows. The final PDF report is also automatically bookmarked for each section above, for convenient plan review. The final PDF report can either be saved in a user-specified folder on Pinnacle, or automatically exported to an EMR import folder via a user-configured FTP service. In manual capture mode, plan2pdf allows users to capture any Pinnacle plan by full screen, individual window, or a rectangular ROI drawn on screen. Furthermore, to avoid possible mix-up of patients’ plans during auto-mode reporting, a user conflict check feature is included in plan2pdf: it prompts the user to wait if another patient is being exported by plan2pdf by another user. Results: plan2pdf was tested extensively and successfully at our institution, which consists of 5 centers, 15 dosimetrists and 10 physicists, running Pinnacle version 9.10 on Enterprise servers. Conclusion: plan2pdf provides a highly efficient, user-friendly and clinically proven platform for all Philips Pinnacle users to generate a detailed plan report in PDF format for external EMR systems.

  5. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code

  6. A process control software package for the SRS

    International Nuclear Information System (INIS)

    Atkins, V.R.; Poole, D.E.; Rawlinson, W.R.

    1980-03-01

    The development of software to give high-level access from application programs for monitoring and control of the Daresbury Synchrotron Radiation Source on a network-wide basis is described. This includes the design and implementation of the control system database, a special supervisor call and an 'executive'-type task handling all process input/output services for the 7/32 (which runs under OS/32-MT), and process control 'device driver' software for the 7/16 (run under OS/16-MT). (UK)

  7. Collaboration in Global Software Engineering Based on Process Description Integration

    Science.gov (United States)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  8. Why and how Mastering an Incremental and Iterative Software Development Process

    Science.gov (United States)

    Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe

    2004-06-01

    should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc... Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and technological point of view: - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way. - How the CMM approach can help by better formalizing Requirements Management and Planning processes. - How the Automatic Code Generation with "certified" tools (SCADE) can still dramatically shorten the development cycle. Then the presentation will conclude by showing an evaluation of the cost and planning reduction based on a pilot application, by comparing figures on two similar projects: one with the classical waterfall process, the other one with an iterative and incremental approach.

  9. APERO, AN OPEN SOURCE BUNDLE ADJUSTMENT SOFTWARE FOR AUTOMATIC CALIBRATION AND ORIENTATION OF SETS OF IMAGES

    Directory of Open Access Journals (Sweden)

    M. Pierrot Deseilligny

    2012-09-01

    IGN has developed a set of photogrammetric tools, APERO and MICMAC, for computing 3D models from sets of images. This software, developed initially for IGN's internal needs, is now delivered as open-source code. This paper focuses on the presentation of APERO, the orientation software. Compared to some other free software initiatives, it is probably more complex but also more complete; its targeted users are professionals (architects, archaeologists, geomorphologists) rather than the general public. APERO uses both a computer vision approach for estimating the initial solution and photogrammetry for a rigorous compensation of the total error; it has a large library of parametric distortion models allowing precise modelling of every kind of pinhole camera we know, including several fish-eye models; there are also several tools for geo-referencing the results. The results are illustrated on various applications, including the dataset of the 3D-Arch workshop.
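
    At its core, the rigorous compensation APERO performs is nonlinear least-squares minimisation of the reprojection error. The sketch below is a minimal single-camera version on synthetic data; the parametrisation and the SciPy solver are illustrative stand-ins, not APERO's implementation, which jointly refines many cameras, tie points and distortion models.

```python
# Minimal bundle-adjustment-style sketch: refine one camera's pose and
# focal length by least squares on reprojection residuals (synthetic data).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(params, pts3d):
    """Pinhole projection; params = [rx, ry, rz, tx, ty, tz, f]."""
    rvec, t, f = params[:3], params[3:6], params[6]
    cam = Rotation.from_rotvec(rvec).apply(pts3d) + t
    return f * cam[:, :2] / cam[:, 2:3]

def residuals(params, pts3d, obs2d):
    return (project(params, pts3d) - obs2d).ravel()

# Synthetic scene: 50 points in front of a camera with a 1000 px focal length.
rng = np.random.default_rng(0)
pts3d = rng.uniform([-1, -1, 4], [1, 1, 8], (50, 3))
true = np.array([0.02, -0.01, 0.005, 0.1, -0.05, 0.2, 1000.0])
obs2d = project(true, pts3d) + rng.normal(0, 0.5, (50, 2))  # noisy pixels

fit = least_squares(residuals, x0=np.array([0, 0, 0, 0, 0, 0, 900.0]),
                    args=(pts3d, obs2d))
print(fit.x)  # recovered pose and focal length, close to `true`
```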

  10. Software tool for representation and processing of experimental data on high energy interactions of elementary particles

    International Nuclear Information System (INIS)

    Cherepanov, E.O.; Skachkov, N.B.

    2002-01-01

    A software tool has been developed for the detailed and clear display of information about the energy and spatial distributions of secondary particles produced in collisions of elementary particles. The components of the 4-momenta of the secondary particles are used as input; these data may be obtained from different parts of a physical detector (for example, from the calorimeter or tracker) as well as with the help of an event generator. The tool is intended for use on the Windows operating system and is developed on the basis of Borland Delphi. The mathematical architecture of the software tool allows the user to receive complete information without making additional calculations. The program automatically analyses the structure and distributions of the signals and displays the results in a transparent form which allows their quick analysis. Three-dimensional graphics as well as colour schemes based on intuitive associations are used to display the information. (author)
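
    A typical quantity such a tool derives from the input 4-momenta is the invariant mass of a group of secondary particles. Below is a minimal sketch of the underlying arithmetic; the tool itself is a Delphi GUI, and Python is used here only to illustrate the computation.

```python
# Minimal sketch: invariant mass from summed 4-momenta (E, px, py, pz),
# in natural units where c = 1.
import math

def invariant_mass(p4s):
    E  = sum(p[0] for p in p4s)
    px = sum(p[1] for p in p4s)
    py = sum(p[2] for p in p4s)
    pz = sum(p[3] for p in p4s)
    m2 = E * E - (px * px + py * py + pz * pz)
    return math.sqrt(max(m2, 0.0))  # guard against rounding below zero

# Two back-to-back 45 GeV photons reconstruct to a 90 GeV parent.
print(invariant_mass([(45.0, 0.0, 0.0, 45.0), (45.0, 0.0, 0.0, -45.0)]))
```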

  11. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  12. EOS MLS Level 2 Data Processing Software Version 3

    Science.gov (United States)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; Wang, Shuhui; Manney, Gloria L.

    2011-01-01

    This software accepts the EOS MLS calibrated measurements of microwave radiance products and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.
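
    The production mode described is a chunked master-worker pattern. Below is a minimal sketch using a local process pool as a stand-in for the cluster; the retrieve function and chunk contents are hypothetical.

```python
# Minimal sketch of the master/worker chunking pattern: a master splits the
# measurements into chunks and coordinates workers that process them.
from concurrent.futures import ProcessPoolExecutor

def retrieve(chunk):
    """Stand-in for the Level 2 retrieval run on one chunk of data."""
    return [x * 2 for x in chunk]

def run(measurements, chunk_size=100):
    chunks = [measurements[i:i + chunk_size]
              for i in range(0, len(measurements), chunk_size)]
    with ProcessPoolExecutor() as pool:  # the master coordinates the workers
        results = list(pool.map(retrieve, chunks))
    return [y for part in results for y in part]

if __name__ == "__main__":
    print(run(list(range(10)), chunk_size=3))
```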

  13. Software control and system configuration management - A process that works

    Science.gov (United States)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  14. STAR-GENERIS - a software package for information processing

    International Nuclear Information System (INIS)

    Felkel, L.

    1985-01-01

    Man-machine communication in electrical power plants is increasingly based on the capabilities of minicomputers. Rather than just displaying raw process data, more complex processing is done to aid operators by improving information quality. Advanced operator aids for nuclear power plants are, e.g., alarm reduction, disturbance analysis and expert systems. Operator aids use complex combinations and computations of plant signals, which have to be described in a formal and homogeneous way. The design of such computer-based information systems requires extensive software and engineering efforts. The STAR software concept reduces the software effort to a minimum by providing an advanced program package which facilitates the specification and implementation of the engineering know-how necessary for sophisticated operator aids. (orig./HP) [de

  15. Software/hardware distributed processing network supporting the Ada environment

    Science.gov (United States)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC for processing, VHSIC ASICs for high-speed, reliable inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  16. DEVELOPING A SPATIAL PROCESSING SERVICE FOR AUTOMATIC CALCULATION OF STORM INUNDATION

    Directory of Open Access Journals (Sweden)

    H. Jafari

    2017-09-01

    With the increase in urbanization, the surface of the earth and its climate are changing. These changes have resulted in more frequent flooding and storm inundation in urban areas. The challenges of flooding can be addressed through several computational procedures. Due to their numerous advantages, accessible web services are a proper format for determining storm inundation. Web services have facilitated the integration and interactivity of web applications. Such services make the interaction between machines more feasible, and they enable heterogeneous software systems to communicate with each other. A Web Processing Service (WPS) makes it possible to process spatial data in different formats. In this study, we developed a WPS to automatically calculate the amount of storm inundation caused by rainfall in urban areas. The method we used for calculating the storm inundation is based on a simplified hydrologic model which estimates the final status of inundation. The simulation process and the water transfer between subcatchments are carried out in sequence, without user interference. Implementing the processing functions in the form of web processing services makes it possible to reuse the services and apply them in other services. As a result, it avoids creating duplicate resources.
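
    The simplified hydrologic model can be pictured as filling each subcatchment to capacity and passing the excess downstream. The sketch below is a minimal illustration under that assumption, with hypothetical capacities and topology; the paper wraps a computation of this kind in an OGC Web Processing Service.

```python
# Minimal sketch: fill subcatchments in upstream-first order and pass the
# overflow downstream. Capacities, volumes and topology are assumptions.
def route_inundation(rain_volume, capacity, downstream):
    """capacity: {subcatchment: storage capacity};
    downstream: {subcatchment: next subcatchment downstream}."""
    stored = dict.fromkeys(capacity, 0.0)
    inflow = dict.fromkeys(capacity, rain_volume)
    flooded = 0.0  # water leaving the last subcatchment inundates the area
    for name in capacity:  # insertion order assumed upstream-first
        total = inflow[name]
        stored[name] = min(total, capacity[name])
        excess = total - stored[name]
        nxt = downstream.get(name)
        if nxt is not None:
            inflow[nxt] += excess
        else:
            flooded += excess
    return stored, flooded

print(route_inundation(5.0, {"A": 3.0, "B": 4.0, "C": 2.0},
                       {"A": "B", "B": "C"}))
```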

  17. Policy Process Editor for P3BM Software

    Science.gov (United States)

    James, Mark; Chang, Hsin-Ping; Chow, Edward T.; Crichton, Gerald A.

    2010-01-01

    A computer program enables generation, in the form of graphical representations of process flows with embedded natural-language policy statements, input to a suite of policy-, process-, and performance-based management (P3BM) software. This program (1) serves as an interface between users and the Hunter software, which translates the input into machine-readable form; and (2) enables users to initialize and monitor the policy-implementation process. This program provides an intuitive graphical interface for incorporating natural-language policy statements into business-process flow diagrams. Thus, the program enables users who dictate policies to intuitively embed their intended process flows as they state the policies, reducing the likelihood of errors and reducing the time between declaration and execution of policy.

  18. PRISM, Processing and Review Interface for Strong Motion Data Software

    Science.gov (United States)

    Kalkan, E.; Jones, J. M.; Stephens, C. D.; Ng, P.

    2016-12-01

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the U.S., calls for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. PRISM automates the processing of strong-motion records by providing batch-processing capabilities. The PRISM software is platform-independent (coded in Java), open-source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a graphical user interface (GUI) for manual review and processing. To facilitate the use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and GUI components) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X and Windows. PRISM was designed to be flexible and extensible in order to accommodate implementation of new processing techniques. Input to PRISM currently is limited to data files in the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS) V0 format, so that all retrieved acceleration time series need to be converted to this format. Output products include COSMOS V1, V2 and V3 files as: (i) raw acceleration time series in physical units with mean removed (V1), (ii) baseline-corrected and filtered acceleration, velocity, and displacement time series (V2), and (iii) response spectra, Fourier amplitude spectra and common earthquake-engineering intensity measures (V3). A thorough description of the record processing features supported by PRISM is presented with examples and validation results. All computing features have been
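
    One representative V1-to-V2 step named above is mean removal followed by integration of acceleration to velocity and displacement. A minimal sketch of that step alone follows; PRISM's actual processing adds filtering, baseline correction and quality-control logic.

```python
# Minimal sketch of a V1 -> V2 step: remove the mean from raw acceleration,
# then integrate (simple rectangle rule) to velocity and displacement.
import numpy as np

def v1_to_v2(accel, dt):
    """accel: acceleration samples (e.g. cm/s^2); dt: sampling interval (s)."""
    accel = np.asarray(accel, dtype=float)
    accel -= accel.mean()            # V1: time series with the mean removed
    vel = np.cumsum(accel) * dt      # integrate acceleration to velocity
    disp = np.cumsum(vel) * dt       # integrate velocity to displacement
    return accel, vel, disp

acc, vel, disp = v1_to_v2([0.0, 1.0, -1.0, 0.5, -0.5], dt=0.01)
```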

  19. The software for automatic creation of the formal grammars used by speech recognition, computer vision, editable text conversion systems, and some new functions

    Science.gov (United States)

    Kardava, Irakli; Tadyszak, Krzysztof; Gulua, Nana; Jurga, Stefan

    2017-02-01

    For more flexible environmental perception by artificial intelligence, supporting software modules are needed that can automate the creation of a specific language syntax and carry out further analysis for relevant decisions based on semantic functions. According to our proposed approach, it is possible to create couples of formal rules from given sentences (in the case of natural languages) or statements (in the case of special languages) with the help of computer vision, speech recognition or editable text conversion systems, for further automatic improvement. In other words, we have developed an approach by which the automation of the training process of artificial intelligence can be significantly improved, which as a result will give a higher level of self-developing skills, independent of users. On the basis of our approach we have developed a software demo version, which includes the algorithm and software code for implementing all of the above-mentioned components (computer vision, speech recognition and an editable text conversion system). The program is able to work in multi-stream mode and simultaneously create a syntax based on information received from several sources.

  20. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

    This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive processes, while the automatic associations concerning safety measured by an Implicit Association Test (IAT) reflect employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and on that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variance in safety behavior. Specifically, the safety behaviors of employees with a lower level of inhibitory control are influenced more by automatic associations, whereas those of employees with a higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and that the relative importance of these cognitive processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.
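
    The moderation analysis implied here amounts to regressing safety behavior on explicit attitude, implicit association and their interactions with inhibitory control. Below is a minimal sketch on synthetic data; the variable names and coefficients are illustrative, not the study's values.

```python
# Minimal sketch of a moderated regression: behavior ~ explicit + implicit
# + inhibition + explicit:inhibition + implicit:inhibition (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n = 200
explicit = rng.normal(size=n)   # self-reported safety attitude
implicit = rng.normal(size=n)   # IAT-measured automatic association
inhibit = rng.normal(size=n)    # inhibitory control
behavior = (0.3 * explicit + 0.3 * implicit
            + 0.2 * explicit * inhibit - 0.2 * implicit * inhibit
            + rng.normal(scale=0.5, size=n))

X = np.column_stack([np.ones(n), explicit, implicit, inhibit,
                     explicit * inhibit, implicit * inhibit])
coef, *_ = np.linalg.lstsq(X, behavior, rcond=None)
print(coef)  # opposite-signed interactions mirror the moderation pattern
```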

  1. Workflow-centred evaluation of an automatic lesion tracking software for chemotherapy monitoring by CT

    Energy Technology Data Exchange (ETDEWEB)

    Moltz, Jan Hendrik; Peitgen, Heinz-Otto [Fraunhofer MEVIS - Institute for Medical Image Computing, Bremen (Germany); D'Anastasi, Melvin [University Hospital Munich-Grosshadern, Department of Clinical Radiology, Muenchen (Germany); Kiessling, Andreas [University Hospital Giessen and Marburg, Department of Diagnostic Radiology, Marburg (Germany); Pinto dos Santos, Daniel [University Hospital Mainz, Department of Diagnostic and Interventional Radiology, Mainz (Germany); Schuelke, Christoph [University Hospital Muenster, Institute of Clinical Radiology, Muenster (Germany)

    2012-12-15

    In chemotherapy monitoring, an estimation of the change in tumour size is an important criterion for the assessment of treatment success. This requires a comparison between corresponding lesions in the baseline and follow-up computed tomography (CT) examinations. We evaluate the clinical benefits of an automatic lesion tracking tool that identifies the target lesions in the follow-up CT study and pre-computes the lesion volumes. Four radiologists performed volumetric follow-up examinations for 52 patients with and without lesion tracking. In total, 139 lung nodules, liver metastases and lymph nodes were given as target lesions. We measured reading time, inter-reader variability in lesion identification and volume measurements, and the amount of manual adjustments of the segmentation results. With lesion tracking, target lesion assessment time decreased by 38 % or 22 s per lesion. Relative volume difference between readers was reduced from 0.171 to 0.1. Segmentation quality was comparable with and without lesion tracking. Our automatic lesion tracking tool can make interpretation of follow-up CT examinations quicker and provide results that are less reader-dependent. (orig.)

  2. Combination of physical exercise and adenosine improves accuracy of automatic calculation of stress LVEF in gated SPECT using QGS software

    International Nuclear Information System (INIS)

    Tehranipour, N.; AL-Nahhas, A.; Towey, D.

    2005-01-01

    Combining exercise and adenosine during the stress phase of myocardial perfusion imaging (MPI) is known to reduce adverse effects and improve image quality. The aim of this study was to assess whether it can also improve the automatic calculation of left ventricular ejection fraction (LVEF) by the QGS software package during the stress phase of gated SPECT. One hundred patients who had stress gated SPECT were retrospectively included in this study. Gated data of those who had adenosine only (50 patients = group A) were compared with those obtained in another group of 50 patients who had added bicycle exercise (group B). All had an identical image acquisition protocol using 99mTc-tetrofosmin. Clinical adverse effects and changes in blood pressure (BP), heart rate (HR) and ECG were monitored. Visual assessment of subdiaphragmatic uptake and the accuracy of the automatic regions of interest (ROIs) drawn by the software were noted. ROIs that involved subdiaphragmatic uptake and resulted in a low LVEF were manually adjusted to include the left ventricle only, and the frequency of manual adjustment was noted. No significant difference was noted in age, sex, baseline BP and HR between groups A and B. Adverse effects occurred less often in group B than in group A (12% vs. 24%, p = 0.118). Maximum HR and BP achieved during stress were significantly higher in group B than in group A (p = 0.025 and p = 0.001, respectively). The number of patients who had faulty ROIs and low LVEF, and who needed manual adjustment of ROIs, was higher in group A than in group B (16% vs. 6%, p = 0.025). The values of LVEF showed significant improvement following manual adjustment of ROIs, increasing from a mean of 19.63 ± 15.96 to 62.13 ± 7.55 (p = 0.0001) and from 17.33 ± 9.5 to 49.67 ± 7.7 (p = 0.0014) in groups A and B, respectively. The addition of exercise to adenosine significantly improves the automatic calculation of LVEF by QGS software during gated SPECT and reduces the need for manual adjustment.

  3. Techniques and software architectures for medical visualisation and image processing

    NARCIS (Netherlands)

    Botha, C.P.

    2005-01-01

    This thesis presents a flexible software platform for medical visualisation and image processing, a technique for the segmentation of the shoulder skeleton from CT data and three techniques that make contributions to the field of direct volume rendering. Our primary goal was to investigate the use

  4. Using Knowledge Management to Revise Software-Testing Processes

    Science.gov (United States)

    Nogeste, Kersti; Walker, Derek H. T.

    2006-01-01

    Purpose: This paper aims to use a knowledge management (KM) approach to effectively revise a utility retailer's software testing process. This paper presents a case study of how the utility organisation's customer services IT production support group improved their test planning skills through applying the American Productivity and Quality Center…

  5. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  6. Induction Heating Process Design Using COMSOL Multiphysics Software

    Directory of Open Access Journals (Sweden)

    Andy Triwinarko

    2011-08-01

    Induction heating is a clean, environmentally friendly heating process because it is non-contact. There are many types of induction heating used in home appliances, but the technology is still new in Indonesia. The main areas of interest in induction heating design are the efficiency of energy usage and the choice of the plate material. COMSOL Multiphysics software can be used to simulate and estimate the induction heating process. Therefore, the software can be used to design an induction heating process with optimum efficiency. The properties of the induction heating design were also simulated and analyzed, such as the effect of the inductor width, the distance between inductors, and the conductive plate material. The results showed that a good induction heating design should have a short inductor width and distance, and use silicon carbide as the plate material with a high-frequency controller.

  7. Crafting a Software Process Improvement Approach - A Retrospective Systematization

    DEFF Research Database (Denmark)

    Kuhrmann, Marco

    2015-01-01

    Structured approaches are beneficial for successful software process improvement (SPI). However, process engineers often struggle with standardized SPI methods, such as capability maturity model integration (CMMI) or International Organization for Standardization (ISO) 15504, and complain about too generic or voluminous approaches or methods that are alien to the organizations in which SPI is conducted. Therefore, process engineers need to customize existing SPI models or develop new approaches for company-specific SPI programs. While conducting SPI in the context of the German V-Modell XT, we faced... and provide a structured reflection on our experiences from creating and adopting the Artifact-based Software Process Improvement & Management (ArSPI) model. We present the steps of the construction procedure, the validation, and the dissemination of the model. Furthermore, we detail the applied methods...

  8. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; and then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and a noisy background that often plagues automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
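
    Given the measured major and minor axial lengths, the volume step reduces to an ellipsoid-of-revolution formula. A minimal sketch follows, assuming the common approximation V = (pi/6) * major * minor^2; the paper's exact formula may differ.

```python
# Minimal sketch of the volume computation from the two measured axes.
import math

def spheroid_volume(major: float, minor: float) -> float:
    """Volume from major/minor axial lengths (same length unit)."""
    return math.pi / 6.0 * major * minor ** 2

print(spheroid_volume(major=520.0, minor=480.0))  # axis lengths, e.g. in um
```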

  9. Collaborative business processes for enhancing partnerships among software services providers

    Science.gov (United States)

    Heil Cancian, Maiara; Rabelo, Ricardo; Gresse von Wangenheim, Christiane

    2015-08-01

    Software services have represented a powerful approach to support the realisation of the service-oriented architecture (SOA) paradigm. Using open standards and facilitating systems projects, they have increasingly been used as a corporate architectural approach to create interoperable services-based software solutions that can more easily be reused and shared across disparate applications. In the context of software companies, most are small firms that have enormous difficulties staying competitive. One strategy to enhance their sustainability is to enlarge partnerships among them at a more valuable level by jointly offering (web) services-based solutions. However, their culture of collaboration is low, and partnerships are usually formed sporadically and with the same companies. This article presents an approach to support more intense collaboration among software companies to respond to business opportunities in a more agile way, joining capacities and capabilities which they would not have if they worked alone. This requires, however, some preparedness. From the perspective of business processes, they should understand how to carry out a collaboration more properly. This is essentially what this article is about. It presents a comprehensive list of collaborative business processes and base practices that can also act as a guide for service providers' managers to implement and manage the collaboration along its lifecycle. The processes have been validated and the results are discussed.

  10. Software factory techniques applied to Process Control at CERN

    CERN Multimedia

    Dutour, MD

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) – SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software...

  11. Software factory techniques applied to process control at CERN

    CERN Document Server

    Dutour, Mathias D

    2008-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical information seamlessly into deployable PLC (Programmable logic Controller) - SCADA (Supervisory Control And Data Acquisition) systems. This software factory delivers consistently high quality by reducing human error and repetitive tasks, and adapts to user specifications in a cost-efficient way. Hence, this production tool is designed to encapsulate and hide the PLC and SCADA target platforms, enabling the experts to focus on the business model rather than specific syntaxes and grammars. Based on industry standard software, ...

  12. eGrader, A Software Application that Automatically Scores Student Essays: with a Postscript on Ethical Complexities

    Directory of Open Access Journals (Sweden)

    Roxanne Byrne

    2010-12-01

    Online and traditional teachers face several instructional challenges with regard to assessing student learning. This paper focuses on a software application that automatically scores student essays. The first part gives a brief overview of three commercial automated essay scoring systems. It then describes the technical aspects of the machine grader developed by the authors, including an assessment of its performance. Although the statistical results were significant in finding a strong correlation between human and machine scorers and the other measures, follow-up non-quantitative evaluations led the researchers to discontinue using the eGrader. They concluded that while the eGrader's ability to measure objective evaluation criteria was successful, measuring subjective ideas proved to be more complex and problematic.

  13. Collaborative process improvement with examples from the software world

    CERN Document Server

    Yeakley, C

    2007-01-01

    Collaborative Process Improvement specifically addresses software companies that are interested in addressing quality in human terms. Using Collaborative Process Improvement techniques builds effective processes to deliver quality products; it helps readers relate to what quality means to the end-user and provides the essential tools and methods to integrate the face of the customer into the organization's day-to-day processes. It comes complete with real-world examples that are practical and understandable to professionals in every role of a company.

  14. Renal cortical volume measured using automatic contouring software for computed tomography and its relationship with BMI, age and renal function

    International Nuclear Information System (INIS)

    Muto, Natalia Sayuri; Kamishima, Tamotsu; Harris, Ardene A.; Kato, Fumi; Onodera, Yuya; Terae, Satoshi; Shirato, Hiroki

    2011-01-01

    Purpose: To evaluate the relationship between renal cortical volume, measured by automatic contouring software, and body mass index (BMI), age and renal function. Materials and methods: The study was performed in accordance with the institutional guidelines at our hospital. Sixty-four patients (34 men, 30 women), aged 19 to 79 years, had their CT scans for diagnosis or follow-up of hepatocellular carcinoma retrospectively examined on a computer workstation using software that automatically contours the renal cortex and the renal parenchyma. Body mass index and estimated glomerular filtration rate (eGFR) were calculated based on the data collected. Statistical analysis was done using the Student t-test, multiple regression analysis, and the intraclass correlation coefficient (ICC). Results: The ICCs for total renal and renal cortical volumes were 0.98 and 0.99, respectively. Renal volume measurements yielded a mean cortical volume of 105.8 cm³ ± 28.4 SD, a mean total volume of 153 cm³ ± 39 SD and a mean medullary volume of 47.8 cm³ ± 19.5 SD. The correlations between body weight/height/BMI and both total renal and cortical volumes were r = 0.6, 0.6 and 0.4, respectively, p < 0.05, while the correlation between renal cortical volume and age was r = -0.3, p < 0.05. eGFR correlated with renal cortical volume with r = 0.6, p < 0.05. Conclusion: This study demonstrated that renal cortical volume had a moderate positive relationship with BMI, a moderate negative relationship with age, and a strong positive relationship with renal function, and provided a new method to routinely produce volumetric assessments of the kidney.

  15. Adult echocardiographic nomograms: overview, critical review and creation of a software for automatic, fast and easy calculation of normal values.

    Science.gov (United States)

    Cantinotti, Massimiliano; Giordano, Raffaele; Paterni, Marco; Saura, Daniel; Scalese, Marco; Franchi, Eliana; Assanta, Nadia; Koestenberg, Martin; Dulgheru, Raluca; Sugimoto, Tadafumi; Bernard, Anne; Caballero, Luis; Lancellotti, Patrizio

    2017-12-01

There is growing interest in normal adult echocardiographic values, and the introduction of new deformation imaging and 3D parameters poses the issue of normative data. A multitude of nomograms has recently been published; however, the data are often fragmentary and difficult to find, and their strengths and limitations have never been evaluated. Our aims were (I) to provide a review of current echocardiographic nomograms and (II) to generate a tool for easy and fast access to these data. A literature search was conducted accessing the National Library of Medicine using the keywords: 2D/3D echocardiography, strain, left/right ventricle, atrial, mitral/tricuspid valve, aorta, reference values/nomograms/normal values. Adding the following keywords, the results were further refined: range/intervals, myocardial velocity, strain rate and speckle tracking. Forty-one published studies were included. Our study reveals that sufficient normative data exist for several 2D/3D parameters; however, a few limitations still persist. For some basic parameters (i.e., mitral/tricuspid/pulmonary valves, great vessels) and for 3D valves, data are scarce. There is a lack of studies evaluating ethnic differences. Data have generally been expressed as mean values normalised for gender and age, instead of computing models that incorporate different variables (age/gender/body size) to calculate z-scores. To summarize the results, a software application (Echocardio-Normal Values), which automatically calculates the range of normality for a broad range of echocardiographic measurements according to age/gender/weight/height, has been generated. We provide an up-to-date and critical review of the strengths and limitations of current adult echocardiographic nomograms. Furthermore, we generated a software tool for automatic, easy and fast access to multiple echocardiographic normative data.
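A z-score against a nomogram is simply the measurement's distance from the model-predicted normal mean, in units of the residual standard deviation. The sketch below illustrates the calculation; the linear model and all coefficient values are illustrative placeholders, not values taken from the Echocardio-Normal Values software.

```python
# Minimal sketch of a z-score computation against a nomogram model.
# The regression coefficients below are placeholders, not values from
# the Echocardio-Normal Values software.

def zscore(measured: float, predicted_mean: float, residual_sd: float) -> float:
    """Standardized deviation of a measurement from its predicted normal mean."""
    return (measured - predicted_mean) / residual_sd

# Example: hypothetical linear model mean = a + b * BSA for an aortic diameter.
a, b, sd = 10.0, 12.5, 2.0            # placeholder coefficients (mm, mm/m^2, mm)
bsa = 1.8                             # body surface area in m^2
print(zscore(34.0, a + b * bsa, sd))  # -> 0.75
```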

  16. Automatic system of production, transfer and processing of coin targets for the production of metallic radioisotopes

    Science.gov (United States)

    Pellicioli, M.; Ouadi, A.; Marchand, P.; Foehrenbacher, T.; Schuler, J.; Dick-Schuler, N.; Brasse, D.

    2017-05-01

The work presented in this paper gathers three main technical developments aiming at (1) optimizing nuclide production by means of solid targets, (2) automatically transferring coin targets from the vault to the hotcell without human intervention, and (3) processing target dilution and purification in the hotcell automatically. The system has been installed on an ACSI TR24 cyclotron in Strasbourg, France.

  17. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

Full Text Available This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model-centric framework for integrated simulation, estimation, optimization and feedback control of polymerization processes. For the first time, it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization (ACOMP) system with a modern simulation, estimation and optimization software environment, towards an integrated scheme for the optimal operation of polymerization processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method across different systems and conditions. Subsequently, the off-line capabilities of the system were fully tested experimentally for model validation, parameter estimation and process optimization using ACOMP data. Experimental results are provided for the free radical solution polymerization of methyl methacrylate.

  18. Dosimetric validation for an automatic brain metastases planning software using single-isocenter dynamic conformal arcs.

    Science.gov (United States)

    Liu, Haisong; Li, Jun; Pappas, Evangelos; Andrews, David; Evans, James; Werner-Wasik, Maria; Yu, Yan; Dicker, Adam; Shi, Wenyin

    2016-09-08

An automatic brain-metastases planning (ABMP) software package has been installed in our institution. It is dedicated to treating multiple brain metastases with radiosurgery on linear accelerators (linacs), using a single setup isocenter with noncoplanar dynamic conformal arcs. This study validates the absolute dose and dose distribution calculated by ABMP. Three types of measurements were performed to validate the planning software: (1) dual micro ion chambers were used with an acrylic phantom to measure the absolute dose; (2) a 3D cylindrical phantom with a dual diode array was used to evaluate the 2D dose distribution and point doses for smaller targets; and (3) a 3D pseudo-in vivo patient-specific phantom filled with polymer gels was used to evaluate the accuracy of the 3D dose distribution and radiation delivery. Micro chamber measurements of two targets (volumes of 1.2 cc and 0.9 cc, respectively) showed that the percentage differences in absolute dose at both targets were less than 1%. The averaged gamma index (GI) passing rate of five different plans measured with the diode array phantom was above 98%, using criteria of 3% dose difference, 1 mm distance to agreement (DTA), and a 10% low-dose threshold. The 3D gel phantom measurements demonstrated a 3D displacement of the nine targets of 0.7 ± 0.4 mm (range 0.2-1.1 mm). The averaged two-dimensional (2D) GI passing rate for several regions of interest (ROIs) on axial slices, each encompassing one of the nine targets, was above 98% (5% dose difference, 2 mm DTA, and 10% low-dose threshold). The measured D95, the minimum dose that covers 95% of the target volume, of the nine targets was 0.7% less than the calculated D95. Three different types of dosimetric verification methods were used and proved the dose calculation of the new automatic brain metastases planning (ABMP) software clinically acceptable. The 3D pseudo-in vivo patient-specific gel phantom test also served as an end-to-end test for validating not only the dose calculation, but the

  19. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2012-08-22

NUCLEAR REGULATORY COMMISSION [NRC-2012-0195]. Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of Nuclear Power Plants. AGENCY: Nuclear Regulatory Commission. The draft guide (DG), entitled "Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of Nuclear Power Plants", supports the review of applications for permits and licenses.

  20. Understanding and Predicting the Process of Software Maintenance Releases

    Science.gov (United States)

    Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.

    1996-01-01

One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: (1) a predictive effort model developed for the FDD's software maintenance release process; (2) measurement-based lessons learned about the maintenance process in the FDD; and (3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.

  1. Versatile software for semiautomatic analysis and processing of laser-induced plasma spectra

    International Nuclear Information System (INIS)

    Mateo, M.P.; Nicolas, G.; Pinon, V.; Alvarez, J.C.; Ramil, A.; Yanez, A.

    2005-01-01

The present article describes the main characteristics and operation of SALIPS (software for the analysis of laser-induced plasma spectra), a computer program designed for use in spectroscopy. In recent years, laser-induced plasma spectroscopy (LIPS) has grown in popularity and different applications have been developed in several fields. However, until now no software has been reported that performs the recognition of the elemental composition of a generic sample from its LIP spectrum; this must be achieved by hand, in a tedious process of comparing experimental peaks with emission lines from databases. For this reason, a computer program has been developed that includes several tools to provide a semi-automatic identification of the peaks of a LIP spectrum. The program, written in Microsoft® Visual Basic® code, has a user-friendly graphical interface and is a flexible tool that enables the user to handle, edit, copy and print a quick presentation of the data, automatically including the identification results in the graph. SALIPS also provides some physical properties of the elements and includes algorithms for the simulation of spectra. The potential of the program is illustrated with some examples.
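The core of such semi-automatic identification is a tolerance match between experimental peak positions and tabulated emission lines. The following is a minimal sketch of that idea, not SALIPS itself; the line data and the tolerance value are illustrative.

```python
# Hedged sketch of peak-to-database matching: each experimental peak is
# compared against tabulated emission lines within a wavelength tolerance.
# Reference wavelengths below are illustrative, not from SALIPS.

EMISSION_LINES = {                 # element -> reference wavelengths (nm)
    "Fe": [238.20, 259.94, 371.99],
    "Cu": [324.75, 327.40],
}

def identify(peaks_nm, tol=0.05):
    """Assign each experimental peak to candidate elements within +/- tol nm."""
    matches = {}
    for p in peaks_nm:
        matches[p] = [el for el, lines in EMISSION_LINES.items()
                      if any(abs(p - line) <= tol for line in lines)]
    return matches

print(identify([324.76, 371.97, 500.00]))
# -> {324.76: ['Cu'], 371.97: ['Fe'], 500.0: []}
```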

  2. Radiation dose optimisation for conventional imaging in infants and newborns using automatic dose management software: an application of the new 2013/59 EURATOM directive.

    Science.gov (United States)

    Alejo, L; Corredoira, E; Sánchez-Muñoz, F; Huerga, C; Aza, Z; Plaza-Núñez, R; Serrada, A; Bret-Zurita, M; Parrón, M; Prieto-Areyano, C; Garzón-Moll, G; Madero, R; Guibelalde, E

    2018-04-09

Objective: The new 2013/59 EURATOM Directive (ED) demands dosimetric optimisation procedures without undue delay. The aim of this study was to optimise paediatric conventional radiology examinations, applying the ED without compromising clinical diagnosis. Automatic dose management software (ADMS) was used to analyse 2678 studies of children from birth to 5 years of age, obtaining local diagnostic reference levels (DRLs) in terms of entrance surface air kerma. Given that the local DRL for chest examinations in infants exceeded the European Commission (EC) DRL, an optimisation was performed by decreasing the kVp and applying automatic exposure control. To assess image quality, an analysis of high-contrast spatial resolution (HCSR), signal-to-noise ratio (SNR) and figure of merit (FOM) was performed, as well as a blind test based on the generalised estimating equations method. For chest examinations in newborns, the local DRL exceeded the EC DRL by 113%. After the optimisation, a reduction of 54% was obtained. No significant differences were found in the image quality blind test. A decrease in SNR (-37%) and HCSR (-68%), and an increase in FOM (42%), was observed. ADMS allows the fast calculation of local DRLs and the performance of optimisation procedures in babies without delay. However, physical and clinical analyses of image quality are still needed to ensure diagnostic integrity after the optimisation process. Advances in knowledge: ADMS is useful for detecting radiation protection problems and for performing optimisation procedures in paediatric conventional imaging without undue delay, as the ED requires.
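Local DRLs of this kind are conventionally set at the third quartile of the observed dose-quantity distribution across a patient sample. A minimal sketch of that derivation follows; the entrance surface air kerma values are made up for illustration.

```python
# Sketch of deriving a local diagnostic reference level (DRL) as the
# 75th percentile of entrance surface air kerma across a patient sample.
# All dose values are invented for illustration.

import numpy as np

esak_mgy = np.array([0.05, 0.06, 0.08, 0.09, 0.11, 0.12, 0.15, 0.20])  # mGy
local_drl = np.percentile(esak_mgy, 75)
print(f"local DRL: {local_drl:.3f} mGy")  # then compared against the EC DRL
```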

  3. Biometric Secured Result Processing Software For Nigerian Tertiary Institutions

    Directory of Open Access Journals (Sweden)

    Oladipo Oluwasegun

    2015-08-01

Full Text Available One of the challenges facing result processing in Nigerian tertiary institutions is insecurity. Untraceable changes are made to students' results, and this leads to various disasters, such as innocent people losing their jobs because their innocence cannot be proven. Biometric systems operate on behavioural and physiological biometric data to identify a person and grant the required access to a user. Physiological characteristics such as the fingerprint remain unchanged throughout an individual's lifetime, and can thus serve as a viable means of identifying and authenticating users who are to access a system. In this study, fingerprint-based result processing software is developed to ensure that users are well authenticated and are made to see only what they are pre-configured to see and work with. The fingerprint authentication system was developed using Visual Basic .NET. Staff fingerprints were enrolled into the system to form a biometric template, which the system validates against at every login attempt on the result processing software. The DigitalPersona One Touch ID SDK and other libraries were used in developing the authentication system. The result processing software also ensures that all write transactions to the database are confirmed and identified by forcing another biometric authentication at the point of making a write request to the web server and its associated database. This ensures that the person initiating the transaction is the same user who logged in to the application. The users identified at login and at the various confirmation milestones set for write transactions are logged into a table for future reference and audit trail. In conclusion, the developed system has helped to eradicate the problem of user impersonation by ensuring that only authorized users access the software and, in turn, participate in result processing activities.

  4. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

Full Text Available Background: Process performance analysis is a key step in implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided the basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. The environment as a whole will therefore be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  5. SU-E-T-253: Open-Source Automatic Software for Quantifying Biological Assays of Radiation Effects

    International Nuclear Information System (INIS)

    Detappe, A; Korideck, H; Makrigiorgos, G; Berbeco, R

    2014-01-01

Purpose: Clonogenic cell survival is a common assay for quantifying the effect of drugs and radiation. Manual counting of surviving colonies can take 30-90 seconds per plate, a major limitation for large studies. Currently available automatic counting tools are not easily modified for radiation oncology research. Our goal is to provide an open-source toolkit for precise, accurate and fast analysis of biological assays in radiation oncology. Methods: As an example analysis, we used HeLa cells incubated with gadolinium nanoparticles prior to irradiation. After treatment, the cells are grown for 14 days to allow for colony formation. To analyze the colony growth, we capture images of each dish for archiving and automatic computer-based analysis. A Fujifilm X20 camera is placed at the top of a box setup, 20 cm above the sample, which is backlit by an LED lamp placed at the bottom of the box. We use a Gaussian filter (width = 1.3 mm) and a color threshold (19-255). The minimum size for a colony to be counted is 1 mm. For this example, 20 dishes with a large range of colonies were analyzed. Each dish was counted 3 times manually by 3 different users and then compared to our counter. Results: Automatic counting of cell colonies takes an average of 7 seconds, enabling the analysis process to be accelerated 4-12 times. The average precision of the automatic counter was 1.7%. The Student t-test demonstrated non-significant differences between the two counting methods (p = 0.64). The ICC demonstrated the reliability of each method, with ICC > 0.999 (automatic) and ICC = 0.95 (manual). Conclusion: We developed an open-source automatic toolkit for the analysis of biological assays in radiation oncology and demonstrated the accuracy, precision and effort savings for clonogenic cell survival quantification. This toolkit is currently being used in two laboratories for routine experimental analysis and will be made freely available on our departmental website.
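The described pipeline (Gaussian smoothing, intensity thresholding, minimum-size filtering of connected regions) maps directly onto standard image-processing primitives. Below is a minimal sketch of that pipeline, assuming a grayscale image array; the sigma, threshold and minimum-area values are illustrative stand-ins for the paper's millimetre-calibrated parameters.

```python
# Hedged sketch of the counting pipeline: Gaussian smoothing, intensity
# thresholding, connected-component labeling, minimum-size filtering.
# Parameter values are illustrative, in pixels rather than millimetres.

import numpy as np
from scipy import ndimage

def count_colonies(img, sigma=2.0, lo=19, hi=255, min_area=50):
    """Count connected bright regions larger than min_area pixels."""
    smoothed = ndimage.gaussian_filter(img.astype(float), sigma=sigma)
    mask = (smoothed >= lo) & (smoothed <= hi)
    labels, n = ndimage.label(mask)                       # connected components
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.count_nonzero(areas >= min_area))

img = np.zeros((100, 100))
img[40:50, 40:50] = 200        # one synthetic colony
print(count_colonies(img))     # -> 1
```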

  6. CAPD Software Development for Automatic Piping System Design: Checking Piping Pocket, Checking Valve Level and Flexibility

    International Nuclear Information System (INIS)

    Ari Satmoko; Edi Karyanta; Dedy Haryanto; Abdul Hafid; Sudarno; Kussigit Santosa; Pinitoyo, A.; Demon Handoyo

    2003-01-01

One of several steps in industrial plant construction is preparing the piping layout drawing. In this drawing, pipes and all other items, such as instrumentation, equipment and structures, should be modelled. A software package called CAPD was developed to replace, and behave as, a piping drafter or designer. CAPD was successfully extended by adding the subprograms CHKUPIPE and CHKMANV. The first subprogram checks for, and gives a warning if there is, a piping pocket in the piping system. The second identifies the valve position and then checks whether the valve can be reached by an operator's hand. The main program CAPD was also successfully modified to be capable of limiting the maximum length of straight pipe. By limiting the length, piping flexibility can be increased. (author)
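As an illustration of the pocket check: a piping pocket is a local low point in the run where liquid can collect, so one simple formulation is to scan the elevation profile of the routed pipe for interior local minima. The sketch below is a hedged reconstruction of that idea, not CHKUPIPE itself; the profile values are invented.

```python
# Hedged sketch of a piping-pocket check: flag interior local minima
# in the elevation profile of a routed pipe. Data are illustrative.

def find_pockets(elevations):
    """Return indices of interior local minima in a pipe elevation profile."""
    return [i for i in range(1, len(elevations) - 1)
            if elevations[i] < elevations[i - 1]
            and elevations[i] < elevations[i + 1]]

profile = [5.0, 4.2, 3.8, 4.5, 4.1, 4.8]   # elevations (m) along the routing
print(find_pockets(profile))                # -> [2, 4]: pockets at nodes 2 and 4
```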

  7. A software tool for automatic classification and segmentation of 2D/3D medical images

    International Nuclear Information System (INIS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-01-01

Modern medical diagnosis utilizes techniques for the visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at the quantification of image texture. MaZda implements procedures for the evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda applications in medical studies are also provided.

  8. A software tool for automatic classification and segmentation of 2D/3D medical images

    Energy Technology Data Exchange (ETDEWEB)

    Strzelecki, Michal, E-mail: michal.strzelecki@p.lodz.pl [Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, 90-924 Lodz (Poland); Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur [Institute of Electronics, Technical University of Lodz, Wolczanska 211/215, 90-924 Lodz (Poland)

    2013-02-21

Modern medical diagnosis utilizes techniques for the visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at the quantification of image texture. MaZda implements procedures for the evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda applications in medical studies are also provided.

  9. A software defined RTU multi-protocol automatic adaptation data transmission method

    Science.gov (United States)

    Jin, Huiying; Xu, Xingwu; Wang, Zhanfeng; Ma, Weijun; Li, Sheng; Su, Yong; Pan, Yunpeng

    2018-02-01

The remote terminal unit (RTU) is the core device of monitoring systems in hydrology and water resources. Different devices often use different application-layer communication protocols, which makes information analysis and communication networking difficult. Therefore, we introduced the idea of software-defined hardware, abstracted the common features of the mainstream application-layer communication protocols of RTUs, and proposed a unified common protocol model. The various application-layer communication protocol algorithms are then modularized according to this model. The executable code of each algorithm is labelled by virtual functions and stored in the flash chips of the embedded CPU to form the protocol stack. Driven by the configuration commands that initialize the RTU communication system, the RTU can dynamically assemble and load the various application-layer communication protocols and complete the efficient transport of sensor data from the RTU to the central station, while the data acquisition protocol of the sensors and the various external communication terminals remain unchanged.
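The "software defined" idea here amounts to giving every application-layer protocol one abstract interface and selecting a concrete implementation at run time from a registry, so the sensor and transport layers never change. The sketch below is a hedged illustration of that pattern; the class names, registry key and JSON encoding are assumptions, not details of the cited system.

```python
# Hedged sketch of a run-time protocol registry behind a uniform interface,
# standing in for the flash-resident, virtual-function-labelled stack.

from abc import ABC, abstractmethod
import json

class AppProtocol(ABC):
    """Uniform interface abstracted from the application-layer protocols."""
    @abstractmethod
    def encode(self, readings: dict) -> bytes: ...
    @abstractmethod
    def decode(self, frame: bytes) -> dict: ...

class JsonProtocol(AppProtocol):          # hypothetical protocol module
    def encode(self, readings): return json.dumps(readings).encode()
    def decode(self, frame): return json.loads(frame.decode())

REGISTRY = {"json-v1": JsonProtocol}      # one entry per supported protocol

def load_protocol(name: str) -> AppProtocol:
    # Dynamic assembly step: pick the protocol named in the config command.
    return REGISTRY[name]()

proto = load_protocol("json-v1")
print(proto.decode(proto.encode({"water_level_m": 3.2})))
```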

  10. Coordination and organization of security software process for power information application environment

    Science.gov (United States)

    Wang, Qiang

    2017-09-01

As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of the security software process are discussed, as are the necessity and present significance of using such a process. Coordinating with the functional software, the process for security software and its testing are discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, submission and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper introduces the above process into a power information platform.

  11. Parallel-Processing Test Bed For Simulation Software

    Science.gov (United States)

    Blech, Richard; Cole, Gary; Townsend, Scott

    1996-01-01

The second-generation Hypercluster computing system is a multiprocessor test bed for research on parallel algorithms for simulation in fluid dynamics, electromagnetics, chemistry, and other fields with large computational requirements but relatively low input/output requirements. It is built from standard, off-the-shelf hardware that is readily upgraded as improved technology becomes available. The system is used for experiments with such parallel-processing concepts as message-passing algorithms, debugging software tools, and computational steering. The first-generation Hypercluster system is described in "Hypercluster Parallel Processor" (LEW-15283).

  12. SignalPlant: an open signal processing software platform

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Jurčo, Juraj; Halámek, Josef; Jurák, Pavel

    2016-01-01

    Roč. 37, č. 7 (2016), N38-N48 ISSN 0967-3334 R&D Projects: GA ČR GAP103/11/0933; GA MŠk(CZ) LO1212; GA ČR GAP102/12/2034 Institutional support: RVO:68081731 Keywords : data visualization * software * signal processing * ECG * EEG Subject RIV: FS - Medical Facilities ; Equipment Impact factor: 2.058, year: 2016

  13. Software factory techniques applied to process control at CERN

    OpenAIRE

    Dutour, Mathias D

    2007-01-01

    The CERN Large Hadron Collider (LHC) requires constant monitoring and control of quantities of parameters to guarantee operational conditions. For this purpose, a methodology called UNICOS (UNIfied Industrial COntrols Systems) has been implemented to standardize the design of process control applications. To further accelerate the development of these applications, we migrated our existing UNICOS tooling suite toward a software factory in charge of assembling project, domain and technical inf...

  14. The Implementation of Computer Data Processing Software for EAST NBI

    International Nuclear Information System (INIS)

    Zhang Xiaodan; Hu Chundong; Sheng Peng; Zhao Yuanzhe; Wu Deyun; Cui Qinglong

    2014-01-01

One of the most important project missions of the neutral beam injectors is the implementation of 100 s neutral beam injection (NBI) with high power energy into the plasma of the EAST superconducting tokamak. Correspondingly, it is necessary to construct a high-speed and reliable computer data processing system for processing experimental data, covering data acquisition, data compression and storage, data decompression and query, as well as data analysis. The implementation of the computer data processing software (CDPS) for EAST NBI is presented in this paper in terms of its functional structure and system realization. The software is programmed in C and runs on the Linux operating system, based on the TCP network protocol and multi-threading technology. The hardware mainly includes an industrial control computer (IPC), a data server, PXI DAQ cards and so on. This software has now been applied to the EAST NBI system, and experimental results show that the CDPS serves EAST NBI very well. (fusion engineering)
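The compress-then-store and decompress-on-query loop at the heart of such a pipeline can be illustrated in a few lines. This sketch uses zlib purely as a stand-in codec; the actual compression scheme of CDPS is not stated in the abstract.

```python
# Minimal sketch of the compress-store / decompress-query idea, with zlib
# as an assumed stand-in for whatever codec the C implementation uses.

import zlib

shot = b"\x00\x01" * 50_000                 # stand-in for one channel's samples
stored = zlib.compress(shot, level=6)       # acquisition side: compress, store
assert zlib.decompress(stored) == shot      # query side: decompress on demand
print(f"{len(shot)} B -> {len(stored)} B")
```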

  15. SignalPlant: an open signal processing software platform.

    Science.gov (United States)

    Plesinger, F; Jurco, J; Halamek, J; Jurak, P

    2016-07-01

The growing technical standard of acquisition systems allows the acquisition of large records, often reaching gigabytes or more in size, as is the case with whole-day electroencephalograph (EEG) recordings, for example. Although current 64-bit software for signal processing is able to process (e.g. filter, analyze) such data, visual inspection and labeling will probably suffer from rather long latency during the rendering of large portions of the recorded signals. For this reason, we have developed SignalPlant, a stand-alone application for signal inspection, labeling and processing. The main motivation was to supply investigators with a tool allowing fast and interactive work with the large multichannel records produced by EEG, electrocardiograph and similar devices. The rendering latency was compared with EEGLAB and proved significantly faster when displaying an image from a large number of samples (e.g. 163 times faster for 75 × 10^6 samples). The presented SignalPlant software is available for free and does not depend on any other computation software. Furthermore, it can be extended with plugins by third parties, ensuring its adaptability to future research tasks and new data formats.

  16. Software Development Process Changes in the Telecommunications Industry

    Directory of Open Access Journals (Sweden)

    John Kevin Doyle

    2006-06-01

    Full Text Available The tremendous changes in the telecommunications business in the last several years drove changes in the software development processes of telecommunications equipment providers. We compare changes in these very large projects, in two companies, with those proposed in the Theory of Constraints / Critical Chains, Extreme Programming, and Agile development models. The 2000s have been a time of significant challenge in the telecommunications equipment business. Telecommunications service providers have excess equipment capacity. Many are waiting for next generation telephone switches that will simultaneously lower operating costs and enable additional revenue generation. The large service providers have drastically reduced their capital and expense purchases. Many small service providers, particularly the dot-coms, went bankrupt; much of their equipment is on the secondary market, at a fraction of the original cost. Thus the equipment market has significantly shrunk, and the equipment providers have been reducing expenses, while continuing to deliver software and hardware equipment at the high quality level required by the service providers. This drove many changes in the software development process. While the process changes are reported in two telecommunication equipment development organizations, the changes are applicable in any product development organization.

  17. Automatic discovery of data-centric and artifact-centric processes

    NARCIS (Netherlands)

    Nooijen, E.H.J.; Dongen, van B.F.; Fahland, D.; La Rosa, M.; Soffer, P.

    2013-01-01

Process discovery is a technique that allows for automatically discovering a process model from recorded executions of a process as it happens in reality. This technique has successfully been applied to classical processes, where one process execution is constituted by a single case with a unique identifier.

  18. cisTEM, user-friendly software for single-particle image processing

    Science.gov (United States)

    2018-01-01

We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200k-300k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. PMID:29513216

  19. cisTEM, user-friendly software for single-particle image processing.

    Science.gov (United States)

    Grant, Timothy; Rohou, Alexis; Grigorieff, Nikolaus

    2018-03-07

We have developed new open-source software called cisTEM (computational imaging system for transmission electron microscopy) for the processing of data for high-resolution electron cryo-microscopy and single-particle averaging. cisTEM features a graphical user interface that is used to submit jobs, monitor their progress, and display results. It implements a full processing pipeline including movie processing, image defocus determination, automatic particle picking, 2D classification, ab-initio 3D map generation from random parameters, 3D classification, and high-resolution refinement and reconstruction. Some of these steps implement newly-developed algorithms; others were adapted from previously published algorithms. The software is optimized to enable processing of typical datasets (2000 micrographs, 200k-300k particles) on a high-end, CPU-based workstation in half a day or less, comparable to GPU-accelerated processing. Jobs can also be scheduled on large computer clusters using flexible run profiles that can be adapted for most computing environments. cisTEM is available for download from cistem.org. © 2018, Grant et al.

  20. In vitro motility evaluation of aggregated cancer cells by means of automatic image processing.

    Science.gov (United States)

    De Hauwer, C; Darro, F; Camby, I; Kiss, R; Van Ham, P; Decaesteker, C

    1999-05-01

Set-up of an automatic image-processing-based method that enables the motility of in vitro aggregated cells to be evaluated over a number of hours. Our biological model included the PC-3 human prostate cancer cell line growing as a monolayer on the bottom of Falcon plastic dishes containing conventional culture media. Our equipment consisted of an incubator, an inverted phase-contrast microscope, a charge-coupled device (CCD) video camera, and a computer equipped with image processing software developed in our laboratory. This computer-assisted microscope analysis of aggregated cells enables global cluster motility to be evaluated. The analysis also enables the trajectory of each cell to be isolated and parametrized within a given cluster, or indeed the trajectories of individual cells outside a cluster. The results show that motility inside a PC-3 cluster is not restricted to slight motion due to cluster expansion, but rather consists of marked cell movement within the cluster. The proposed equipment enables in vitro aggregated cell motility to be studied. This method can therefore be used in pharmacological studies in order to select anti-motility compounds. The compounds selected by the described equipment could then be tested in vivo as potential anti-metastatic agents.
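Isolating per-cell trajectories typically reduces to linking detected cell centroids across successive frames. The sketch below shows the simplest form of such linking, greedy nearest-neighbour assignment; it is a generic illustration under that assumption, not the laboratory's own algorithm.

```python
# Hedged sketch of frame-to-frame trajectory linking: each centroid in the
# previous frame is greedily assigned to the nearest unused centroid in the
# current frame, within a maximum gating distance. Coordinates are invented.

import math

def link(prev, curr, max_dist=20.0):
    """Greedy nearest-neighbour assignment of centroids between two frames."""
    links, used = {}, set()
    for i, p in enumerate(prev):
        best, best_d = None, max_dist
        for j, c in enumerate(curr):
            d = math.dist(p, c)
            if j not in used and d < best_d:
                best, best_d = j, d
        if best is not None:
            links[i] = best
            used.add(best)
    return links

print(link([(0, 0), (10, 10)], [(1, 1), (12, 9)]))  # -> {0: 0, 1: 1}
```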

  1. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix, clustered with the Self-Organizing Map (SOM) algorithm. We present a technique for the automatic inspection of the oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology to design trustworthy and robust early failure detection systems. (author)
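The feature-extraction step, texture attributes from a gray-level co-occurrence matrix (GLCM) followed by unsupervised clustering, can be sketched as follows. For a self-contained example, plain k-means stands in for the SOM, and the image patches are random noise; both substitutions are assumptions, not details from the paper.

```python
# Hedged sketch: GLCM texture attributes per patch, then unsupervised
# clustering (k-means here as a stand-in for the paper's SOM).

import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.cluster import KMeans

def texture_features(patch):
    """Contrast / homogeneity / energy from a co-occurrence matrix."""
    glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy")]

rng = np.random.default_rng(0)
patches = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(8)]
X = np.array([texture_features(p) for p in patches])
labels = KMeans(n_clusters=2, n_init=10).fit_predict(X)  # corroded vs. sound
print(labels)
```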

  2. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

Aiming at increasing the flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations, to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets, as a basic methodology for business process modeling, and an XML net-based software toolset providing comprehensive functionality for POIS development.

  3. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    Science.gov (United States)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: (1) SPC Collaboration Overview; (2) SPC Collaboration Approach and Results; and (3) Lessons Learned.

  4. Licensing process for safety-critical software-based systems

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P. [VTT Automation, Espoo (Finland); Korhonen, J. [VTT Electronics, Espoo (Finland); Pulkkinen, U. [VTT Automation, Espoo (Finland)

    2000-12-01

System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed, and to make the decision about granting the construction and operation permissions based on this evidence. The parties to the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), which can also acquire additional evidence by using its own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and the licensee when planning the licensing of individual applications

  5. Licensing process for safety-critical software-based systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Korhonen, J.; Pulkkinen, U.

    2000-12-01

System vendors nowadays propose software-based technology even for the most critical safety functions in nuclear power plants. Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of these systems. In the research project 'Programmable automation systems in nuclear power plants (OHA)', financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT), various safety assessment methods and tools for software-based systems are developed and evaluated. As part of the OHA work, a reference model for the licensing process for software-based safety automation systems is defined. The licensing process is defined as the set of interrelated activities whose purpose is to produce and assess evidence concerning the safety and reliability of the system/application to be licensed, and to make the decision about granting the construction and operation permissions based on this evidence. The parties to the licensing process are the authority, the licensee (the utility company), system vendors and their subcontractors, and possible external independent assessors. The responsibility for producing the evidence lies in the first place with the licensee, who in most cases relies heavily on vendor expertise. The evaluation and gauging of the evidence is carried out by the authority (possibly using external experts), which can also acquire additional evidence by using its own (independent) methods and tools. A central issue in the licensing process is to combine the quality evidence about the system development process with the information acquired through tests, analyses and operational experience. The purpose of the licensing process described in this report is to act as a reference model both for the authority and the licensee when planning the licensing of individual applications. Many of the

  6. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and on which of its attributes are more important than others. Although software quality…

  7. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely in the domain of a few expert users. One of the key reasons why PI has not taken off is that the PI tools have not become integral components of standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in taking rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other, more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  8. Study of automatic boat loading unit and horizontal sintering process of uranium dioxide pellet

    International Nuclear Information System (INIS)

    He Zhongjing; Chen Yu; Yao Dengfeng; Wang Youliang; Shu Binhua; Wu Genjiu

    2014-01-01

The sintering process is a key process in the manufacture of nuclear fuel UO2 pellets. In our factory, a continuous high-temperature sintering furnace is used for the sintering process. During the sintering of green pellets, the furnace, the boat and the stacking arrangement can influence the quality of the final product. In this paper, on the basis of earlier process research, the automatic boat loading unit and the horizontal sintering process are studied in turn. The results show that the physical and chemical properties of the products manufactured with the automatic boat loading unit and the horizontal sintering process completely meet the technical requirements, and that the system is reliable and continuous. (authors)

  9. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  10. EOS MLS Level 1B Data Processing Software. Version 3

    Science.gov (United States)

    Perun, Vincent S.; Jarnot, Robert F.; Wagner, Paul A.; Cofield, Richard E., IV; Nguyen, Honghanh T.; Vuu, Christina

    2011-01-01

This software is an improvement on Version 2, which was described in "EOS MLS Level 1B Data Processing, Version 2.2", NASA Tech Briefs, Vol. 33, No. 5 (May 2009), p. 34. It accepts the EOS MLS Level 0 science/engineering data and the EOS Aura spacecraft ephemeris/attitude data, and produces calibrated instrument radiances and associated engineering and diagnostic data. This version makes the code more robust, improves calibration, provides more diagnostic outputs, defines the Galactic core more finely, and fixes the equator crossing. The Level 1 processing software manages several different tasks. It qualifies each data quantity using instrument configuration and checksum data, as well as data transmission quality flags. Statistical tests are applied for data quality and reasonableness. The instrument engineering data (e.g., voltages, currents, temperatures, and encoder angles) are calibrated by the software, and the filter channel space reference measurements are interpolated onto the times of each limb measurement, with the interpolated values being differenced from the measurements. Filter channel calibration target measurements are interpolated onto the times of each limb measurement and are used to compute radiometric gain. The total signal power is determined and analyzed by each digital autocorrelator spectrometer (DACS) during each data integration. The software converts each DACS data integration from an autocorrelation measurement in the time domain into a spectral measurement in the frequency domain, and separately estimates the spectrally smoothly-varying and spectrally averaged components of the limb port signal arising from antenna emission and scattering effects. Limb radiances are also calibrated.
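The autocorrelation-to-spectrum conversion rests on the Wiener-Khinchin relation: the power spectrum is the Fourier transform of the autocorrelation function. A minimal sketch with synthetic data follows; the lag count, sampling rate and test signal are illustrative, not MLS instrument values.

```python
# Sketch of the DACS conversion step: the power spectrum is obtained as the
# Fourier transform of the autocorrelation function (Wiener-Khinchin).
# Synthetic data stand in for instrument output.

import numpy as np

n, fs = 256, 1000.0                        # lags and sampling rate (Hz), illustrative
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 100 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

acf = np.correlate(x, x, mode="full")[x.size - 1 : x.size - 1 + n] / x.size
spectrum = np.abs(np.fft.rfft(acf))        # time-domain ACF -> frequency domain
freqs = np.fft.rfftfreq(n, d=1 / fs)
print(freqs[spectrum.argmax()])            # peak near the 100 Hz test tone
```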

  11. Dissociation between controlled and automatic processes in the behavioral variant of fronto-temporal dementia.

    Science.gov (United States)

    Collette, Fabienne; Van der Linden, Martial; Salmon, Eric

    2010-01-01

A decline of cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes, using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance, and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained lower performance than CS on the estimates of controlled processes, but no group difference was observed for the estimates of automatic processes. The between-groups comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, while a slightly larger contribution of controlled processes was observed in control subjects. These results clearly indicate an alteration of controlled memory processes in bv-FTD.
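The PDP estimates referred to here follow Jacoby's (1991) inclusion/exclusion logic: the controlled component C is the difference between inclusion and exclusion performance, and the automatic component A is recovered from the exclusion rate under an independence assumption. A small worked example (the probabilities are illustrative, not data from the study):

```python
# Standard Jacoby (1991) process-dissociation estimates, as a worked example.
# C = I - E;  A = E / (1 - C), assuming independence of the two processes.

def pdp_estimates(p_inclusion: float, p_exclusion: float):
    """Return (controlled, automatic) estimates from inclusion/exclusion rates."""
    c = p_inclusion - p_exclusion
    a = p_exclusion / (1 - c) if c < 1 else float("nan")
    return c, a

print(pdp_estimates(0.60, 0.25))  # -> C = 0.35, A ≈ 0.38 (illustrative values)
```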

  12. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  13. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  14. The role of supervisory controller synthesis in automatic control software development

    NARCIS (Netherlands)

    Baeten, J.C.M.; Markovski, J.

    2015-01-01

We give an overview of a model-driven systems engineering approach for high-tech systems that relies on supervisory controller synthesis. The proposed framework has a process-theoretic foundation and supports extensions with quantitative features. We briefly discuss several industrial case studies.

  15. Automatic Gap Detection in Friction Stir Welding Processes (Preprint)

    National Research Council Canada - National Science Library

    Yang, Yu; Kalya, Prabhanjana; Landers, Robert G; Krishnamurthy, K

    2006-01-01

    .... This paper develops a monitoring algorithm to detect gaps in Friction Stir Welding (FSW) processes. Experimental studies are conducted to determine how the process parameters and the gap width affect the welding process...

  16. The National Air Intelligence Center Software Process Improvement Effort (NAIC SPI)

    National Research Council Canada - National Science Library

    Blankenship, Donald

    2001-01-01

    ...) Software Process Improvements effort. The objective of this effort was for the contractor to provide engineering and software process improvement for NAIC/SCD to reach SEI's CMM Level 2 in process maturity...

  17. Automatic welding processes for reactor coolant pipes used in PWR type nuclear power plant

    International Nuclear Information System (INIS)

    Hamada, T.; Nakamura, A.; Nagura, Y.; Sakamoto, N.

    1979-01-01

The authors developed automatic welding processes (a submerged arc welding process and a TIG welding process) for application to the welding of reactor coolant pipes, which constitute the most important part of the PWR type nuclear power plant. The submerged arc welding process is suitable for flat-position welding, in which pipes can be rotated, while the TIG welding process is suitable for all-position welding. This paper gives an outline of the two processes and the results of tests performed using them. (author)

  18. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches while, at the same time, describing their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are actually performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive, from a configurable process model, a process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested both on business-like event logs, as recorded in a higher-education enterprise resource planning system, and in a real case scenario involving a set of Dutch municipalities.
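Of the three strategies, the greedy heuristic is the easiest to sketch: fix one configurable node at a time, keeping the option whose resulting partial configuration scores best against the event log. The fitness function, node names and options below are all assumed for illustration; they are not taken from the cited framework.

```python
# Hedged sketch of a greedy configuration strategy: choose each configurable
# node's option by maximizing a log-based fitness, one node at a time.

def greedy_configure(nodes, choices, fitness):
    """nodes: configurable node ids; choices: id -> options;
    fitness(config) -> float, higher means better fit to the event log."""
    config = {}
    for node in nodes:
        scored = [(fitness({**config, node: opt}), opt) for opt in choices[node]]
        config[node] = max(scored)[1]     # keep the best-scoring option
    return config

# Toy fitness: reward matching a hypothetical log-derived target configuration.
target = {"n1": "keep", "n2": "hide"}
fit = lambda cfg: sum(cfg.get(k) == v for k, v in target.items())
print(greedy_configure(["n1", "n2"],
                       {"n1": ["keep", "hide"], "n2": ["keep", "hide"]}, fit))
# -> {'n1': 'keep', 'n2': 'hide'}
```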

  19. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  20. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

The practice of statistical data processing for radiation monitoring is exemplified, and some of the results obtained are presented. Experience in the practical application of mathematical statistics methods to radiation monitoring data processing allowed a concrete statistical processing algorithm to be developed and realized on an M-6000 minicomputer. The suggested algorithm is divided into three parts: parametric data processing and hypothesis testing, pairwise correlation analysis, and multiple correlation analysis. The statistical processing programs operate in dialogue mode. The above algorithm was used to process data observed over a radioactive waste disposal control region. Results of the processing of surface water monitoring data are presented
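The three stages named here map onto routine library calls in a modern environment. A minimal sketch with synthetic monitoring data follows; the variables, reference level and sample size are invented for illustration and have no connection to the original M-6000 records.

```python
# Sketch of the three stages: parametric test, pairwise correlation,
# and multiple (here simple linear) regression, on synthetic data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
dose = rng.normal(0.12, 0.02, 50)            # uSv/h, illustrative
rainfall = rng.normal(2.0, 0.5, 50)          # mm/day, illustrative

t, p = stats.ttest_1samp(dose, 0.11)         # hypothesis test vs. a reference level
r, p_r = stats.pearsonr(dose, rainfall)      # pairwise correlation
slope = np.linalg.lstsq(np.c_[rainfall, np.ones(50)], dose, rcond=None)[0][0]
print(f"t={t:.2f} (p={p:.3f}), r={r:.2f}, slope={slope:.4f}")
```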

  1. Gamma camera image processing and graphical analysis mutual software system

    International Nuclear Information System (INIS)

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    GCCS, a gamma camera image processing and graphical analysis system, is a special interactive software system mainly used to analyse various patient data acquired from gamma cameras. The system runs on the IBM PC, PC/XT or PC/AT. It consists of several parts: system management, data management, device management, a program package and user programs. The system provides two kinds of user interface: command menus and command characters. It is easy to modify and extend this system because it is highly modularized. The user programs include almost all the clinical protocols in current use

  2. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    Science.gov (United States)

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.

  3. Gene-Auto: Automatic Software Code Generation for Real-Time Embedded Systems

    Science.gov (United States)

    Rugina, A.-E.; Thomas, D.; Olive, X.; Veran, G.

    2008-08-01

    This paper gives an overview of the Gene-Auto ITEA European project, which aims at building a qualified C code generator from mathematical models under Matlab-Simulink and Scilab-Scicos. The project is driven by major European industrial partners active in the real-time embedded systems domain. The Gene-Auto code generator will significantly improve current development processes in such domains by shortening the time to market and by guaranteeing the quality of the generated code through the use of formal methods. The first version of the Gene-Auto code generator has already been released and has gone through a validation phase on real-life case studies defined by each project partner. The validation results are taken into account in the implementation of the second version of the code generator. The partners aim at introducing the Gene-Auto results into industrial development by 2010.

  4. Process model for building quality software on internet time ...

    African Journals Online (AJOL)

    The competitive nature of the software construction market and the inherently exhilarating nature of software itself have hinged the success of any software development project on four major pillars: time to market, product quality, innovation and documentation. Unfortunately, however, existing software development models ...

  5. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomography (SPECT) images without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of the 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  6. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  7. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Science.gov (United States)

    Park, J. W.; Jeong, H. H.; Kim, J. S.; Choi, C. U.

    2016-06-01

    Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used the UAV together with remote controls connected to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, this RF-modem method is limited in long-distance communication. Using the smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi, a UAV equipped with the developed communication module system carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image-capturing device for the drone, used in areas that need image capturing, and software for loading and managing the smart camera. The system combines automatic shooting, driven by the smart camera's sensors, with shooting catalog management, which manages the captured images and their metadata. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open-source tools used included Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  8. Development of Open source-based automatic shooting and processing UAV imagery for Orthoimage Using Smart Camera UAV

    Directory of Open Access Journals (Sweden)

    J. W. Park

    2016-06-01

    Full Text Available Recently, aerial photography with unmanned aerial vehicle (UAV) systems has used the UAV together with remote controls connected to a ground control system over a radio frequency (RF) modem with a bandwidth of about 430 MHz. However, this RF-modem method is limited in long-distance communication. Using the smart camera's LTE (long-term evolution), Bluetooth, and Wi-Fi, a UAV equipped with the developed communication module system carried out close-range aerial photogrammetry with automatic shooting. The automatic shooting system consists of an image-capturing device for the drone, used in areas that need image capturing, and software for loading and managing the smart camera. The system combines automatic shooting, driven by the smart camera's sensors, with shooting catalog management, which manages the captured images and their metadata. The UAV imagery processing module used Open Drone Map. This study examined the feasibility of using the smart camera as the payload for a photogrammetric UAV system. The open-source tools used included Android, OpenCV (Open Computer Vision), RTKLIB, and Open Drone Map.

  9. Grid infrastructure for automatic processing of SAR data for flood applications

    Science.gov (United States)

    Kussul, Natalia; Skakun, Serhiy; Shelestov, Andrii

    2010-05-01

    More and more geoscience applications are being put onto Grids. Geoscience applications are complex: they involve complex workflows, computationally intensive environmental models, and the management and integration of heterogeneous data sets; Grid computing offers solutions to tackle these problems. Many geoscience applications, especially those related to disaster management and mitigation, require geospatial services to be delivered in a timely manner. For example, information on flooded areas should be provided to the corresponding organizations (local authorities, civil protection agencies, UN agencies, etc.) within 24 h so that the resources required to mitigate the disaster can be allocated effectively. Providing an infrastructure and services that enable the automatic generation of products based on the integration of heterogeneous data is therefore a task of great importance. In this paper we present a Grid infrastructure for the automatic processing of synthetic-aperture radar (SAR) satellite images to derive flood products. In particular, we use SAR data acquired by ESA's ENVISAT satellite, and neural networks to derive flood extent. The data are provided in operational mode from the ESA rolling archive (within an ESA Category-1 grant). We developed a portal that is based on the OpenLayers framework and provides an access point to the developed services. Through the portal the user can define a geographical region and search for the required data. Upon selection of data sets, a workflow is automatically generated and executed on the resources of the Grid infrastructure. For workflow execution and management we use the Karajan language. The workflow of SAR data processing consists of the following steps: image calibration, image orthorectification, image processing with neural networks, topographic effects removal, geocoding and transformation to lat/long projection, and visualisation. These steps are executed by different software, and can be
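
    The generated workflow described above is essentially a linear chain of processing steps; a minimal sketch under that assumption, with placeholder functions standing in for the external tools that perform each step:

```python
from functools import reduce

# Placeholder steps; each real step is carried out by dedicated software.
def calibrate(img): return img            # radiometric calibration
def orthorectify(img): return img         # orthorectification
def classify_flood(img): return img       # neural-network water classification
def remove_topography(img): return img    # topographic effects removal
def geocode(img): return img              # geocoding to lat/long projection

def run_workflow(image, steps):
    """Apply the workflow steps in order, as the generated workflow does."""
    return reduce(lambda img, step: step(img), steps, image)

steps = [calibrate, orthorectify, classify_flood, remove_topography, geocode]
# flood_map = run_workflow(sar_scene, steps)  # sar_scene: a loaded SAR image
```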

  10. Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Ternité, Thomas; Friedrich, Jan

    2017-01-01

    Process flexibility and adaptability is frequently discussed, and several proposals aim to improve software processes for a given organization-/project context. A software process line (SPrL) is an instrument to systematically construct and manage variable software processes by combining pre-defined process assets. We show how to construct flexible SPrLs and demonstrate the practical application of the approach in the German V-Modell XT. We contribute a proven approach that is presented as a metamodel fragment for reuse and implementation in further process modeling approaches. This summary refers to the paper "Flexible software process lines in practice: A metamodel-based approach to effectively construct and manage families of software process models" [Ku16], which was published as an original research article in the Journal of Systems and Software.

  11. SOFTWARE FOR IMAGE PROCESSING OF IRON-CARBONACEOUS ALLOY MICROSTRUCTURES

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2011-01-01

    Full Text Available The paper proposes a mathematical apparatus for image processing of pearlitic cast-iron microstructures with randomly distributed graphite inclusions. Software has been developed that determines statistical distribution functions of graphite inclusion characteristics: areas, perimeters and distances between inclusions. The paper shows that computer processing of gray cast-iron microstructure images makes it possible to classify, on the basis of the statistical distribution of the graphite phase, microstructures that are considered indiscernible by conventional metallographic methods; this has practical significance for investigating the «workability – cast iron microstructure» interrelation.
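
    A minimal sketch of how such per-inclusion statistics can be obtained from a binarized micrograph, assuming scikit-image is available; this illustrates the idea, not the authors' software:

```python
import numpy as np
from skimage import measure

def inclusion_statistics(binary_graphite):
    """Areas, perimeters and nearest-neighbour centroid distances of the
    graphite inclusions in a binary (inclusions = True) micrograph."""
    labels = measure.label(binary_graphite)
    props = measure.regionprops(labels)
    areas = np.array([p.area for p in props], dtype=float)
    perimeters = np.array([p.perimeter for p in props])
    if len(props) > 1:
        centroids = np.array([p.centroid for p in props])
        d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)     # ignore self-distances
        nearest = d.min(axis=1)
    else:
        nearest = np.array([])
    return areas, perimeters, nearest   # inputs to the distribution functions
```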

  12. Clinical evaluation of semi-automatic open-source algorithmic software segmentation of the mandibular bone: Practical feasibility and assessment of a new course of action.

    Science.gov (United States)

    Wallner, Jürgen; Hochegger, Kerstin; Chen, Xiaojun; Mischak, Irene; Reinbacher, Knut; Pau, Mauro; Zrnc, Tomislav; Schwenzer-Zimmerer, Katja; Zemann, Wolfgang; Schmalstieg, Dieter; Egger, Jan

    2018-01-01

    Computer-assisted technologies based on algorithmic software segmentation are a topic of increasing interest in complex surgical cases. However, due to functional instability, time-consuming software processes, personnel resources or license-based financial costs, many segmentation processes are often outsourced from clinical centers to third parties and the industry. Therefore, the aim of this trial was to assess the practical feasibility of an easily available, functionally stable and license-free segmentation approach to be used in clinical practice. In this retrospective, randomized, controlled trial the accuracy and accordance of the open-source segmentation algorithm GrowCut were assessed through comparison to the manually generated ground truth of the same anatomy, using 10 lower-jaw CT datasets from the clinical routine. Assessment parameters were the segmentation time, the volume, the voxel number, the Dice Score and the Hausdorff distance. Overall, semi-automatic GrowCut segmentation times were about one minute. Mean Dice Score values of over 85% and Hausdorff distances below 33.5 voxels could be achieved between the algorithmic GrowCut-based segmentations and the manually generated ground truth schemes. Statistical differences between the assessment parameters were not significant (p > 0.94) for any of the comparisons made between the two groups. Completely functionally stable and time-saving segmentations with high accuracy and high positive correlation could be performed by the presented interactive open-source approach. In the cranio-maxillofacial complex the method could represent an algorithmic alternative for image-based segmentation in clinical practice, e.g. for surgical treatment planning or visualization of postoperative results, and offers several advantages. Due to its open-source basis the method could be further developed by other groups or specialists. Systematic comparisons to other segmentation approaches or with a
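
    For reference, the two agreement metrics used above can be computed from binary segmentation masks as in the following sketch (a generic illustration using NumPy and SciPy, not the study's own code; the Hausdorff distance is taken over all foreground voxel coordinates):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice_score(a, b):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def hausdorff_voxels(a, b):
    """Symmetric Hausdorff distance in voxels between two binary masks."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])
```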

  13. Towards Automatic Capturing of Manual Data Processing Provenance

    NARCIS (Netherlands)

    Wombacher, Andreas; Huq, M.R.

    2011-01-01

    Often data processing is not implemented by a workflow system or an integration application but is performed manually by humans along the lines of a more or less specified procedure. Collecting provenance information during manual data processing cannot be automated. Further, manual collection of

  14. Model-based control of a fuel cell cooling circuit with automatic software testing; Modellbasierte Steuerung des Kuehlkreislaufes einer Brennstoffzelle mit automatisiertem Test der Software

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Sascha

    2012-07-01

    For the reconstruction and control of the volume flow in a cooling circuit, a fuel cell system is analyzed and physical models of the fluid temperature and pump volume flow are derived. On this basis, functional models for the derivation of software algorithms are presented, which enable model-based calculation of the volume flow in a fluid circuit and control of the coolant pump on the basis of detailed system modelling. The available functions enable complete regulation, control and diagnosis of the volume flow. The functional models, realized in Matlab/Simulink and intended for execution on a control unit, are discussed in the context of a software development process. The necessity of assuring the correctness of the functions is stressed, and a test automation system for functional models of this type is presented. (orig./AKB)

  15. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Full Text Available Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new university information system using the results of educational data analysis. One of the functions of such a system is the extraction of knowledge from the data accumulated during its operation. Creating a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. Since few systems have been developed with this methodology, a number of experiments were carried out in order to collect data, choose appropriate methods for the study, and interpret the results. As a result of these experiments, the authors identified the data sources available for analysis in the information environment of their home university: semester performance data obtained from the information system of the training department of the Institute of IT MTU MIREA, data obtained from the independent work of students, and data collected using specially designed Google Forms. To automate the collection of information and the analysis of educational data, an experimental software package was created. It was developed using the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI). The program implementation of the package is described in detail, and conclusions are drawn about the availability of the data sources used and about the prospects for further development.

  16. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David; Gereige, Issam; Gourgon, Cécile

    2013-01-01

    Nanoimprint Lithography (NIL) is based on a contact molding-demolding process that can produce defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications

  17. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad; Canini, Marco

    2017-01-01

    We present a framework for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing

  18. Measuring automatic retrieval: a comparison of implicit memory, process dissociation, and speeded response procedures.

    Science.gov (United States)

    Horton, Keith D; Wilson, Daryl E; Vonk, Jennifer; Kirby, Sarah L; Nielsen, Tina

    2005-07-01

    Using the stem completion task, we compared estimates of automatic retrieval from an implicit memory task, the process dissociation procedure, and the speeded response procedure. Two standard manipulations were employed. In Experiment 1, a depth of processing effect was found on automatic retrieval using the speeded response procedure although this effect was substantially reduced in Experiment 2 when lexical processing was required of all words. In Experiment 3, the speeded response procedure showed an advantage of full versus divided attention at study on automatic retrieval. An implicit condition showed parallel effects in each study, suggesting that implicit stem completion may normally provide a good estimate of automatic retrieval. Also, we replicated earlier findings from the process dissociation procedure, but estimates of automatic retrieval from this procedure were consistently lower than those from the speeded response procedure, except when conscious retrieval was relatively low. We discuss several factors that may contribute to the conflicting outcomes, including the evidence for theoretical assumptions and criterial task differences between implicit and explicit tests.
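
    For context, the process dissociation estimates referred to above follow Jacoby's (1991) equations, which assume independent conscious (C) and automatic (A) influences, with I and E the completion rates under inclusion and exclusion instructions:

```latex
I = C + A(1 - C), \qquad E = A(1 - C)
\quad\Longrightarrow\quad
C = I - E, \qquad A = \frac{E}{1 - C}
```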

  19. Process Concepts for Semi-automatic Dismantling of LCD Televisions

    OpenAIRE

    Elo, Kristofer; Sundin, Erik

    2014-01-01

    There is a large variety of electrical and electronic equipment products, for example liquid crystal display television sets (LCD TVs), in the waste stream today. Many LCD TVs contain mercury, which is a challenge to treat at recycling plants. Two currently used processes to recycle LCD TVs are automated shredding and manual disassembly. This paper aims to present concepts for semi-automated dismantling processes for LCD TVs in order to achieve higher productivity and flexibility, and in tu...

  20. Automatic system for processing the plasma radiation spectra

    International Nuclear Information System (INIS)

    Isakaev, Eh.Kh.; Markin, A.V.; Khajmin, V.A.; Chinnov, V.F.

    2001-01-01

    The problem addressed is that of providing computer processing of the experimental data produced by present-day data acquisition systems in plasma studies. Rather simple and reliable processing programs were elaborated. The system is used for quantitative plasma spectroscopy, the classical and most widely used method for analyzing the parameters and properties of low-temperature and high-temperature plasma [ru]

  1. Method for Processing Liver Spheroids Using an Automatic Tissue Processor

    Science.gov (United States)

    2016-05-01

    A method is described that is effective for processing and preparing microscopy slides of liver spheroids. The general process involves formalin fixation, dehydration in a series of alcohol solutions (70, 80, 95, and 100% ethanol in water), xylene treatment, and hot liquid wax infiltration. After the water in the tissue is replaced with wax and cooled, it becomes possible to cut sections. Materials include Dulbecco's phosphate-buffered saline (DPBS) and formalin (37% neutral buffered formaldehyde).

  2. IEEE Computer Society/Software Engineering Institute Software Process Achievement (SPA) Award 2009

    Science.gov (United States)

    2011-03-01

    capabilities to our GDM. We also introduced software as a service (SaaS) as part of our technology solutions and have further enhanced our ability to...

  3. Software EpiData - Applications for Needs of Biotechnological Processes

    Directory of Open Access Journals (Sweden)

    Ljakova K.

    2007-12-01

    Full Text Available EpiData (free software for entering and documenting data) is presented. Some aspects of this software are shown for the needs of database systems (DB) and information systems (IS) that can be used in bioprocess systems.

  4. Dialog system for automatic data input/output and processing with two BESM-6 computers

    International Nuclear Information System (INIS)

    Belyaev, Y.N.; Gorlov, Y.P.; Makarychev, S.V.; Monakov, A.A.; Shcherbakov, S.A.

    1985-01-01

    This paper presents a system for conducting experiments with fully automatic processing of data from multichannel recorders in dialog mode. The system acquires data at a rate of 2.5×10³ readings/s, processes them in real time, and outputs digital and graphical material in a multitasking environment

  5. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  6. Microsoft excel's automatic data processing and diagram drawing of RIA internal quality control parameters

    International Nuclear Information System (INIS)

    Zeng Pingfan; Liu Guoqiang

    2006-01-01

    We performed automatic data processing and diagram drawing for various parameters of RIA internal quality control (IQC) using Microsoft Excel (ME). Using the AVERAGE and STDEV functions of ME, we obtained the mean (x-bar), s and CV%. With PEARSON, we obtained the serum quality control coefficients (r). By entering the original data into the diagram's self-definition item, the diagram was drawn automatically. Using logical tests, we obtained the quality control judgments, with the status, timing and data of the various quality control parameters. Over the past four years, the ME data processing, diagram drawing and quality control judging have been shown to be accurate, convenient and correct. The method is quick, easy to manage, and realizes automatic computer processing of RIA IQC. Conclusion: the method is applicable to all types of RIA IQC. (authors)
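
    The quantities computed with AVERAGE, STDEV and the derived coefficient of variation are the standard ones:

```latex
\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad
s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2}, \qquad
\mathrm{CV}\% = \frac{s}{\bar{x}} \times 100
```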

  7. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  8. Automatic processing of isotopic dilution curves obtained by precordial detection

    International Nuclear Information System (INIS)

    Verite, J.C.

    1973-01-01

    Dilution curves pose two distinct problems: their acquisition and their processing. A study devoted only to the latter aspect is presented. Two important conditions had to be satisfied: the treatment procedure, although applied to a single category of curves (isotopic dilution curves obtained by precordial detection), had to be as general as possible; and, to allow dissemination of the method, the equipment used had to be relatively modest and inexpensive. A simple method, treating curve processing as a process identification problem, was developed; it should enable the mean heart cavity volume and certain pulmonary circulation parameters to be determined. Considerable difficulties were encountered, limiting the value of the results obtained though not condemning the method itself. The curve-processing question raises the problem of curve acquisition, i.e. the number of these curves and their meaning. A list of the difficulties encountered is followed by a set of possible solutions, a solution being understood to mean a curve-processing combination in which the overlap between the two aspects of the problem is accounted for [fr]

  9. Image fusion between whole body FDG PET images and whole body MRI images using a full-automatic mutual information-based multimodality image registration software

    International Nuclear Information System (INIS)

    Uchida, Yoshitaka; Nakano, Yoshitada; Fujibuchi, Toshiou; Isobe, Tomoko; Kazama, Toshiki; Ito, Hisao

    2006-01-01

    We attempted image fusion between whole-body PET and whole-body MRI of thirty patients using fully automatic mutual information (MI)-based multimodality image registration software, and evaluated the accuracy of this method and the impact of the coregistered imaging on diagnostic accuracy. For 25 of the 30 fused images in the body area, translation gaps were within 6 mm along all axes and rotation gaps were within 2 degrees around all axes. In the head and neck area, considerable gaps caused by differences in head inclination during imaging occurred in 16 patients; however, these gaps could be reduced by fusing the head and neck separately. In 6 patients, diagnostic accuracy using the PET/MRI fused images was superior to that using PET images alone. This work shows that whole-body FDG PET images and whole-body MRI images can be fused automatically and accurately using MI-based multimodality image registration software, and that this technique can add useful information when evaluating FDG PET images. (author)
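
    The similarity measure such registration software maximizes can be illustrated with a short sketch computing the mutual information of two aligned images from their joint histogram (a generic illustration, not the software used in the study):

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """MI of two equally shaped images, estimated from a joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                  # joint probability
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)  # marginals
    nz = pxy > 0                               # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))
```

    A registration routine would search over rigid transforms of one image and keep the transform with the highest mutual information.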

  10. Software architecture for intelligent image processing using Prolog

    Science.gov (United States)

    Jones, Andrew C.; Batchelor, Bruce G.

    1994-10-01

    We describe a prototype system for interactive image processing using Prolog, implemented by the first author on an Apple Macintosh computer. This system is inspired by Prolog+, but differs from it in two particularly important respects. The first is that whereas Prolog+ assumes the availability of dedicated image processing hardware, with which the Prolog system communicates, our present system implements image processing functions in software using the C programming language. The second difference is that although our present system supports Prolog+ commands, these are implemented in terms of lower-level Prolog predicates which provide a more flexible approach to image manipulation. We discuss the impact of the Apple Macintosh operating system upon the implementation of the image-processing functions, and the interface between these functions and the Prolog system. We also explain how the Prolog+ commands have been implemented. The system described in this paper is a fairly early prototype, and we outline how we intend to develop the system, a task which is expedited by the extensible architecture we have implemented.

  11. Application of parallel processing for automatic inspection of printed circuits

    International Nuclear Information System (INIS)

    Lougheed, R.M.

    1986-01-01

    Automated visual inspection of printed electronic circuits is a challenging application for image processing systems. Detailed inspection requires high speed analysis of gray scale imagery along with high quality optics, lighting, and sensing equipment. A prototype system has been developed and demonstrated at the Environmental Research Institute of Michigan (ERIM) for inspection of multilayer thick-film circuits. The central problem of real-time image processing is solved by a special-purpose parallel processor which includes a new high-speed Cytocomputer. In this chapter the inspection process and the algorithms used are summarized, along with the functional requirements of the machine vision system. Next, the parallel processor is described in detail and then performance on this application is given

  12. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that, thanks to the automated system, the quality of data processing is improved, a high level of patient examination is provided, young specialists are trained more quickly, and conditions are created for the continuing education of physicians through the analysis of their own activity. At big hospitals, a complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank is the most promising form of computerization.

  13. Delivering Software Process-Specific Project Courses in Tertiary Education Environment: Challenges and Solution

    Science.gov (United States)

    Rong, Guoping; Shao, Dong

    2012-01-01

    The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…

  14. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT.

    Science.gov (United States)

    Scholtz, Jan-Erik; Wichmann, Julian L; Kaup, Moritz; Fischer, Sebastian; Kerl, J Matthias; Lehnert, Thomas; Vogl, Thomas J; Bauer, Ralf W

    2015-03-01

    To evaluate software for automatic segmentation, labeling and reformation of anatomically aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. 77 patients (28 women, 49 men, mean age 65.3±14.4 years) with known or suspected spinal disorders (degenerative spine disease n=32; disc herniation n=36; traumatic vertebral fractures n=9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomically aligned axial images. Reformatted images from both reconstruction methods were assessed by two observers regarding the accuracy of symmetric depiction of anatomical structures. In 33 cases double-angulated axial images were created for 1 vertebra, in 28 cases for 2 vertebrae and in 16 cases for 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min on average. In cases where anatomically aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time-saving when reconstructions of 2 and more vertebrae are performed. Checking the results of automatic labeling is necessary to prevent labeling errors. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. The Development from Effortful to Automatic Processing in Mathematical Cognition.

    Science.gov (United States)

    Kaye, Daniel B.; And Others

    This investigation capitalizes upon information processing models that depend upon measurement of the latency of response to a mathematical problem and the decomposition of reaction time (RT). Simple two-term addition problems were presented with possible solutions for true-false verification, and accuracy and RT were recorded. Total…

  16. Identification and structuring of data for automatic processing

    International Nuclear Information System (INIS)

    Wohland, H.; Rexer, G.; Ruehle, R.

    1976-01-01

    The data structure of a technical and scientific application system is described. The description of the structure is divided into different sections in which the user can describe his own data. By fixing a section of this structure, a high degree of automation of the problem-solving process can be achieved while preserving flexibility. (orig.) [de]

  17. Process for automatic filling of nuclear fuel rod cans

    International Nuclear Information System (INIS)

    Bezold, H.

    1977-01-01

    A drying section is inserted in the production line to automate the process of filling fuel rods with nuclear fuel pellets. The pellets are carried in a drum magazine to a drying furnace and then pushed out, one after the other, into the can to be filled. (TK) [de]

  18. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.

  19. First performance evaluation of software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine at CT

    Energy Technology Data Exchange (ETDEWEB)

    Scholtz, Jan-Erik, E-mail: janerikscholtz@gmail.com; Wichmann, Julian L.; Kaup, Moritz; Fischer, Sebastian; Kerl, J. Matthias; Lehnert, Thomas; Vogl, Thomas J.; Bauer, Ralf W.

    2015-03-15

    Highlights: •Automatic segmentation and labeling of the thoracolumbar spine. •Automatically generated double-angulated and aligned axial images of spine segments. •High grade of accurateness for the symmetric depiction of anatomical structures. •Time-saving and may improve workflow in daily practice. -- Abstract: Objectives: To evaluate software for automatic segmentation, labeling and reformation of anatomical aligned axial images of the thoracolumbar spine on CT in terms of accuracy, potential for time savings and workflow improvement. Material and methods: 77 patients (28 women, 49 men, mean age 65.3 ± 14.4 years) with known or suspected spinal disorders (degenerative spine disease n = 32; disc herniation n = 36; traumatic vertebral fractures n = 9) underwent 64-slice MDCT with thin-slab reconstruction. Time for automatic labeling of the thoracolumbar spine and reconstruction of double-angulated axial images of the pathological vertebrae was compared with manually performed reconstruction of anatomical aligned axial images. Reformatted images of both reconstruction methods were assessed by two observers regarding accuracy of symmetric depiction of anatomical structures. Results: In 33 cases double-angulated axial images were created in 1 vertebra, in 28 cases in 2 vertebrae and in 16 cases in 3 vertebrae. Correct automatic labeling was achieved in 72 of 77 patients (93.5%). Errors could be manually corrected in 4 cases. Automatic labeling required 1 min in average. In cases where anatomical aligned axial images of 1 vertebra were created, reconstructions made by hand were significantly faster (p < 0.05). Automatic reconstruction was time-saving in cases of 2 and more vertebrae (p < 0.05). Both reconstruction methods revealed good image quality with excellent inter-observer agreement. Conclusion: The evaluated software for automatic labeling and anatomically aligned, double-angulated axial image reconstruction of the thoracolumbar spine on CT is time

  20. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. The technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device which captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of the defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.
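
    A minimal sketch of the numerical detection idea, comparing a normalized micrograph against a defect-free reference and labeling deviating regions; the threshold and the availability of a reference image are assumptions, not details from the paper:

```python
import numpy as np
from scipy import ndimage

def detect_defects(image, reference, threshold=0.2):
    """Count defect regions and return their areas (in pixels)."""
    img = (image - image.min()) / (np.ptp(image) or 1.0)   # normalize to [0, 1]
    ref = (reference - reference.min()) / (np.ptp(reference) or 1.0)
    defect_mask = np.abs(img - ref) > threshold            # deviating pixels
    labels, n = ndimage.label(defect_mask)                 # connected regions
    areas = ndimage.sum(defect_mask, labels, index=range(1, n + 1))
    return n, areas   # inputs for the statistical analysis of defects
```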

  1. Automatic Optimization of Hardware Accelerators for Image Processing

    OpenAIRE

    Reiche, Oliver; Häublein, Konrad; Reichenbach, Marc; Hannig, Frank; Teich, Jürgen; Fey, Dietmar

    2015-01-01

    In the domain of image processing, often real-time constraints are required. In particular, in safety-critical applications, such as X-ray computed tomography in medical imaging or advanced driver assistance systems in the automotive domain, timing is of utmost importance. A common approach to maintain real-time capabilities of compute-intensive applications is to offload those computations to dedicated accelerator hardware, such as Field Programmable Gate Arrays (FPGAs). Programming such arc...

  2. The Structure of Processing Resource Demands in Monitoring Automatic Systems.

    Science.gov (United States)

    1981-01-01

    Attempts at modelling the human failure detection process have continually focused on normative predictions of optimal operator behavior (Smallwood, ...). From Broadbent's filter model (Broadbent, 1957), to Treisman's attenuation model (Treisman, 1964), to Norman's late selection model (Norman, 1968), the concept...

  3. Validation of a semi-automatic protocol for the assessment of the tear meniscus central area based on open-source software

    Science.gov (United States)

    Pena-Verdeal, Hugo; Garcia-Resua, Carlos; Yebra-Pimentel, Eva; Giraldez, Maria J.

    2017-08-01

    Purpose: Different lower tear meniscus parameters can be clinically assessed in dry eye diagnosis. The aim of this study was to propose and analyse the variability of a semi-automatic method for measuring the lower tear meniscus central area (TMCA) using open-source software. Material and methods: In a group of 105 subjects, one video of the lower tear meniscus after fluorescein instillation was generated by a digital camera attached to a slit lamp. A short light beam (3x5 mm) with moderate illumination in the central portion of the meniscus (6 o'clock) was used. Images were extracted from each video by a masked observer. Using open-source software based on Java (NIH ImageJ), a further observer measured, in masked and randomized order, the TMCA in the area illuminated by the short light beam, by two methods: (1) a manual method, where the TMCA in the images was measured by hand; (2) a semi-automatic method, where the TMCA images were converted to 8-bit binary images, holes inside the resulting shape were filled, and the area of the isolated shape was obtained. Finally, the manual and semi-automatic measurements were compared. Results: A paired t-test showed no statistical difference between the results of the two techniques (p = 0.102). The Pearson correlation between techniques showed a significant, near-perfect positive correlation (r = 0.99, p < 0.001). Conclusions: This study presented a useful tool for objectively measuring the frontal central area of the meniscus in photographs with free open-source software.
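
    The semi-automatic steps described above (binarization, hole filling, area measurement) can be mirrored in a short sketch; the threshold and the pixel-to-millimetre scale are calibration inputs assumed here, and this is an illustration rather than the ImageJ workflow itself:

```python
import numpy as np
from scipy import ndimage

def meniscus_area_mm2(gray, threshold, mm_per_px):
    """TMCA from a grayscale frame: binarize, fill holes, convert px -> mm^2."""
    binary = gray > threshold                   # 8-bit image -> binary image
    filled = ndimage.binary_fill_holes(binary)  # fill holes inside the shape
    labels, n = ndimage.label(filled)
    if n == 0:
        return 0.0
    sizes = ndimage.sum(filled, labels, index=range(1, n + 1))
    return float(sizes.max()) * mm_per_px ** 2  # largest region = meniscus
```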

  4. Modeling of ultrasonic processes utilizing a generic software framework

    Science.gov (United States)

    Bruns, P.; Twiefel, J.; Wallaschek, J.

    2017-06-01

    Modeling of ultrasonic processes is typically characterized by a high degree of complexity. Different domains and size scales must be regarded, so it is rather difficult to build up a single detailed overall model. Developing partial models is a common approach to overcome this difficulty. In this paper a generic but simple software framework is presented which allows arbitrary partial models to be coupled via slave modules with well-defined interfaces and a master module for coordination. Two examples are given to present the developed framework. The first one is the parameterization of a load model for ultrasonically-induced cavitation. The piezoelectric oscillator, its mounting, and the process load are described individually by partial models. These partial models are then coupled using the framework. The load model is composed of spring-damper elements which are parameterized by experimental results. In the second example, the ideal mounting position for an oscillator utilized in ultrasonic-assisted machining of stone is determined. Partial models for the ultrasonic oscillator, its mounting, the simplified contact process, and the workpiece's material characteristics are presented. For both applications input and output variables are defined to meet the requirements of the framework's interface.
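
    The master/slave coupling idea can be sketched in a few lines; the interface names below are illustrative assumptions, not the authors' actual framework:

```python
from abc import ABC, abstractmethod

class SlaveModule(ABC):
    """A partial model exposing a well-defined step interface."""
    @abstractmethod
    def step(self, state: dict, dt: float) -> dict: ...

class Master:
    """Coordinates the partial models and exchanges coupling variables."""
    def __init__(self, slaves: list[SlaveModule]):
        self.slaves = slaves

    def simulate(self, state: dict, dt: float, n_steps: int) -> dict:
        for _ in range(n_steps):
            for slave in self.slaves:
                state.update(slave.step(state, dt))  # pass outputs downstream
        return state
```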

  5. AR2, a novel automatic muscle artifact reduction software method for ictal EEG interpretation: Validation and comparison of performance with commercially available software [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Shennan Aibel Weiss

    2017-04-01

    Full Text Available Objective: To develop a novel software method (AR2) for reducing muscle contamination of ictal scalp electroencephalogram (EEG), and to validate this method on the basis of its performance, in comparison to a commercially available software method (AR1), in accurately depicting seizure-onset location. Methods: A blinded investigation used 23 EEG recordings of seizures from 8 patients. Each recording was uninterpretable with digital filtering because of muscle artifact; each was processed using AR1 and AR2 and reviewed by 26 EEG specialists. EEG readers assessed seizure-onset time, lateralization, and region, and specified their confidence for each determination. The two methods were validated on the basis of the number of readers able to render assignments, confidence, the intra-class correlation (ICC), and agreement with other clinical findings. Results: Among the 23 seizures, two-thirds of the readers were able to delineate seizure-onset time in 10 of 23 using AR1, and in 15 of 23 using AR2 (p<0.01). Fewer readers could lateralize seizure-onset (p<0.05). The confidence measures of the assignments were low (probable-unlikely), but increased using AR2 (p<0.05). The ICC for identifying the time of seizure-onset was 0.15 (95% confidence interval (CI) 0.11-0.18) using AR1 and 0.26 (95% CI 0.21-0.30) using AR2. The EEG interpretations were often consistent with behavioral, neurophysiological, and neuro-radiological findings, with left-sided assignments correct in 95.9% (CI 85.7-98.9%, n=4) of cases using AR2, and 91.9% (CI 77.0-97.5%, n=4) of cases using AR1. Conclusions: EEG artifact reduction methods for localizing seizure-onset do not result in high rates of interpretability, reader confidence, and inter-reader agreement. However, the assignments by groups of readers are often congruent with other clinical data. Utilization of the AR2 software method may improve the validity of ictal EEG artifact reduction.

  6. Marcoule pilot work-room: process automatic operation

    International Nuclear Information System (INIS)

    Mus, G.; Linger, C.

    1987-01-01

    Commissioned in the early 1960s, the Marcoule Pilot Plant has undergone a series of sweeping transformations. The research and development resources concerning irradiated fuel processing have been expanded and modified, and the reprocessing capacity has been raised from 2 to 5 t/year. Simultaneously, the installation's control system was completely remodelled. The control consoles, which were previously positioned locally near the different units, have been grouped together in a centralized control room. To do this, the measurement and operating circuits were replaced by new data acquisition and processing systems requiring the use of numerical algorithms. The management and control of certain units, including mechanical fuel preparation, sampling, and sample transport to the laboratories, have been entrusted to programmable automata. Certain unit operations, such as concentration by evaporation, are set up with complete automation. These new arrangements will expand the resources for analysing the operation of the Pilot Plant, while offering a more global view of the operations. They have been made possible by a major effort in the development of sensors, and represent the indispensable prerequisite for the installation of expert systems [fr]

  7. Evaluation of an atlas-based automatic segmentation software for the delineation of brain organs at risk in a radiation therapy clinical context

    International Nuclear Information System (INIS)

    Isambert, Aurelie; Dhermain, Frederic; Bidault, Francois; Commowick, Olivier; Bondiau, Pierre-Yves; Malandain, Gregoire; Lefkopoulos, Dimitri

    2008-01-01

    Background and purpose: Conformal radiation therapy techniques require the delineation of volumes of interest, a time-consuming and operator-dependent task. In this work, we aimed to evaluate the potential interest of an atlas-based automatic segmentation software (ABAS) of brain organs at risk (OAR), when used under our clinical conditions. Materials and methods: Automatic and manual segmentations of the eyes, optic nerves, optic chiasm, pituitary gland, brain stem and cerebellum of 11 patients on T1-weighted magnetic resonance, 3-mm thick slice images were compared using the Dice similarity coefficient (DSC). The sensitivity and specificity of the ABAS were also computed and analysed from a radiotherapy point of view by splitting the ROC (Receiver Operating Characteristic) space into four sub-regions. Results: Automatic segmentation of OAR was achieved in 7-8 min. Excellent agreement was obtained between automatic and manual delineations for organs exceeding 7 cm³: the DSC was greater than 0.8. For smaller structures, the DSC was lower than 0.41. Conclusions: These tests demonstrated that this ABAS is a robust and reliable tool for automatic delineation of large structures under clinical conditions in our daily practice, even though the small structures must continue to be delineated manually by an expert

  8. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user-interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Application (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
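
    A minimal Python sketch of the same batch-parsing idea (ExcelAutomat itself is VBA inside a spreadsheet): the directory layout is hypothetical, while the "SCF Done:" marker is the energy line found in Gaussian output files:

```python
import glob
import re

# Collect the last SCF energy from every Gaussian log file in a job folder.
energies = {}
for path in sorted(glob.glob("jobs/*.log")):       # hypothetical layout
    last = None
    with open(path) as fh:
        for line in fh:
            if "SCF Done:" in line:                # Gaussian energy line
                m = re.search(r"=\s*(-?\d+\.\d+)", line)
                if m:
                    last = float(m.group(1))
    energies[path] = last

for path, energy in energies.items():              # spreadsheet-ready rows
    print(f"{path}\t{energy}")
```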

  9. SraTailor: graphical user interface software for processing and visualizing ChIP-seq data.

    Science.gov (United States)

    Oki, Shinya; Maehara, Kazumitsu; Ohkawa, Yasuyuki; Meno, Chikara

    2014-12-01

    Raw data from ChIP-seq (chromatin immunoprecipitation combined with massively parallel DNA sequencing) experiments are deposited in public databases as SRAs (Sequence Read Archives) that are publicly available to all researchers. However, to graphically visualize ChIP-seq data of interest, the corresponding SRAs must be downloaded and converted into BigWig format, a process that involves complicated command-line processing. This task requires users to possess skill with script languages and sequence data processing, a requirement that prevents a wide range of biologists from exploiting SRAs. To address these challenges, we developed SraTailor, a GUI (Graphical User Interface) software package that automatically converts an SRA into a BigWig-formatted file. Simplicity of use is one of the most notable features of SraTailor: entering an accession number of an SRA and clicking the mouse are the only steps required to obtain BigWig-formatted files and to graphically visualize the extents of reads at given loci. SraTailor is also able to make peak calls, generate files of other formats, process users' own data, and accept various command-line-like options. Therefore, this software makes ChIP-seq data fully exploitable by a wide range of biologists. SraTailor is freely available at http://www.devbio.med.kyushu-u.ac.jp/sra_tailor/, and runs on both Mac and Windows machines. © 2014 The Authors Genes to Cells © 2014 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.
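
    One plausible version of the command-line chain that such a tool hides from the user, sketched with subprocess; the tool names come from sra-tools, bowtie2, samtools and deepTools, while the accession and index path are placeholders, and the actual pipeline inside SraTailor may differ:

```python
import subprocess

acc, index = "SRR000001", "genome_index"   # placeholder accession and index

subprocess.run(["prefetch", acc], check=True)                      # fetch SRA
subprocess.run(["fasterq-dump", acc], check=True)                  # SRA -> FASTQ
subprocess.run(["bowtie2", "-x", index, "-U", f"{acc}.fastq",
                "-S", f"{acc}.sam"], check=True)                   # align reads
subprocess.run(["samtools", "sort", "-o", f"{acc}.bam",
                f"{acc}.sam"], check=True)                         # sort to BAM
subprocess.run(["samtools", "index", f"{acc}.bam"], check=True)
subprocess.run(["bamCoverage", "-b", f"{acc}.bam",
                "-o", f"{acc}.bw"], check=True)                    # BAM -> BigWig
```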

  10. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad

    2017-09-27

    Optimizing the performance of big-data streaming applications has become a daunting and time-consuming task: parameters may be tuned from a space of hundreds or even thousands of possible configurations. In this paper, we present a framework for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing three benchmark applications in Apache Storm. Our results show that a hill-climbing algorithm that uses a new heuristic sampling approach based on Latin Hypercube provides the best results. Our gray-box algorithm provides comparable results while being two to five times faster.
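
    To make the search strategy concrete, the sketch below pairs Latin Hypercube sampling of starting configurations with a simple hill climb. The two-parameter cost function is a toy stand-in for "run the Storm benchmark with this configuration"; the parameter bounds, step size and iteration counts are illustrative, not values from the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        def latin_hypercube(n_samples, bounds):
            # one point per stratum in each dimension, randomly paired
            d = len(bounds)
            strata = np.tile(np.arange(n_samples), (d, 1))
            u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, d))) / n_samples
            lo, hi = np.array(bounds).T
            return lo + u * (hi - lo)

        def hill_climb(cost, x0, step, n_iter=50):
            x, best = x0.copy(), cost(x0)
            for _ in range(n_iter):
                cand = x + rng.normal(scale=step, size=x.shape)
                c = cost(cand)
                if c < best:
                    x, best = cand, c
            return x, best

        cost = lambda p: (p[0] - 8.0) ** 2 + (p[1] - 0.5) ** 2
        starts = latin_hypercube(5, [(1.0, 16.0), (0.0, 1.0)])  # e.g. workers, buffer ratio
        print(min((hill_climb(cost, s, step=0.5) for s in starts), key=lambda r: r[1]))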

  11. Adaptive Automatic Gauge Control of a Cold Strip Rolling Process

    Directory of Open Access Journals (Sweden)

    ROMAN, N.

    2010-02-01

    The paper tackles the thickness control structure of cold-rolled strips. This structure is based on the roll-position control of a reversible quarto rolling mill. The main feature of the system proposed in the paper is the compensation of the errors introduced by the deficient dynamics of the hydraulic servo-system used for roll positioning, by means of a dynamic compensator that approximates the inverse of the servo-system. Because the servo-system is considered time-varying, on-line identification of the servo-system and parameter adaptation of the compensator are performed. The results obtained by numerical simulation are presented together with data taken from the real process. These results illustrate the efficiency of the proposed solutions.
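
    The idea of pairing on-line identification with an approximate inverse can be sketched numerically. The discrete first-order servo model below is an assumption made for illustration (the paper gives no such model); recursive least squares tracks the two model parameters while the compensator inverts the current estimate to pre-shape the position command.

        import numpy as np

        a_true, b_true = 0.92, 0.08      # "real" servo: y[k+1] = a*y[k] + b*u[k]

        theta = np.array([0.5, 0.5])     # RLS estimate of [a, b]
        P = np.eye(2) * 100.0
        lam = 0.98                       # forgetting factor for the time-varying plant

        y, r = 0.0, 1.0                  # current roll position and its reference
        for _ in range(200):
            a_hat, b_hat = theta
            u = (r - a_hat * y) / max(b_hat, 1e-3)   # inverse-model control law
            y_next = a_true * y + b_true * u          # plant response
            phi = np.array([y, u])                    # regressor for the RLS update
            K = P @ phi / (lam + phi @ P @ phi)
            theta = theta + K * (y_next - phi @ theta)
            P = (P - np.outer(K, phi @ P)) / lam
            y = y_next
        print(theta, y)   # estimates approach (0.92, 0.08); y tracks the reference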

  12. METHOD FOR SECURITY SPECIFICATION SOFTWARE REQUIREMENTS AS A MEANS FOR IMPLEMENTING A SOFTWARE DEVELOPMENT PROCESS SECURE - MERSEC

    Directory of Open Access Journals (Sweden)

    Castro Mecías, L.T.

    2015-06-01

    Security incidents frequently target software, or use it as a means of attack, causing serious damage with legal and economic consequences. The results of a survey by Kaspersky Lab reflect that vulnerabilities in software are the main cause of security incidents in enterprises: the report shows that 85% of the companies surveyed have reported security incidents, and it further estimates that incidents can cause significant losses of $50,000 to $649,000 (1). In this regard, academic and industry research focuses on proposals for reducing vulnerabilities and failures in technology, with a positive influence on how software is developed. A development process improves safety practices when it includes security activities from the initial phases of the software life cycle, so that security needs are identified, risks are managed and appropriate measures are implemented. This article discusses a method for the analysis, acquisition and specification of software security requirements, built on the basis of various existing proposals and of deficiencies identified through participant observation in software development teams. Experiments performed using the proposed method yield positive results regarding the reduction of security vulnerabilities and compliance with the security objectives of the software.

  13. Automatical and accurate segmentation of cerebral tissues in fMRI dataset with combination of image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in medical science. One application is multimodality imaging, especially the fusion of structural imaging with functional imaging, which includes CT, MRI and new imaging technologies such as optical imaging. The fusion process requires precisely extracted structural information in order to register the functional image to it. Here we used image enhancement and morphometry methods to extract accurate contours of different tissues such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM) on 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of the images in a deep learning way. This approach greatly reduced the processing time compared to manual and semi-automatic segmentation, and is of great importance in improving speed and accuracy as more and more samples are learned. The contours of the borders of the different tissues on all images were accurately extracted and 3D visualized. This can be used in low-level light therapy and in optical simulation software such as MCVM. We obtained a precise three-dimensional distribution of the brain, which offers doctors and researchers quantitative volume data and detailed morphological characterization for personalized precision medicine of cerebral atrophy/expansion. We hope this technique can bring convenience to medical visualization and personalized medicine.
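
    The deep-learning stage can be pictured as a per-pixel classifier over tissue labels. The PyTorch sketch below is a minimal illustration of that idea, not the authors' network: a tiny convolutional model maps one grayscale slice to one of four tissue classes (e.g. CSF, GM, WM, skull) per pixel.

        import torch
        import torch.nn as nn

        class TissueSegNet(nn.Module):
            def __init__(self, n_classes=4):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                )
                self.classifier = nn.Conv2d(32, n_classes, 1)  # per-pixel class scores

            def forward(self, x):
                return self.classifier(self.features(x))

        model = TissueSegNet()
        x = torch.randn(1, 1, 128, 128)                       # one grayscale slice
        logits = model(x)                                     # (1, 4, 128, 128)
        target = torch.zeros(1, 128, 128, dtype=torch.long)   # dummy label map
        loss = nn.CrossEntropyLoss()(logits, target)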

  14. Software Sensors Design for a Class of Aerobic Fermentation Processes

    Directory of Open Access Journals (Sweden)

    Trayana Patarinska

    2010-08-01

    The problem of on-line state and parameter estimation (software sensor design) for a class of aerobic fermentation processes for metabolite product formation is considered. The class is characterized by two limiting substrates, one of which, a growth factor, is practically depleted during biomass growth, when product formation is negligible, and by a corresponding general reaction scheme: a qualitative description of the main metabolic reactions between the main components in the liquid phase (biomass, substrate, product and dissolved oxygen concentrations). Two separate sensors, a state estimator and a parameter estimator, are designed. The state estimator is developed based on knowledge of only one on-line measurable variable, the dissolved oxygen concentration, with the yield factors assumed to be constant coefficients. The parameter estimator for the specific reaction rates is developed under the assumption that all the process variables are known on-line, by measurement or estimation. The yield factors are also estimated, as non-stationary parameters, thus creating a basis for comparison with the specified constant values used in the state estimator design. As a case study, industrial lysine fermentation in fed-batch mode of operation is considered. Simulation investigations under different operating conditions are carried out in order to highlight the performance of the proposed sensors.

  15. The software improvement process - tools and rules to encourage quality

    International Nuclear Information System (INIS)

    Sigerud, K.; Baggiolini, V.

    2012-01-01

    The Applications section of the CERN accelerator controls group has decided to apply a systematic approach to quality assurance (QA), the 'Software Improvement Process' - SIP. This process focuses on three areas: the development process itself, suitable QA tools, and how to practically encourage developers to do QA. For each stage of the development process we have agreed on the recommended activities and deliverables, and identified tools to automate and support the task. For example, we do more code reviews. As peer reviews are resource intensive, we only do them for complex parts of a product. As a complement, we are using static code checking tools, like FindBugs and Checkstyle. We also encourage unit testing and have agreed on a minimum level of test coverage recommended for all products, measured using Clover. Each of these tools is well integrated with our IDE (Eclipse) and gives instant feedback to the developer about the quality of the code. The major challenges of SIP have been 1) to agree on common standards and configurations, for example common code formatting and Javadoc documentation guidelines, and 2) to encourage the developers to do QA. To address the second point, we have successfully implemented 'SIP days', i.e. one day dedicated to QA work in which the whole group of developers participates, and 'Top/Flop' lists, clearly indicating the best and worst products with regard to SIP guidelines and standards, for example test coverage. This paper presents the SIP initiative in more detail, summarizing our experience over the past two years and our future plans. (authors)

  16. Image processing. A system for the automatic sorting of chromosomes; Traitement d'images - Applications au classement des chromosomes

    Energy Technology Data Exchange (ETDEWEB)

    Najai, Amor

    1977-05-27

    The present paper deals with two aspects of the system: (1) an automaton (specialized hardware) dedicated to image processing, the A.S.T.I. (Automate Specialise de Traitement d'Images), which digitizes the images, isolates sub-images and carries out computations on their main parameters; and (2) software for the automatic recognition and sorting of chromosomes, implemented on a Multi-20 minicomputer connected to the automaton. (author)

  17. Targets, drivers and metrics in software process improvement: results of a survey in a multinational organization

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Genuchten, van M.J.I.M.; Aerts, H.

    2007-01-01

    This paper reports on a survey amongst software groups in a multinational organization. The survey was initiated by the Software Process Improvement (SPI) Steering Committee of Philips, a committee that monitors the status and quality of software process improvement in the global organization. The

  18. Automatic Methods in Image Processing and Their Relevance to Map-Making.

    Science.gov (United States)

    1981-02-11

    Recoverable fragment of the OCR-damaged record: "... (folding frequency = .5) and s is the 'shaping factor' which controls the spatial frequency content of the signal; the signal bandwidth increases ..." Source: Arizona Univ Tucson, Digital Image Analysis Lab; Hunt; Feb 1981.

  19. Automatic/Control Processing Concepts and Their Implications for the Training of Skills.

    Science.gov (United States)

    1982-04-01

    Recoverable fragment of the OCR-damaged record: "... driving a car are examples of automatic processes. Controlled processing is comparatively slow, serial, limited by short-term memory, and requires subject effort ... development has convinced us that motivation is often more important ... automatic ... Motivation is much more ..."

  20. 10 CFR 95.49 - Security of automatic data processing (ADP) systems.

    Science.gov (United States)

    2010-01-01

    10 CFR 95.49 - Security of automatic data processing (ADP) systems. Title 10 (Energy), 2010 edition; Nuclear Regulatory Commission (continued); Facility Security Clearance and Safeguarding of National Security Information and Restricted Data; Control of Information; Section 95.49, Security of automatic data processing (ADP) systems.

  1. Automatic and Controlled Processing in Sentence Recall: The Role of Long-Term and Working Memory

    Science.gov (United States)

    Jefferies, E.; Lambon Ralph, M.A.; Baddeley, A.D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley,…

  2. The Development of Automatic and Controlled Inhibitory Retrieval Processes in True and False Recall

    Science.gov (United States)

    Knott, Lauren M.; Howe, Mark L.; Wimmer, Marina C.; Dewhurst, Stephen A.

    2011-01-01

    In three experiments, we investigated the role of automatic and controlled inhibitory retrieval processes in true and false memory development in children and adults. Experiment 1 incorporated a directed forgetting task to examine controlled retrieval inhibition. Experiments 2 and 3 used a part-set cue and retrieval practice task to examine…

  3. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    Science.gov (United States)

    Melton, Jessica S.

    Objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rational, had two components: (1) a stored dictionary of words…

  4. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind

    NARCIS (Netherlands)

    Nentjes, L.; Bernstein, D.; Arntz, A.; van Breukelen, G.; Slaats, M.

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing on ToM performance in

  5. Improving the software fault localization process through testability information

    NARCIS (Netherlands)

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such a way that their pass or fail information will narrow

  6. Process mining : business intelligence software wordt eindelijk intelligent

    NARCIS (Netherlands)

    Aalst, van der W.M.P.

    2007-01-01

    Business Intelligence (BI) is a term that refers to software that can be used to collect data about operational business processes and subsequently analyze them. The goal of BI software is to gain more knowledge and insight, which can be used to improve processes

  7. A Model for Quality Optimization in Software Design Processes

    NARCIS (Netherlands)

    Noppen, J.A.R.; van den Broek, P.M.; Aksit, Mehmet

    The main objective of software engineers is to design and implement systems that satisfy all functional and non-functional requirements. Unfortunately, it is very difficult or even generally impossible to deliver a software system that satisfies all the requirements. Even more seriously, failures

  8. Emerging methods, technologies and process management in software engineering

    CERN Document Server

    Ferrucci, Filomena; Tortora, Genny; Tucci, Maurizio

    2007-01-01

    A high-level introduction to new technologies and methods in the field of software engineering. Recent years have witnessed rapid evolution of software engineering methodologies, and until now, there has been no single-source introduction to emerging technologies in the field.

  9. Memory biases in remitted depression: the role of negative cognitions at explicit and automatic processing levels.

    Science.gov (United States)

    Romero, Nuria; Sanchez, Alvaro; Vazquez, Carmelo

    2014-03-01

    Cognitive models propose that depression is caused by dysfunctional schemas that endure beyond the depressive episode, representing vulnerability factors for recurrence. However, research testing negative cognitions linked to dysfunctional schemas in formerly depressed individuals is still scarce. Furthermore, negative cognitions are presumed to be linked to biases in recalling negative self-referent information in formerly depressed individuals, but no studies have directly tested this association. In the present study, we evaluated differences between formerly and never-depressed individuals in several experimental indices of negative cognitions and their associations with the recall of emotional self-referent material. Formerly depressed (n = 30) and never-depressed individuals (n = 40) completed measures of explicit (i.e., scrambled sentence test) and automatic (i.e., lexical decision task) processing to evaluate negative cognitions. Furthermore, participants completed a self-referent incidental recall task to evaluate memory biases. Formerly depressed individuals showed greater negative cognitions than never-depressed individuals at both explicit and automatic levels of processing. Results also showed greater recall of negative self-referent information in formerly depressed than in never-depressed individuals. Finally, individual differences in negative cognitions at both explicit and automatic levels of processing predicted greater recall of negative self-referent material in formerly depressed individuals. Analyses of the relationship between explicit and automatic processing indices and memory biases were correlational, and the majority of participants in both groups were women. Our findings provide evidence of negative cognitions in formerly depressed individuals at both automatic and explicit levels of processing that may confer a cognitive vulnerability to depression. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter

    Directory of Open Access Journals (Sweden)

    Hili Eidlin-Levy

    2017-12-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  11. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter.

    Science.gov (United States)

    Eidlin-Levy, Hili; Rubinsten, Orly

    2017-01-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  12. THE PROCESS OF INTERNATIONALIZATION OF SOFTWARE FIRMS: THE CASE AUDACES

    Directory of Open Access Journals (Sweden)

    Antonia Egidia Souza

    2012-10-01

    This study addresses the way in which Brazilian companies are internationalizing, their motivations for internationalization, and the competitive advantages in this process. The article analyzes in depth a software company that since 2002 has been working successfully in the global market. The theoretical framework is supported by three major theories of internationalization: the ownership advantage theory (DUNNING, 1988), the Uppsala School of the 1970s, and the born-global theory (RENNIE, 1993; CAVUSGIL and KNIGHT, 1996; OVIATT and McDOUGALL, 1994; MADSEN and SERVAIS, 1997). The study used a qualitative approach with an exploratory-descriptive design; the strategy was a case study. Data were collected from primary and secondary sources and analyzed using content analysis. As a result, it appears that the motivation for internationalization is based on the need to reduce dependence on the domestic market, the ability to adapt products for the international market, the opportunity for profit and growth in the international market, and management's interest in international activities.

  13. Intentional and Automatic Numerical Processing as Predictors of Mathematical Abilities in Primary School Children

    Directory of Open Access Journals (Sweden)

    Violeta Pina

    2015-03-01

    Previous studies have suggested that numerical processing relates to mathematical performance, but it seems that such a relationship is more evident for intentional than for automatic numerical processing. In the present study we assessed the relationship between the two types of numerical processing and specific mathematical abilities in a sample of 109 children in grades 1 to 6. Participants were tested on an ample range of mathematical tests and also performed both a numerical and a size comparison task. The results showed that numerical processing related to mathematical performance only when inhibitory control was involved in the comparison tasks. Concretely, we found that intentional numerical processing, as indexed by the numerical distance effect in the numerical comparison task, was related to mathematical reasoning skills only when the task-irrelevant dimension (the physical size) was incongruent; whereas automatic numerical processing, indexed by the congruency effect in the size comparison task, was related to mathematical calculation skills only when digits were separated by a small distance. The observed double dissociation highlights the relevance of both intentional and automatic numerical processing to mathematical skills, at least when inhibitory control is also involved.

  14. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    Science.gov (United States)

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  15. Techniques, processes, and measures for software safety and reliability

    International Nuclear Information System (INIS)

    Sparkman, D.

    1992-01-01

    The purpose of this report is to provide a detailed survey of current recommended practices and measurement techniques for the development of reliable and safe software-based systems. This report is intended to assist the United States Office of Nuclear Reactor Regulation (NRR) in determining the importance and maturity of the available techniques and in assessing the relevance of individual standards for application to instrumentation and control systems in nuclear power generating stations. Lawrence Livermore National Laboratory (LLNL) provides technical support for the Instrumentation and Control System Branch (ICSB) of NRR in advanced instrumentation and control systems, distributed digital systems, software reliability, and the application of verification and validation for the development of software.

  16. AUTOMR: An automatic processing program system for the molecular replacement method

    International Nuclear Information System (INIS)

    Matsuura, Yoshiki

    1991-01-01

    An automatic processing program system for the molecular replacement method, AUTOMR, is presented. The program solves the initial model of the target crystal structure using a homologous molecule as the search model. It processes the structure-factor calculation of the model molecule, the rotation function, the translation function and the rigid-group refinement successively in one computer job. Test calculations were performed for six protein crystals, and the structures were solved in all of these cases. (orig.)

  17. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Science.gov (United States)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  18. Sensitometric comparison of E and F dental radiographic films using manual and automatic processing systems

    Directory of Open Access Journals (Sweden)

    Dabaghi A.

    2008-04-01

    Full Text Available Background and Aim: Processing conditions affect sensitometric properties of X-ray films. In this study, we aimed to evaluate the sensitometric characteristics of InSight (IP, a new F-speed film, in fresh and used processing solutions in dental office condition and compare them with Ektaspeed Plus (EP.Materials and Methods: In this experimental in vitro study, an aluminium step wedge was used to construct characteristic curves for InSight and Ektaspeed Plus films (Kodak Eastman, Rochester, USA.All films were processed in Champion solution (X-ray Iran, Tehran, Iran both manually and automatically in a period of six days. Unexposed films of both types were processed manually and automatically to determine base plus fog density. Speed and film contrast were measured according to ISO definition. Data were analyzed using one-way ANOVA and T tests with P<0.05 as the level of significance.Results: IP was 20 to 22% faster than EP and showed to be an F-speed film when processed in automatic condition and E-F film when processed manually. Also it was F-speed in fresh solution and E-speed in old solution. IP and EP contrasts were similar in automatic processing but EP contrast was higher when processed manually. Both EP and IP films had standard values of base plus fog (<0.35 and B+F densities were decreased in old solution.Conclusion: Based on the results of this study, InSight is a F-speed film with a speed of at least 20% greater than Ektaspeed. In addition, it reduces patient exposure with no damage to image quality.

  19. Process and equipment for automatic measurement of resonant frequencies in seismic detectors

    International Nuclear Information System (INIS)

    Fredriksson, O.A.; Thomas, E.L.

    1977-01-01

    This is a process for the automatic indication of the resonant frequency of one or more detector elements operating inside a geophysical data-gathering system. The detector elements are geophones, hydrophones, or groups of both instruments. The invention concerns the creation of a process, and of equipment, that works with laboratory precision although it can be used in the field. (orig./RW) [de]

  20. Data from configuration management tools as sources for software process mining

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Samalik, J.; Weijters, A.J.M.M.; Lavazza, L; Oberhauser, R; Martin, A; Hassine, J; Gebhart, M; Jäntti, M

    2013-01-01

    Process mining has proven to be a valuable approach that provides new and objective insights into processes within organizations. Based on sets of well-structured data, the underlying ‘actual’ processes can be extracted and process models can be constructed automatically, i.e., the process model can

  1. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system that provides drying air matching the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build the experimental curves.

  2. UFMulti: A new parallel processing software system for HEP

    Science.gov (United States)

    Avery, Paul; White, Andrew

    1989-12-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future.

  3. UFMULTI: A new parallel processing software system for HEP

    International Nuclear Information System (INIS)

    Avery, P.; White, A.

    1989-01-01

    UFMulti is a multiprocessing software package designed for general purpose high energy physics applications, including physics and detector simulation, data reduction and DST physics analysis. The system is particularly well suited for installations where several workstations or computers are connected through a local area network (LAN). The initial configuration of the software is currently running on VAX/VMS machines with a planned extension to ULTRIX, using the new RISC CPUs from Digital, in the near future. (orig.)

  4. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury Shtyrov

    2013-08-01

    Full Text Available Previous electrophysiological studies of automatic language processing revealed early (100-200 ms reflections of access to lexical characteristics of speech signal using the so-called mismatch negativity (MMN, a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by automatic activation of word memory traces realised as distributed strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention on spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in perifoveal area outside the visual focus of attention, as the subjects’ attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  5. Software life cycle process and classification guides for KNICS digital instrumentation and control system design

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Son, Han Seung; Kim, Jang Yeol; Kwon, Kee Choon; Lee, Soon Seung; Kim, Doo Hwan [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

    Documentation should exist that shows that the qualification activities have been successfully accomplished for each life cycle activity group. In particular, the documentation should show that the system safety requirements have been adequately addressed for each life cycle activity group, that no new hazards have been introduced, and that the software requirements, design elements, and code elements that can affect safety have been identified. Because the safety of software can be assured both through the verification and validation (V and V) of the process itself and through the V and V of all the intermediate and final products during the software development life cycle, a KNICS Software Safety Framework (KSSF) must be established. As the first activity in establishing the KSSF, we have developed this report, Software Life Cycle Process and Classification Guides for KNICS Digital I and C System. This report is organized as follows. Chapter I describes the background, definitions, and references of the SLCP. Chapter II describes the KNICS safety software categorization. In Chapter III, we define the requirements on the software life cycle process for designing the digital KNICS. Chapter III.3, the main section of the chapter, includes the requirements for software life cycle process planning, the requirements for software life cycle process implementation, and the requirements for software life cycle process design outputs. Finally, we describe the results of a case study on the SLCP for developing the software of the ESF-CCS system that is being developed by a private company, BNF. 29 refs., 5 figs., 7 tabs. (Author)

  6. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    Science.gov (United States)

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with the introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  7. 78 FR 47012 - Developing Software Life Cycle Processes Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    Federal Register notice NRC-2012-0195, Nuclear Regulatory Commission: Developing Software Life Cycle Processes Used in Safety Systems of Nuclear Power Plants. The notice concerns the criteria for quality assurance programs in Appendix B to 10 CFR Part 50 as they apply to software development, and is one of six regulatory guide revisions addressing computer software development and use in safety-related systems.

  8. GMI-IPS: Python Processing Software for Aircraft Campaigns

    Science.gov (United States)

    Damon, M. R.; Strode, S. A.; Steenrod, S. D.; Prather, M. J.

    2018-01-01

    NASA's Atmospheric Tomography Mission (ATom) seeks to understand the impact of anthropogenic air pollution on gases in the Earth's atmosphere. Four flight campaigns are being deployed on a seasonal basis to establish a continuous global-scale data set intended to improve the representation of chemically reactive gases in global atmospheric chemistry models. The Global Modeling Initiative (GMI) is creating chemical transport simulations on a global scale for each of the ATom flight campaigns. To meet the computational demands required to translate the GMI simulation data to grids associated with the flights from the ATom campaigns, the GMI ICARTT Processing Software (GMI-IPS) has been developed and is providing key functionality for data processing and analysis in this ongoing effort. The GMI-IPS is written in Python and provides computational kernels for data interpolation and visualization tasks on GMI simulation data. A key feature of the GMI-IPS is its ability to read ICARTT files, a text-based file format for airborne instrument data, and extract the flight information that defines the regional and temporal grid parameters associated with an ATom flight. Perhaps most importantly, the GMI-IPS creates ICARTT files containing GMI simulated data, which are used in collaboration with ATom instrument teams and other modeling groups. The initial main task of the GMI-IPS is to interpolate GMI model data to the finer temporal resolution (1-10 seconds) of a given flight. The model data include basic fields such as temperature and pressure, but the main focus of this effort is to provide species concentrations of chemical gases for ATom flights. The software, which uses parallel computation techniques for data-intensive tasks, linearly interpolates each of the model fields to the time resolution of the flight. The temporally interpolated data are then saved to disk and used to create additional derived quantities. In order to translate the GMI model data to the
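
    The core interpolation step is simple to picture: each model field, available at coarse model output times, is linearly resampled onto the much finer flight-time grid. A minimal NumPy sketch follows, with made-up hourly output and a 10-second flight cadence; the real tool handles many species fields and all of the ICARTT I/O around this step.

        import numpy as np

        model_time = np.arange(0.0, 6 * 3600.0 + 1, 3600.0)          # hourly output (s)
        temperature = 220.0 + 5.0 * np.random.rand(model_time.size)  # placeholder field

        flight_time = np.arange(0.0, 6 * 3600.0, 10.0)               # 10 s flight cadence
        temp_on_flight = np.interp(flight_time, model_time, temperature)
        # repeat for every model field (pressure, species concentrations, ...)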

  9. Enhancement of the automatic ultrasonic signal processing system using digital technology

    International Nuclear Information System (INIS)

    Koo, In Soo; Park, H. Y.; Suh, Y. S.; Kim, D. Hoon; Huh, S.; Sung, S. H.; Jang, G. S.; Ryoo, S. G.; Choi, J. H.; Kim, Y. H.; Lee, J. C.; Kim, D. Hyun; Park, H. J.; Kim, Y. C.; Lee, J. P.; Park, C. H.; Kim, M. S.

    1999-12-01

    The objective of this study is to develop an automatic ultrasonic signal processing system that can be used in inspection equipment to assess the integrity of the reactor vessel, by enhancing the performance of the existing ultrasonic signal processing system. The main activities of this study were divided into three categories: the development of circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment; the development of the signal processing algorithm and the hardware of the data processing system; and the development of the specification for the application programs and system software of the analysis and evaluation computer. The results of the main activities are as follows: 1) the design of the ultrasonic detector and the automatic ultrasonic signal processing system, based on an investigation of the state of the art at home and abroad; 2) the development of the hardware and software of the data processing system based on these results. In particular, the hardware of the data processing system, which has the advantages of both digital and analog control through real-time digital signal processing, was developed using a DSP capable of processing digital signals in real time; in addition to the firmware of the data processing system for the peripherals, a test algorithm using specimens for calibration was also developed. The application programs and the system software of the analysis/evaluation computer were developed, and the developed equipment was verified by a performance test. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for nuclear industries can be expected through further studies on hardware deployment in real applications and on the software specification of the analysis computer. (author)

  10. Helping organizations to address their effort toward the implementation of improvements in their software process

    OpenAIRE

    Muñoz-Mata, Mirna Ariadna; Mejia-Miranda, Jezreel; Valtierra-Alvarado, Claudia

    2015-01-01

    Due to the importance of software development Small and Medium Enterprises (SMEs) in the software industry, it is necessary to guarantee the quality of their products. In this context, the implementation of software process improvements offers an attractive way to achieve it. Unfortunately, the lack of knowledge on how to address the improvement effort makes the implementation of software improvements in SMEs a path full of obstacles, and most of the time impossible to achieve. In order to h...

  11. Cognitive regulation of smoking behavior within a cigarette: Automatic and nonautomatic processes.

    Science.gov (United States)

    Motschman, Courtney A; Tiffany, Stephen T

    2016-06-01

    There has been limited research on cognitive processes governing smoking behavior in individuals who are tobacco dependent. In a replication (Baxter & Hinson, 2001) and extension, this study examined the theory (Tiffany, 1990) that drug use may be controlled by automatic processes that develop over repeated use. Heavy and occasional cigarette smokers completed a button-press reaction time (RT) task while concurrently smoking a cigarette, pretending to smoke a lit cigarette, or not smoking. Slowed RT during the button-press task indexed the cognitive disruption associated with nonautomatic control of behavior. Occasional smokers' RTs were slowed when smoking or pretending to smoke compared with when not smoking. Heavy smokers' RTs were slowed when pretending to smoke versus not smoking; however, their RTs were similarly fast when smoking compared with not smoking. The results indicated that smoking behavior was more highly regulated by controlled, nonautomatic processes among occasional smokers and by automatic processes among heavy smokers. Patterns of RT across the interpuff interval indicated that occasional smokers were significantly slowed in anticipation of and immediately after puffing onset, whereas heavy smokers were only slowed significantly after puffing onset. These findings suggest that the entirety of the smoking sequence becomes automatized, with the behaviors leading up to puffing becoming more strongly regulated by automatic processes with experience. These results have relevance to theories on the cognitive regulation of cigarette smoking and support the importance of interventions that focus on routinized behaviors that individuals engage in during and leading up to drug use. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Automatic methods for processing track-detector data at the PAVICOM facility

    International Nuclear Information System (INIS)

    Aleksandrov, A.B.; Goncharova, L.A.; Polukhina, N.G.; Fejnberg, E.L.; Davydov, D.A.; Publichenko, P.A.; Roganova, T.M.

    2007-01-01

    New automatic methods essentially simplify and speed up the treatment of track-detector data. They make it possible to handle big data files and appreciably improve their statistics; this fact motivates the design of new experiments that plan to use large-volume targets and emulsion and solid-state track detectors of large area. In this connection, the problem of training competent physicists able to work with modern automatic equipment is very relevant. About ten Moscow students working at LPI on the PAVICOM facility master the new methods every year, whereas most students in high-energy physics learn only the archaic manual methods of handling track-detector data. In 2005, on the basis of the PAVICOM facility and the physics training of MSU, a new educational exercise was prepared for determining the energy of neutrons passing through nuclear emulsion; it lets students acquire basic skills in handling track-detector data with an automatic facility, and it can be included in the training of students of any physics faculty. Specialists mastering the methods of automatic processing through the simple and clear example of track detectors will be able to use their knowledge in various areas of science and technology. The organization of upper-division courses is a new, additional aspect of the use of the PAVICOM facility described in an earlier paper [4].

  13. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS.

    Science.gov (United States)

    Rai, Arti K

    2013-11-24

    The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office ("PTO") could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software.

  14. Mapping modern software process engineering techniques onto an HEP development environment

    International Nuclear Information System (INIS)

    Wellisch, J.P.

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been in progress for 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R and D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software

  15. Mapping modern software process engineering techniques onto an HEP development environment

    Science.gov (United States)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been in progress for 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community, as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.

  16. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  17. Commercial off-the-shelf software dedication process based on the commercial grade survey of supplier

    International Nuclear Information System (INIS)

    Kim, J. Y.; Lee, J. S.; Chon, S. W.; Lee, G. Y.; Park, J. K.

    2000-01-01

    The Commercial Off-The-Shelf (COTS) software dedication process can apply a combination of methods, like the hardware commercial grade item dedication process. In general, these methods are: method 1 (special test and inspection), method 2 (commercial grade survey of supplier), method 3 (source verification), and method 4 (acceptance of supplier/item performance record). In this paper, the suggested procedure-oriented dedication process, based on method 2 for COTS software, is consistent with the EPRI/TR-106439 and NUREG/CR-6421 requirements. An additional tailoring policy based on codes and standards related to COTS software may also be found in the suggested commercial software dedication process. The suggested commercial software dedication process has been developed for commercial I and C software dedication, in which the dedicator performs COTS qualification according to the dedication procedure

  18. Medical software certification processes in Europe, USA and Brazil

    OpenAIRE

    Devesa, Filipe Rafael Simões

    2014-01-01

    Medical software is currently a continuously growing market. It is essential for a manufacturer of this type of medical product to follow the mandatory regulatory requirements and to implement the necessary measures within the company in order to market its product in different countries. The medical product to be certified is OneCare Sensing, marketed by ISA-Intellicare, a monitoring software solution intended to collect, store, display, transmit and provide alerts ...

  19. ERP Software Selection Model using Analytic Network Process

    OpenAIRE

    Lesmana, Andre Surya; Astanti, Ririn Diar; Ai, The Jin

    2014-01-01

    During the implementation of Enterprise Resource Planning (ERP) in any company, one of the most important issues is the selection of ERP software that can satisfy the needs and objectives of the company. This issue is crucial since it may affect the duration of the ERP implementation and the costs incurred for it. This research tries to construct a model for the selection of ERP software that is beneficial to the company in order to carry out the selection of the right ERP sof...
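
    The prioritization step shared by AHP and ANP reduces to extracting the principal eigenvector of a pairwise comparison matrix. The sketch below shows that step for four hypothetical ERP selection criteria; the matrix entries are illustrative, not judgments from the paper.

        import numpy as np

        # pairwise comparisons for: cost, functional fit, vendor support, scalability
        A = np.array([
            [1.0, 1/3, 2.0, 1.0],
            [3.0, 1.0, 4.0, 2.0],
            [0.5, 1/4, 1.0, 0.5],
            [1.0, 0.5, 2.0, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        weights = w / w.sum()                  # criteria priority vector

        n = A.shape[0]
        CI = (eigvals.real[k] - n) / (n - 1)   # Saaty consistency index
        print(weights, CI / 0.90)              # 0.90 = random index for n = 4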

  20. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh Shaari, Syirrazie Bin Che; Azman, Azraf B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Salim, Nazaratul Ashifa Bt. Abdullah [Division of Waste and Environmental Technology, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Ismail, Nadiah Binti [Fakulti Kejuruteraan Elektrik, UiTM Pulau Pinang, 13500 Permatang Pauh, Pulau Pinang (Malaysia)

    2015-04-29

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures were performed manually, including sample registration. The samples were recorded manually in a logbook and given an ID number. Then all samples, standards, SRM and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. The sample registration software is developed as part of the IAEA/CRP project on ‘Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)’. The objective of the project is to create PC-based data entry software for the sample preparation stage. This is an effective method to replace the redundant manual data entries that need to be completed by laboratory personnel. The software developed will automatically generate a sample code for each sample in one batch, create printable registration forms for administration purposes, and store selected parameters that will be passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  2. Automatic and controlled processing in sentence recall: The role of long-term and working memory

    OpenAIRE

    Jefferies, Elizabeth; Lambon Ralph, Matthew A.; Baddeley, Alan D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley, 2000) proposes that the executive component of working memory plays a crucial role in the formation of links between different representational formats...

  3. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    Science.gov (United States)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing, developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, an algorithm for the automatic creation of masks for veterinary thermographic images is presented.

  4. Processing of Mining Induced Seismic Events by Spectra Analyzer Software

    Czech Academy of Sciences Publication Activity Database

    Kaláb, Zdeněk; Lednická, Markéta; Lyubushin, A. A.

    2011-01-01

    Vol. 6, No. 1 (2011), p. 75-83 ISSN 1896-3145. [Ochrona środowiska w górnictwie podziemnym, odkrywkowym i otworowym. Walbrzych, 18.05.2011-20.05.2011] Institutional research plan: CEZ:AV0Z30860518 Keywords: mining seismicity * Spectra Analyzer Software * wavelet decomposition * time-frequency map Subject RIV: DC - Seismology, Volcanology, Earth Structure

  5. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (plus or minus 0.0001 in.; 0.00254 mm) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the implementation of a process acceptance technique (PAT) on Swiss automatic screw machine processes. PAT is a statistical approach developed for the purpose of accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor, a determination has been made of the conditions under which PAT can benefit a controlled process, and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation.

  6. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    Science.gov (United States)

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).

  7. A dual-task investigation of automaticity in visual word processing

    Science.gov (United States)

    McCann, R. S.; Remington, R. W.; Van Selst, M.

    2000-01-01

    An analysis of activation models of visual word processing suggests that frequency-sensitive forms of lexical processing should proceed normally while unattended. This hypothesis was tested by having participants perform a speeded pitch discrimination task followed by lexical decisions or word naming. As the stimulus onset asynchrony between the tasks was reduced, lexical-decision and naming latencies increased dramatically. Word-frequency effects were additive with the increase, indicating that frequency-sensitive processing was subject to postponement while attention was devoted to the other task. Either (a) the same neural hardware shares responsibility for lexical processing and central stages of choice reaction time task processing and cannot perform both computations simultaneously, or (b) lexical processing is blocked in order to optimize performance on the pitch discrimination task. Either way, word processing is not as automatic as activation models suggest.

  8. Expected requirements in support tool for software process improvement in SMEs

    OpenAIRE

    Muñoz Mata, Mirna; Mejía Miranda, Jezreel; Amescua Seco, Antonio; Calvo-Manzano Villalón, José Antonio; Cuevas Agustín, Gonzalo; San Feliu Gilabert, Tomás

    2012-01-01

    Nowadays, being competitive is an important challenge for software development organizations. In order to achieve this, software process improvement has in recent years been an obvious and logical way forward. Unfortunately, even when many organizations are motivated to implement software process initiatives, not all know how best to do so, especially Small and Medium Enterprises (SMEs), where, due to their special features, they have to be careful in how they manage their resources to assure their ma...

  9. Data-driven management using quantitative metric and automatic auditing program (QMAP) improves consistency of radiation oncology processes.

    Science.gov (United States)

    Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H

    Process consistency in planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with the goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics have been introduced into our clinical workflow. We compared the delinquency rates for 4 selected metrics before the implementation of the metrics with the delinquency rates of 2016; a one-tailed Student t test was used for statistical analysis. With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completions of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distractions, interruptions, and rush in the completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
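
    The core auditing idea, capturing a completion timestamp per critical task and alerting a triage team before the deadline passes, can be sketched in a few lines of Python. The task names, deadlines and four-hour warning window below are illustrative assumptions, not QMAP's actual configuration.

      from datetime import datetime, timedelta

      # Hypothetical task log: (task, deadline, completion timestamp or None).
      tasks = [
          ("treatment plan completion", datetime(2016, 5, 2, 17, 0), datetime(2016, 5, 2, 15, 40)),
          ("weekly physics check",      datetime(2016, 5, 3, 12, 0), None),
          ("weekly on-treatment visit", datetime(2016, 5, 3, 10, 0), None),
      ]

      def audit(tasks, now, warn_window=timedelta(hours=4)):
          """Flag each task as done, due soon (alert the triage team) or overdue."""
          for name, due, done in tasks:
              if done is not None:
                  status = "done late" if done > due else "done on time"
              elif now > due:
                  status = "OVERDUE - notify triage team"
              elif due - now <= warn_window:
                  status = "due soon - alert triage team"
              else:
                  status = "pending"
              print(f"{name:28s} {status}")

      audit(tasks, now=datetime(2016, 5, 3, 9, 0))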

  10. Associative priming in a masked perceptual identification task: evidence for automatic processes.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Raaijmakers, Jeroen G W

    2002-10-01

    Two experiments investigated the influence of automatic and strategic processes on associative priming effects in a perceptual identification task in which prime-target pairs are briefly presented and masked. In this paradigm, priming is defined as a higher percentage of correctly identified targets for related pairs than for unrelated pairs. In Experiment 1, priming was obtained for mediated word pairs. This mediated priming effect was affected neither by the presence of direct associations nor by the presentation time of the primes, indicating that automatic priming effects play a role in perceptual identification. Experiment 2 showed that the priming effect was not affected by the proportion (.90 vs. .10) of related pairs if primes were presented briefly to prevent their identification. However, a large proportion effect was found when primes were presented for 1000 ms so that they were clearly visible. These results indicate that priming in a masked perceptual identification task is the result of automatic processes and is not affected by strategies. The present paradigm provides a valuable alternative to more commonly used tasks such as lexical decision.

  11. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    Science.gov (United States)

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-12-01

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects for more distant pairs with both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  12. Automatic tissue image segmentation based on image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in multimodality imaging, especially in the fusion of structural images offered by CT and MRI with functional images collected by optical or other novel imaging technologies. In addition, image segmentation provides a detailed structural description for the quantitative visualization of treatment light distribution in the human body when incorporated with 3D light transport simulation methods. Here we used image enhancement, operators, and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM), on 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of images in a deep-learning way, and also introduced parallel computing. These approaches greatly reduced the processing time compared to manual and semi-automatic segmentation, and are of great importance in improving speed and accuracy as more and more samples are learned. Our results can be used as criteria when diagnosing diseases such as cerebral atrophy, which is caused by pathological changes in gray matter or white matter. We demonstrated the great potential of such combined image processing and deep-learning based automatic tissue image segmentation in personalized medicine, especially in monitoring and treatment.
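
    As a hedged sketch of the classical enhancement/operator/morphometry stage mentioned above (the deep-learning stage is out of scope here), the following Python uses scikit-image to threshold a synthetic slice, clean the mask morphologically and extract contours; all parameters and the toy image are illustrative, not the paper's pipeline.

      import numpy as np
      from skimage import filters, morphology, measure

      def tissue_mask(slice_2d):
          """Classical pipeline: global threshold, then morphological clean-up."""
          thresh = filters.threshold_otsu(slice_2d)
          mask = slice_2d > thresh
          mask = morphology.binary_opening(mask, morphology.disk(2))   # drop specks
          mask = morphology.remove_small_holes(mask, area_threshold=64)
          return mask

      # Synthetic "head slice": bright disc (tissue) on a dark, noisy background.
      yy, xx = np.mgrid[:128, :128]
      img = ((yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2).astype(float)
      img += np.random.default_rng(0).normal(0, 0.2, img.shape)

      contours = measure.find_contours(tissue_mask(img).astype(float), 0.5)
      print(f"{len(contours)} contour(s); longest has {max(map(len, contours))} points")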

  13. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Automatic Differentiation (AD) is a relatively recent technique developed for the differentiation of functions, applicable directly to the source code that computes the function, written in standard programming languages. The technique permits the automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (up to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of Automatic Differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools into consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
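
    The idea that AD applies calculus rules to the code itself, giving derivatives exact to roundoff, can be illustrated with forward-mode AD via dual numbers. This minimal Python sketch is generic textbook material, not the AD tools discussed in the paper.

      from dataclasses import dataclass
      import math

      @dataclass
      class Dual:
          """Dual number val + der*eps with eps**2 == 0; der carries the derivative."""
          val: float
          der: float = 0.0

          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__

          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              # Product rule applied mechanically, as AD applies calculus rules to code.
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
          __rmul__ = __mul__

      def exp(x):
          # Chain rule for the exponential.
          return Dual(math.exp(x.val), math.exp(x.val) * x.der)

      x = Dual(2.0, 1.0)        # seed the derivative dx/dx = 1
      f = x * exp(x) + 3 * x    # f(x) = x*e^x + 3x
      print(f.val, f.der)       # f(2) ~ 20.78, f'(2) = 3*e^2 + 3 ~ 25.17, exact to roundoff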

  14. A Case of Engineering Quality for Mobile Healthcare Applications Using Augmented Personal Software Process Improvement

    Directory of Open Access Journals (Sweden)

    Shahbaz Ahmed Khan Ghayyur

    2016-01-01

    Mobile healthcare systems are currently considered a key research area in the domain of software engineering. The adoption of modern technologies for mobile healthcare systems is a quick option for industry professionals. Software architecture is a key feature that contributes towards a software product, solution, or service. Software architecture helps in better communication and documentation of design decisions, supports risk identification, provides a basis for reusability, scalability and scheduling, reduces maintenance cost and, lastly, helps to avoid software failures. Hence, in order to solve the abovementioned issues in mobile healthcare, software architecture is integrated with the personal software process. The personal software process has been applied successfully, but it is unable to address issues related to architectural design and evaluation capabilities. Hence, a new technique, the architecture-augmented personal process, is presented in order to enhance the quality of mobile healthcare systems through the use of architectural design integrated with the personal software process. The proposed process was validated by case studies. It was found that the proposed process helped in reducing overall costs and effort. Moreover, an improved architectural design helped in the development of high-quality mobile healthcare systems.

  15. ERPs reveal deficits in automatic cerebral stimulus processing in patients with NIDDM.

    Science.gov (United States)

    Vanhanen, M; Karhu, J; Koivisto, K; Pääkkönen, A; Partanen, J; Laakso, M; Riekkinen, P

    1996-11-04

    We compared auditory event-related potentials (ERPs) and neuropsychological test scores in nine patients with non-insulin-dependent diabetes mellitus (NIDDM) and in nine control subjects. The measures of automatic stimulus processing, habituation of auditory N100 and mismatch negativity (MMN) were impaired in patients. No differences were observed in the N2b and P3 components, which presumably reflect conscious cognitive analysis of the stimuli. A trend towards impaired performance in the Digit Span backward was found in diabetic subjects, but in the tests of secondary or long-term memory the groups were comparable. Patients with NIDDM may have defects in arousal and in the automatic ability to redirect attention, which can affect their cognitive performance.

  16. ACTIV - a program for automatic processing of gamma-ray spectra

    International Nuclear Information System (INIS)

    Zlokazov, V.B.

    1982-01-01

    Program ACTIV is intended for the precise analysis of γ-ray and X-ray spectra and allows the user to carry out the full cycle of automatic processing of a series of spectra, i.e. calibration, automatic peak search, determination of peak positions and areas, identification of the radioisotopes, and transformation of the areas found into masses of isotopes in the irradiated sample. ACTIV uses a complex mathematical technique and is oriented mainly to large computers but, using overlaid loading, it can also be run on small computers like the PDP 11/70. Compared with other similar programs, ACTIV has some advantages in the accuracy of peak-shape description and in the reliability of the peak search and its least-squares analysis. The program can be used for activation analysis, and can analyze spectra with poor statistics and with broad and narrow peaks. (orig.)
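
    For flavour, the peak-search step of such a pipeline can be approximated in a few lines with SciPy on a synthetic spectrum. The prominence criterion tied to counting statistics is an illustrative choice, not ACTIV's algorithm.

      import numpy as np
      from scipy.signal import find_peaks, peak_widths

      rng = np.random.default_rng(0)
      ch = np.arange(1024)
      # Synthetic spectrum: decaying background plus two Gaussian photopeaks, Poisson noise.
      spectrum = (200 * np.exp(-ch / 400)
                  + 900 * np.exp(-0.5 * ((ch - 300) / 4.0) ** 2)
                  + 400 * np.exp(-0.5 * ((ch - 610) / 5.0) ** 2))
      spectrum = rng.poisson(spectrum).astype(float)

      # A peak must stand well above the ~sqrt(N) counting noise (ad hoc criterion).
      peaks, _ = find_peaks(spectrum, prominence=8 * np.sqrt(spectrum.mean()))
      fwhm = peak_widths(spectrum, peaks, rel_height=0.5)[0]   # FWHM in channels

      for c, w in zip(peaks, fwhm):
          print(f"peak near channel {c}, FWHM = {w:.1f} channels")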

  17. Learning algorithms and automatic processing of languages; Algorithmes a apprentissage et traitement automatique des langues

    Energy Technology Data Exchange (ETDEWEB)

    Fluhr, Christian Yves Andre

    1977-06-15

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to show how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on the theory of Markov algorithms, the author discusses the notion of a learning algorithm. Two main types of learning algorithm are then addressed: firstly, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to resolve grammatical ambiguities in submitted texts; secondly, an algorithm related to a documentation system, which automatically structures semantic data obtained from a set of texts in order to be able to respond, by reference, to any question on the content of these texts.

  18. The Automatic Conservative: Ideology-Based Attentional Asymmetries in the Processing of Valenced Information

    Science.gov (United States)

    Carraro, Luciana; Castelli, Luigi; Macchiella, Claudia

    2011-01-01

    Research has widely explored the differences between conservatives and liberals, and it has been also recently demonstrated that conservatives display different reactions toward valenced stimuli. However, previous studies have not yet fully illuminated the cognitive underpinnings of these differences. In the current work, we argued that political ideology is related to selective attention processes, so that negative stimuli are more likely to automatically grab the attention of conservatives as compared to liberals. In Experiment 1, we demonstrated that negative (vs. positive) information impaired the performance of conservatives, more than liberals, in an Emotional Stroop Task. This finding was confirmed in Experiment 2 and in Experiment 3 employing a Dot-Probe Task, demonstrating that threatening stimuli were more likely to attract the attention of conservatives. Overall, results support the conclusion that people embracing conservative views of the world display an automatic selective attention for negative stimuli. PMID:22096486

  19. A Survey of Commonly Applied Methods for Software Process Improvement

    Science.gov (United States)

    1994-02-01

    Kaoru Ishikawa [Ishikawa 85] under the label "total quality control" or "TQC." TQM is in use in many software organizations, to a greater or lesser...destructive side, and that they are difficult to dislodge or modify once they are in place. Ishikawa [Ishikawa 85] notes that even when industrial...[Ishikawa 85] Ishikawa, K., What is Total Quality Control? The Japanese Way. Translated by David J. Lu, Prentice-Hall, Englewood

  20. Computer-aided recording of automatic endoscope washing and disinfection processes as an integral part of medical documentation for quality assurance purposes

    Directory of Open Access Journals (Sweden)

    Klein Stefanie

    2010-07-01

    Background: The reprocessing of medical endoscopes is carried out using automatic cleaning and disinfection machines. The documentation and archiving of records of properly conducted reprocessing procedures is the last and increasingly important part of the reprocessing cycle for flexible endoscopes. Methods: This report describes a new computer program designed to monitor and document the automatic reprocessing of flexible endoscopes and accessories in fully automatic washer-disinfectors; it neither includes nor replaces the manual cleaning step. The program implements the national standards for the monitoring of hygiene in flexible endoscopes and the guidelines for the reprocessing of medical products. No FDA approval has been obtained up to now. The advantages of this newly developed computer program are, firstly, that it simplifies the documentation procedures for medical endoscopes and, secondly, that it can be used universally with any washer-disinfector, independent of the various interfaces and software products provided by the individual suppliers of washer-disinfectors. Results: The computer program presented here has been tested on a total of four washer-disinfectors in more than 6000 medical examinations within 9 months. Conclusions: We present for the first time an electronic documentation system for automatic washer-disinfectors for medical devices, e.g. flexible endoscopes, that documents the procedures involved in the automatic cleaning process and can easily be connected to most hospital documentation systems.

  1. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  2. Product focused software process improvement (P-SPI) : concepts and their application

    NARCIS (Netherlands)

    Solingen, van D.M.; Kusters, R.J.; Trienekens, J.J.M.; Uijtrecht, van A.

    1999-01-01

    Management problems in the development of software have been addressed over recent years by a focus on improvement of the development process. This paper states that software process improvement (SPI) should have an explicit product focus. The practical implementation of a method for product-focused

  3. Designing dependable process-oriented software - a CSP-based approach

    NARCIS (Netherlands)

    Jovanovic, D.S.

    2006-01-01

    This thesis advocates dependability as a crucial aspect of software quality. Process orientation, as it is defined in this thesis, concentrates on the notion of a process as a basic building component of a dataflow-centred software architecture. The dependability approach in the proposed variant of

  4. Development of geophysical and geochemical data processing software based on component GIS

    International Nuclear Information System (INIS)

    Ke Dan; Yu Xiang; Wu Qubo; Han Shaoyang; Li Xi

    2013-01-01

    Based on component GIS and mixed programming techniques, a software package is designed and developed which combines basic GIS functions with conventional and unconventional processing methods for regional geophysical and geochemical data. The software has many advantages, such as a friendly interface, ease of use and practical functions, and it provides a useful platform for regional geophysical and geochemical data processing. (authors)

  5. Study of an automatic dosing of neptunium in the industrial process of separation neptunium 237-plutonium 238

    International Nuclear Information System (INIS)

    Ros, Pierre

    1973-01-01

    The objective is to study and adapt a method for the automatic dosing (assay) of neptunium in the industrial process of separation and purification of plutonium 238, taking information quality and economic aspects into account. After recalling some generalities on the production of plutonium 238 and on the plutonium-neptunium separation process, the author addresses the dosing of neptunium. The measurement technique adopted is spectrophotometry (of neptunium and of neptunium peroxide), which is the most flexible and economical to adapt to automatic control. The author proposes a design for an automatic chemical analyser, and discusses the complexes involved (stoichiometry, form) and some aspects of the neptunium dosing (redox reactions, process control). [fr]
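
    The spectrophotometric measurement reduces to the Beer-Lambert law, A = ε·l·c, so an automatic controller only needs to invert it and compare the result against a limit. The molar absorptivity and alarm threshold below are made-up illustrative values, not data from the thesis.

      # Beer-Lambert law: A = epsilon * l * c, hence c = A / (epsilon * l).
      EPSILON = 45.0   # molar absorptivity of the Np complex, L/(mol*cm)  (assumed value)
      PATH_CM = 1.0    # cuvette path length in cm

      def np_concentration(absorbance):
          """Convert a measured absorbance to a concentration in mol/L."""
          return absorbance / (EPSILON * PATH_CM)

      def alarm(absorbance, c_max=0.02):
          """Trip the process-control alarm if the Np concentration drifts too high."""
          return np_concentration(absorbance) > c_max

      for a in (0.45, 0.95):
          print(a, f"{np_concentration(a):.4f} mol/L", "ALARM" if alarm(a) else "ok")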

  6. Tuning COCOMO-II for Software Process Improvement: A Tool Based Approach

    Directory of Open Access Journals (Sweden)

    SYEDA UMEMA HANI

    2016-10-01

    In order to compete in the international software development market, software organizations have to adopt internationally accepted software practices, i.e. standards like ISO (International Organization for Standardization) standards or CMMI (Capability Maturity Model Integration), in spite of having scarce resources and tools. The aim of this study is to develop a tool which can be used to present an actual picture of the benefits of Software Process Improvement to software development companies. The few tools that are available to assist in making predictions are too expensive and do not cover datasets that reflect the cultural behavior of software development organizations in developing countries. In extension to our previously published research on Pakistani software development organizations, which quantified the benefits of SDPI (Software Development Process Improvement), this research used sixty-two datasets from three different software development organizations against the set of metrics used in COCOMO-II (Constructive Cost Model 2000). It derived a verifiable equation for calculating ISF (Ideal Scale Factor) and tuned the COCOMO-II model to bring prediction capability for SDPI benefit-measurement classes such as ESCP (Effort, Schedule, Cost, and Productivity). This research contributes to the software industry by giving a reliable and low-cost mechanism for generating prediction models with high prediction accuracy. Hopefully, this study will help software organizations to use this tool not only to predict ESCP but also to predict the exact impact of SDPI.
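
    For reference, the COCOMO II effort equation that such tuning adjusts is PM = A * Size^E * prod(EM), with E = B + 0.01 * sum(SF). The sketch below uses the published COCOMO II.2000 calibration (A = 2.94, B = 0.91) with illustrative ratings; the paper's own ISF equation is not reproduced here.

      from math import prod

      A, B = 2.94, 0.91          # COCOMO II.2000 calibration constants

      def cocomo_effort(ksloc, scale_factors, effort_multipliers):
          """Person-months = A * Size^E * prod(EM), E = B + 0.01 * sum(SF).

          Tuning the model to local data, as the paper does, amounts to
          re-estimating these constants and the scale-factor ratings.
          """
          e = B + 0.01 * sum(scale_factors)
          return A * ksloc ** e * prod(effort_multipliers)

      # Illustrative inputs: five roughly nominal scale factors, and all 17
      # post-architecture effort multipliers left at nominal (1.0).
      sf = [3.72, 3.04, 4.24, 3.29, 4.68]
      print(f"{cocomo_effort(50, sf, [1.0] * 17):.1f} person-months for 50 KSLOC")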

  7. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung; Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in PLCs and FPGAs, which are used to develop I and C systems, should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and indicates that software hazard analysis should be performed across the software life cycle phases, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430, and it is a useful technique for applying guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants. In those studies, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We performed a software hazard analysis of an FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP with general guide words (GW). We also performed a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words, and is sufficiently applicable to analyzing the software requirements specification of an FPGA.

  8. Mapping modern software process engineering techniques onto an HEP development environment

    CERN Document Server

    Wellisch, J P

    2003-01-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means, in our context, to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement and continual improvement in the efficiency of the working environment, with the goal of facilitating doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within th...

  9. State-of-the-Art: Evolution of Software Life Cycle Process for NPPs

    International Nuclear Information System (INIS)

    Suh, Yong Suk; Park, Heui Youn; Son, Ki Sung; Lee, Ki Hyun; Kim, Hyeon Soo

    2007-01-01

    This paper investigates the evolution of the software life cycle process (SLCP) for nuclear power plants (NPPs) based on IEEE Std 7-4.3.2, which has been updated twice (in 1993 and 2003) since it was published in 1982, and on relevant software certifications. IEEE Std 7-4.3.2 specifies additional computer-specific requirements to supplement the criteria and requirements of IEEE Std 603. It also specifies the software quality requirements as follows: computer software shall be developed, modified, or accepted in accordance with an approved software quality assurance (QA) plan. IEEE Std 7-4.3.2-1982 specifies a minimum software development process as follows: plan, design and implementation. ANSI/ASME NQA-1-1979 is not directly related to the software development process but to overall quality assurance criteria. IEEE Std 7-4.3.2-1993 addresses ASME NQA-2a-1990 Part 2.7 for software development requirements. ASME NQA-2a-1990 Part 2.7, which was interpreted into KEPIC QAP-2 II.7, specifies the software development process in more detail, as follows: requirements, design, implementation, test, installation and checkout, operation and maintenance, and retirement. Along with this, the software QA plan is emphasized in IEEE Std 730-1989. In IEEE Std 7-4.3.2-2003, IEEE/EIA Std 12207.0-1996 replaces ASME NQA as a requirement for software development. The evolution of the SLCP from ASME NQA to IEEE/EIA Std 12207.0 is discussed in Section 2 of this paper. The publication of IEEE/EIA Std 12207.0 was motivated by industrial experiences and practices to promote the quality of software. In Section 3, three international software certifications relating to IEEE/EIA Std 12207.0 are introduced.

  10. A Process Framework for Designing Software Reference Architectures for Providing Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali; Probst, Christian W.

    2016-01-01

    Software Reference Architecture (SRA), which is a generic architecture solution for a specific type of software system, provides the foundation for the design of concrete architectures in terms of architecture design guidelines and architecture elements. The complexity and size of certain types of software systems need customized and systematic SRA design and evaluation methods. In this paper, we present a software Reference Architecture Design process Framework (RADeF) that can be used for analysis, design and evaluation of the SRA for provisioning of Tools as a Service as part of a cloud-enabled workSPACE (TSPACE). The framework is based on state-of-the-art results from the literature and our experiences with designing software architectures for cloud-based systems. We have applied RADeF to SRA design for two types of TSPACE: software architecting TSPACE and software implementation TSPACE...

  11. Automatic processing of gamma ray spectra employing classical and modified Fourier transform approach

    International Nuclear Information System (INIS)

    Rattan, S.S.; Madan, V.K.

    1994-01-01

    This report describes methods for the automatic processing of gamma-ray spectra acquired with HPGe detectors. The processing incorporates both classical and signal-processing approaches. The classical method is used for smoothing, detecting significant peaks, finding peak envelope limits (including a proposed method of finding peak limits), computing a peak significance index and full width at half maximum, and detecting doublets for further analysis. To facilitate the application of signal processing to nuclear spectra, Madan et al. gave a new classification of signals and identified nuclear spectra as Type II signals, mathematically formalized the modified Fourier transform, and pioneered its application to processing doublet envelopes acquired with modern spectrometers. This was extended to facilitate routine analysis of the spectra. A facility for energy and efficiency calibration is also included. The results obtained by analyzing observed gamma-ray spectra using the above approach compared favourably with those obtained with SAMPO and with those derived from the table of radioisotopes. (author). 15 refs., 3 figs., 3 tabs
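
    A toy example of the signal-processing view of a spectrum: low-pass filtering in the Fourier domain separates slowly varying peaks and background from statistical noise. This is plain FFT smoothing for illustration only, not the modified Fourier transform formalized by Madan et al.

      import numpy as np

      def fourier_smooth(spectrum, keep_fraction=0.05):
          """Keep only the lowest Fourier frequencies, which carry the peaks and
          background, and zero the rest, which is mostly counting noise."""
          coeffs = np.fft.rfft(spectrum)
          cutoff = int(len(coeffs) * keep_fraction)
          coeffs[cutoff:] = 0.0
          return np.fft.irfft(coeffs, n=len(spectrum))

      # Synthetic noisy spectrum: flat background plus one Gaussian peak.
      noisy = np.random.default_rng(1).poisson(
          100 + 80 * np.exp(-0.5 * ((np.arange(512) - 250) / 6.0) ** 2)).astype(float)
      smooth = fourier_smooth(noisy)
      print(noisy[245:256].round(0), smooth[245:256].round(1), sep="\n")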

  12. Generating Impact Maps from Automatically Detected Bomb Craters in Aerial Wartime Images Using Marked Point Processes

    Science.gov (United States)

    Kruse, Christian; Rottensteiner, Franz; Hoberg, Thorsten; Ziems, Marcel; Rebke, Julia; Heipke, Christian

    2018-04-01

    The aftermath of wartime attacks is often felt long after the war has ended, as numerous unexploded bombs may still exist in the ground. Typically, such areas are documented in so-called impact maps, which are based on the detection of bomb craters. This paper proposes a method for the automatic detection of bomb craters in aerial wartime images taken during the Second World War. The object model for the bomb craters is represented by ellipses. A probabilistic approach based on marked point processes determines the most likely configuration of objects within the scene. New object configurations are created by randomly adding objects to and removing objects from the current configuration, changing their positions, and modifying the ellipse parameters. Each configuration is evaluated using an energy function: high gradient magnitudes along the border of an ellipse are favored and overlapping ellipses are penalized. Reversible Jump Markov Chain Monte Carlo sampling in combination with simulated annealing provides the global energy optimum, which describes the conformance with a predefined model. For generating the impact map, a probability map is created from the automatic detections via kernel density estimation. By setting a threshold, areas around the detections are classified as contaminated or uncontaminated sites, respectively. Our results show the general potential of the method for the automatic detection of bomb craters and the automated generation of an impact map for a heterogeneous image stock.
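
    The last step, turning point detections into an impact map via kernel density estimation and a threshold, can be sketched directly with SciPy. The crater coordinates and the threshold below are invented for illustration; in the paper the detections come from the marked-point-process sampler.

      import numpy as np
      from scipy.stats import gaussian_kde

      # Hypothetical crater detections: (x, y) centres in metres, shape (2, N).
      rng = np.random.default_rng(42)
      craters = np.vstack([rng.normal(200, 40, 60), rng.normal(150, 30, 60)])

      kde = gaussian_kde(craters)                     # kernel density estimate

      # Evaluate the density on a grid and threshold it into an impact map.
      xs, ys = np.meshgrid(np.linspace(0, 400, 200), np.linspace(0, 300, 150))
      density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)
      contaminated = density > 0.25 * density.max()   # threshold is a free choice

      print(f"{contaminated.mean():.1%} of the area flagged as contaminated")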

  13. Process mining can be applied to software too!

    NARCIS (Netherlands)

    Rubin, V.A.; Mitsyuk, A.A.; Lomazova, I.A.; Aalst, van der W.M.P.

    2014-01-01

    Modern information systems produce tremendous amounts of event data. The area of process mining deals with extracting knowledge from this data. Real-life processes can be effectively discovered, analyzed and optimized with the help of mature process mining techniques. There is a variety of process

  14. Improving Video Game Development: Facilitating Heterogeneous Team Collaboration through Flexible Software Processes

    Science.gov (United States)

    Musil, Juergen; Schweda, Angelika; Winkler, Dietmar; Biffl, Stefan

    Based on our observations of Austrian video game software development (VGSD) practices, we identified a lack of systematic process/method support and inefficient collaboration between the various disciplines involved, i.e. engineers and artists. VGSD includes heterogeneous disciplines, e.g. creative arts, game/content design, and software. Nevertheless, improving team collaboration and process support is an ongoing challenge in enabling a comprehensive view of game development projects. Lessons learned from software engineering practices can help game developers to improve game development processes within a heterogeneous environment. Based on a state-of-the-practice survey in the Austrian games industry, this paper presents (a) first results with a focus on process/method support and (b) a candidate flexible process approach based on Scrum to improve VGSD and team collaboration. Results showed (a) a trend towards highly flexible software processes involving various disciplines and (b) that the suggested flexible process approach is feasible and useful for project application.

  15. Semi-automatic software based detection of atrial fibrillation in acute ischaemic stroke and transient ischaemic attack

    DEFF Research Database (Denmark)

    Nickelsen, M N; Snoer, A; Ali, A M

    2017-01-01

    (12.6%) patients were diagnosed with PAF (atrial fibrillation > 30 s). Pathfinder SL software, including a systematic review of registered events, detected 16 (94.1%) patients with PAF. Manual review detected 15 (88.2%) patients with PAF. Pathfinder SL had a negative predictive value of 99% and sensitivity

  16. Adjustment of automatic control systems of production facilities at coal processing plants using multivariant physico- mathematical models

    Science.gov (United States)

    Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.

    2016-10-01

    The structure of multi-variant physical and mathematical models of a control system is presented, together with its application to the adjustment of automatic control systems (ACS) of production facilities, using the example of a coal processing plant.

  17. A fully automatic processing chain to produce Burn Scar Mapping products, using the full Landsat archive over Greece

    Science.gov (United States)

    Kontoes, Charalampos; Papoutsis, Ioannis; Herekakis, Themistoklis; Michail, Dimitrios; Ieronymidi, Emmanuela

    2013-04-01

    Remote sensing tools for the accurate, robust and timely assessment of the damage inflicted by forest wildfires provide information that is of paramount importance to public environmental agencies and related stakeholders before, during and after a crisis. The Institute for Astronomy, Astrophysics, Space Applications and Remote Sensing of the National Observatory of Athens (IAASARS/NOA) has developed a fully automatic single- and/or multi-date processing chain that takes as input archived Landsat 4, 5 or 7 raw images and produces precise diachronic burnt-area polygons and damage assessments over the Greek territory. The methodology consists of three fully automatic stages: 1) the pre-processing stage, where the metadata of the raw images are extracted, followed by the application of the LEDAPS software platform for calibration and mask production, and of the Automated Precise Orthorectification Package, developed by NASA, for image geo-registration and orthorectification; 2) the core BSM (Burn Scar Mapping) processing stage, which incorporates a published classification algorithm based on a series of physical indices, the application of two graph-based filters for noise removal, and the grouping of pixels classified as burnt into clusters before conversion from raster to vector; and 3) the post-processing stage, where the products are thematically refined and enriched using auxiliary GIS layers (underlying land cover/use, administrative boundaries, etc.) and human logic/evidence to suppress false alarms and omission errors. The established processing chain has been successfully applied to the entire archive of Landsat imagery over Greece spanning from 1984 to 2012, which has been collected and managed at IAASARS/NOA. A total of 415 full Landsat frames were processed in the framework of the study. These burn scar mapping products are generated for the first time to such a temporal and spatial
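
    One classical "physical index" usable for burnt-area classification is the Normalized Burn Ratio computed from Landsat's NIR and SWIR bands. The paper's exact index set is not specified above, so the following NumPy sketch, with toy reflectances and a commonly cited dNBR threshold, is purely illustrative.

      import numpy as np

      def nbr(nir, swir):
          """Normalized Burn Ratio from Landsat TM/ETM+ NIR (band 4) and SWIR (band 7)."""
          return (nir - swir) / (nir + swir + 1e-9)

      # Toy pre-/post-fire reflectances; burnt pixels lose NIR and gain SWIR.
      pre_nir,  pre_swir  = np.array([0.40, 0.38]), np.array([0.10, 0.11])
      post_nir, post_swir = np.array([0.15, 0.37]), np.array([0.25, 0.12])

      dnbr = nbr(pre_nir, pre_swir) - nbr(post_nir, post_swir)
      burnt = dnbr > 0.27          # a commonly cited dNBR severity threshold
      print(dnbr.round(2), burnt)  # first pixel burnt, second unchanged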

  18. Comparison of automatic quantification software for the measurement of ventricular volume and ejection fraction in gated myocardial perfusion SPECT

    International Nuclear Information System (INIS)

    Van Staden, J.A.; Herbst, C.P.; Du Raan, H.; Lotter, M.G.; Otto, A.C.

    2004-01-01

    Full text: Introduction: Gated myocardial perfusion SPECT has been used to calculate left ventricular ejection fraction (LVEF) and left ventricular end-diastolic volume (LVEDV), and has correlated well with conventional methods. However, the comparative accuracy of, and correlations across, various types of gated SPECT software are not well understood. Materials and methods: Twelve patients underwent a radionuclide gated blood-pool (GBP) study in addition to 99mTc-sestamibi gated SPECT. Three different software algorithms, Quantitative Gated SPECT (QGS) from Cedars-Sinai, MultiDim from Stanford University Medical School and GQUANT from Alfa Nuclear, were used to compute LVEF and LVEDV. These software algorithms operate in 3-dimensional space, two depending on surface detection and the other on statistical parameters. The LVEFs calculated from gated SPECT myocardial perfusion images were compared with LVEFs calculated from the GBP studies in the same patients to assess the accuracy of the three software algorithms. Results: The software success rate was 92% (11/12 patients) for MultiDim and 100% for QGS and GQUANT. Agreement between LVEF measured with MultiDim and QGS, MultiDim and GQUANT, and QGS and GQUANT was excellent (LVEF_MultiDim = 0.80 LVEF_QGS + 5.02, r = 0.93; LVEF_GQUANT = 1.10 LVEF_MultiDim - 1.33, r = 0.90; and LVEF_GQUANT = 1.02 LVEF_QGS - 1.40, r = 0.96). The correlation coefficient for LVEF between gated SPECT and the GBP study was 0.95, 0.95 and 0.97 for MultiDim, GQUANT and QGS, respectively. Conclusion: All three software programs showed good correlation between LVEF from gated SPECT and from the GBP study. Good agreement for LVEF was also observed between the three software algorithms. However, because each method has unique characteristics that depend on its specific algorithm and thus behaves differently in various patients, the methods should not be used interchangeably. (author)
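
    All such packages ultimately report LVEF = 100 x (EDV - ESV) / EDV. The sketch below computes it and reproduces the style of the between-method regressions quoted above, using invented numbers rather than the study's data.

      import numpy as np

      def lvef(edv_ml, esv_ml):
          """Left ventricular ejection fraction (%) from end-diastolic and
          end-systolic volumes."""
          return 100.0 * (edv_ml - esv_ml) / edv_ml

      print(f"{lvef(120, 50):.0f}%")   # -> 58%

      # Method-agreement regression in the style of the abstract (toy data only).
      qgs    = np.array([35, 48, 55, 62, 70], dtype=float)
      gquant = 1.02 * qgs - 1.40 + np.random.default_rng(1).normal(0, 1.5, 5)
      slope, intercept = np.polyfit(qgs, gquant, 1)
      r = np.corrcoef(qgs, gquant)[0, 1]
      print(f"LVEF_GQUANT = {slope:.2f} * LVEF_QGS {intercept:+.2f}, r = {r:.2f}")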

  19. Image processing applied to automatic detection of defects during ultrasonic examination

    International Nuclear Information System (INIS)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to ultrasonic BSCAN images obtained in the field of non-destructive testing of welds. The goal is to define what image processing techniques can contribute to improving the exploitation of the collected data and, more precisely, what image processing can do to extract the meaningful echoes that make it possible to characterize and size the defects. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specificities of the propagation of ultrasonic waves in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed, based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single representation, the relations between the amplitudes of pairs of pixels. From the matrix analysis, a new complete and automatic method has been devised to define a threshold that separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. Complete validation has been done with standard pieces
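
    The co-occurrence matrix at the heart of the method is available off the shelf. This sketch computes it for a toy speckle image with one bright echo using scikit-image (graycomatrix; spelled greycomatrix in releases before 0.19); the thesis's derivation of the noise/echo threshold from the matrix is not reproduced here.

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      # Toy 8-bit "BSCAN" patch: speckle noise with a bright horizontal echo.
      rng = np.random.default_rng(0)
      img = rng.integers(0, 64, size=(64, 64), dtype=np.uint8)
      img[30:33, 10:50] = 200                      # defect echo

      # Co-occurrence of grey-level pairs at distance 1, horizontally and vertically.
      glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                          levels=256, symmetric=True, normed=True)

      # Texture statistics derived from the matrix; a threshold could be set where
      # the echo population separates from the noise population.
      print("contrast:", graycoprops(glcm, "contrast").ravel())
      print("energy:  ", graycoprops(glcm, "energy").ravel())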

  20. An open software system based on X Windows for process control and equipment monitoring

    International Nuclear Information System (INIS)

    Aimar, A.; Carlier, E.; Mertens, V.

    1992-01-01

    The construction and application of a configurable open software system for process control and equipment monitoring can speed up and simplify the development and maintenance of equipment-specific software, as compared to individual solutions. The present paper reports the status of such an approach for the distributed control systems of the SPS and LEP beam transfer components, based on X Windows and the OSF/Motif toolkit and applying data modeling and software engineering methods. (author)