WorldWideScience

Sample records for software correction stabilizatsiya (Russian: "stabilization")

  1. Incremental Interactive Verification of the Correctness of Object-Oriented Software

    DEFF Research Database (Denmark)

    Mehnert, Hannes

Development of correct object-oriented software is difficult, in particular if a formalised proof of its correctness is demanded. A lot of current software is developed using the object-oriented programming paradigm. This paradigm compensated for safety and security issues with imperative...... structure. For efficiency, our implementation uses copy-on-write and shared mutable data, not observable by a client. I further use this data structure to verify the correctness of a solution to the point location problem. The results demonstrate that I am able to verify the correctness of object-oriented...... programming, such as manual memory management. Popularly used integrated development environments (IDEs) provide features such as debugging and unit testing to facilitate development of robust software, but hardly any development environment supports the development of provably correct software. A tight...

  2. Software for Correcting the Dynamic Error of Force Transducers

    Directory of Open Access Journals (Sweden)

    Naoki Miyashita

    2014-07-01

Software which corrects the dynamic error of force transducers in impact force measurements using their own output signal has been developed. The software corrects the output waveform of the transducers using the output waveform itself, estimates its uncertainty and displays the results. In the experiment, the dynamic errors of three transducers of the same model are evaluated using the Levitation Mass Method (LMM), in which the impact forces applied to the transducers are accurately determined as the inertial force of the moving part of the aerostatic linear bearing. The parameters for correcting the dynamic error are determined from the results of one set of impact measurements of one transducer. Then, the validity of the obtained parameters is evaluated using the results of the other sets of measurements of all three transducers. The uncertainties in the uncorrected force and those in the corrected force are also estimated. If manufacturers determine the correction parameters for each model using the proposed method, and provide the software with the parameters corresponding to each model, then users can obtain the waveform corrected against dynamic error and its uncertainty. The present status and the future prospects of the developed software are discussed in this paper.
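The correction described in this record can be sketched by inverting an assumed second-order sensor model; this is a minimal illustration, not the authors' implementation. The natural frequency `wn` and damping ratio `zeta` stand in for the model-specific correction parameters the paper identifies from LMM impact tests:

```python
# Hedged sketch: correcting a force transducer's dynamic error by
# inverting an assumed second-order sensor model
#   y'' + 2*zeta*wn*y' + wn**2 * y = wn**2 * f(t)
# so the corrected force is
#   f(t) = y(t) + (2*zeta/wn)*y'(t) + y''(t)/wn**2
# Derivatives are estimated with central finite differences.

def correct_dynamic_error(y, dt, wn, zeta):
    """Return the dynamically corrected force waveform.

    y    : sampled transducer output (list of floats)
    dt   : sampling interval in seconds
    wn   : natural angular frequency of the transducer (rad/s)
    zeta : damping ratio (wn and zeta play the role of the
           'correction parameters' identified from impact tests)
    """
    n = len(y)
    f = [0.0] * n
    for i in range(1, n - 1):
        dy = (y[i + 1] - y[i - 1]) / (2 * dt)            # first derivative
        d2y = (y[i + 1] - 2 * y[i] + y[i - 1]) / dt**2   # second derivative
        f[i] = y[i] + (2 * zeta / wn) * dy + d2y / wn**2
    f[0], f[-1] = f[1], f[-2]  # pad the endpoints
    return f
```

For a steady (static) signal both derivative terms vanish, so the correction leaves the waveform unchanged, as it should.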

  3. New Software to Help EFL Students Self-Correct Their Writing

    Science.gov (United States)

    Lawley, Jim

    2015-01-01

This paper describes the development of web-based software at a university in Spain to help students of EFL self-correct their free-form writing. The software makes use of an eighty-million-word corpus of English known to be correct as a normative corpus for error correction purposes. It was discovered that bigrams (two-word combinations)…
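The bigram idea can be sketched as follows; the tiny corpus and the threshold are invented for illustration, whereas the real system uses the eighty-million-word normative corpus described above:

```python
# Hedged sketch of bigram-based error flagging against a normative corpus.
from collections import Counter

def build_bigram_counts(corpus_sentences):
    """Count adjacent word pairs in the normative corpus."""
    counts = Counter()
    for sentence in corpus_sentences:
        words = sentence.lower().split()
        counts.update(zip(words, words[1:]))
    return counts

def flag_unattested_bigrams(text, counts, min_count=1):
    """Return bigrams in the learner's text that the normative corpus
    does not attest often enough -- candidates for self-correction."""
    words = text.lower().split()
    return [bg for bg in zip(words, words[1:]) if counts[bg] < min_count]

corpus = ["she is interested in music", "he is interested in art"]
counts = build_bigram_counts(corpus)
print(flag_unattested_bigrams("she is interested on music", counts))
# -> [('interested', 'on'), ('on', 'music')]
```

A learner seeing both flagged pairs can infer that "on" is the likely culprit, since it appears in both.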

  4. Improvements for Optics Measurement and Corrections software

    CERN Document Server

    Bach, T

    2013-01-01

This note presents the improvements made to the OMC (Optics Measurement and Correction) software during a 14-month technical student internship at CERN. The goal of the work was to improve existing software in terms of maintainability, features and performance. Significant improvements in stability, speed and the overall development process were achieved. The main software, a Java GUI at the LHC CCC, ran for months without noteworthy problems. The overall running time of the software chain used for optics corrections was reduced from nearly half an hour to around two minutes. This was the result of analysing and improving several of the programs and algorithms involved.

  5. Teaching and Assessment of Mathematical Principles for Software Correctness Using a Reasoning Concept Inventory

    Science.gov (United States)

    Drachova-Strang, Svetlana V.

    2013-01-01

    As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for…

  6. Software correction of scatter coincidence in positron CT

    International Nuclear Information System (INIS)

    Endo, M.; Iinuma, T.A.

    1984-01-01

    This paper describes a software correction of scatter coincidence in positron CT which is based on an estimation of scatter projections from true projections by an integral transform. Kernels for the integral transform are projected distributions of scatter coincidences for a line source at different positions in a water phantom and are calculated by Klein-Nishina's formula. True projections of any composite object can be determined from measured projections by iterative applications of the integral transform. The correction method was tested in computer simulations and phantom experiments with Positologica. The results showed that effects of scatter coincidence are not negligible in the quantitation of images, but the correction reduces them significantly. (orig.)
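The iterative correction scheme of this record can be sketched in miniature: measured projections are modelled as m = t + S(t), where S convolves the true projection t with a precomputed scatter kernel, and t is recovered by repeated application of the transform. The toy kernel below stands in for the Klein-Nishina-derived kernels of the paper:

```python
# Hedged sketch of iterative scatter correction: t_{k+1} = m - S(t_k),
# starting from t_0 = m. Converges when the scatter fraction is < 1.
def convolve(signal, kernel):
    """Discrete convolution with a centred kernel (zero padding)."""
    half = len(kernel) // 2
    out = [0.0] * len(signal)
    for i in range(len(signal)):
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                out[i] += signal[idx] * k
    return out

def scatter_correct(measured, kernel, iterations=5):
    true_est = list(measured)                 # start from the measured data
    for _ in range(iterations):
        scatter = convolve(true_est, kernel)  # estimate scatter S(t_k)
        true_est = [m - s for m, s in zip(measured, scatter)]
    return true_est
```

Each iteration shrinks the error by roughly the total scatter fraction (the kernel sum), so a handful of passes suffices for typical scatter levels.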

  7. Software Package for Optics Measurement and Correction in the LHC

    CERN Document Server

    Aiba, M; Tomas, R; Vanbavinckhove, G

    2010-01-01

A software package has been developed for LHC on-line optics measurement and correction. This package includes several different algorithms to measure phase advance, beta functions, dispersion, coupling parameters and even some non-linear terms. A Graphical User Interface provides visualization tools to compare measurements to model predictions, fit analytical formulas, localize error sources, and compute and send corrections to the hardware.

  8. Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data

    Science.gov (United States)

Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.

    2013-01-01

Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Model data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation. Previous software did not achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
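The combination of horizontal interpolation with terrain awareness can be sketched as below. This is a simplified stand-in for the paper's algorithm: inverse-distance weighting for the horizontal part, and an assumed exponential decay of water vapour with a 2 km scale height for the vertical part (both constants invented for illustration):

```python
import math

# Hedged sketch of terrain-aware interpolation of GPS zenith wet delays (ZWD).
SCALE_HEIGHT_M = 2000.0  # assumed water-vapour scale height

def zwd_at_pixel(px, py, ph, stations):
    """Interpolate ZWD at pixel (px, py) with elevation ph.

    stations: list of (x, y, height_m, zwd_m) tuples from the GPS network.
    """
    num = den = 0.0
    for sx, sy, sh, zwd in stations:
        # refer each station's delay to the pixel's height first,
        # so that stations at different elevations are comparable
        zwd_at_ph = zwd * math.exp(-(ph - sh) / SCALE_HEIGHT_M)
        # inverse-distance weighting in the horizontal plane
        w = 1.0 / ((px - sx) ** 2 + (py - sy) ** 2 + 1e-9)
        num += w * zwd_at_ph
        den += w
    return num / den
```

Scaling each station's delay to the pixel's own elevation before averaging is what prevents valley stations from biasing the delay estimated for a mountaintop pixel.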

  9. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

The work highlights the most important principles of software reliability management (SRM). The SRM concept forms a basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. The method applies a new metric to evaluate requirements complexity and a double sorting technique evaluating the priority and complexity of each particular requirement. The method makes it possible to improve requirements correctness through identification of a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded a tangible technical and economic effect.
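The double sorting idea can be sketched as a two-key ordering: requirements are ranked first by priority and, within equal priorities, by a complexity metric, so that a fixed review budget is spent where the most defects are expected. The field names and the complexity values below are invented for illustration:

```python
# Hedged sketch of "double sorting" of requirements for review.
def rank_for_review(requirements):
    # higher priority first; among equal priorities, more complex first
    return sorted(requirements,
                  key=lambda r: (-r["priority"], -r["complexity"]))

reqs = [
    {"id": "R1", "priority": 1, "complexity": 8},
    {"id": "R2", "priority": 2, "complexity": 3},
    {"id": "R3", "priority": 2, "complexity": 9},
]
print([r["id"] for r in rank_for_review(reqs)])  # -> ['R3', 'R2', 'R1']
```

Reviewing from the top of this list concentrates effort on the high-priority, high-complexity requirements the method predicts to be most defect-prone.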

  10. A software-based x-ray scatter correction method for breast tomosynthesis

    OpenAIRE

    Jia Feng, Steve Si; Sechopoulos, Ioannis

    2011-01-01

    Purpose: To develop a software-based scatter correction method for digital breast tomosynthesis (DBT) imaging and investigate its impact on the image quality of tomosynthesis reconstructions of both phantoms and patients.

  11. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptance of current tools, we aim to help developers select what to…

  12. An open-source software program for performing Bonferroni and related corrections for multiple comparisons

    Directory of Open Access Journals (Sweden)

    Kyle Lesack

    2011-01-01

Increased type I error resulting from multiple statistical comparisons remains a common problem in the scientific literature. This may result in the reporting and promulgation of spurious findings. One approach to this problem is to correct groups of P-values for "family-wide significance" using a Bonferroni correction or the less conservative Bonferroni-Holm correction, or to correct for the "false discovery rate" with a Benjamini-Hochberg correction. Although several solutions are available for performing these corrections through commercially available software, there are no widely available, easy-to-use open-source programs to perform these calculations. In this paper we present an open-source program written in Python 3.2 that performs calculations for the standard Bonferroni, Bonferroni-Holm and Benjamini-Hochberg corrections.
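The three corrections named above are standard and compact enough to show in full, here as adjusted p-values in plain Python (this is a generic sketch, not the program's actual source):

```python
# Bonferroni, Bonferroni-Holm, and Benjamini-Hochberg adjusted p-values.
def bonferroni(pvals):
    m = len(pvals)
    return [min(1.0, p * m) for p in pvals]

def holm(pvals):
    """Step-down Holm adjustment: multiply by (m - rank), enforce monotonicity."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adj = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adj[i] = min(1.0, running_max)
    return adj

def benjamini_hochberg(pvals):
    """Step-up BH adjustment: p * m / rank, enforced monotone from the top."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i], reverse=True)
    adj = [0.0] * m
    running_min = 1.0
    for step, i in enumerate(order):
        rank = m - step                      # rank of this p-value (1 = smallest)
        running_min = min(running_min, pvals[i] * m / rank)
        adj[i] = running_min
    return adj

print(bonferroni([0.01, 0.04]))  # -> [0.02, 0.08]
```

Note how Holm and BH are less conservative than plain Bonferroni for the larger p-values while agreeing on the smallest one.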

  13. Robust recurrent neural network modeling for software fault detection and correction prediction

    International Nuclear Information System (INIS)

    Hu, Q.P.; Xie, M.; Ng, S.H.; Levitin, G.

    2007-01-01

Software fault detection and correction processes are related although different, and they should be studied together. A practical approach is to apply software reliability growth models to model fault detection, with the fault correction process assumed to be a delayed process. On the other hand, the artificial neural network model, as a data-driven approach, tries to model these two processes together with no assumptions. Specifically, feedforward backpropagation networks have shown their advantages over analytical models in fault number predictions. In this paper, the following approach is explored. First, recurrent neural networks are applied to model these two processes together. Within this framework, a systematic network configuration approach is developed with a genetic algorithm according to the prediction performance. In order to provide robust predictions, an extra factor characterizing the dispersion of prediction repetitions is incorporated into the performance function. Comparisons with feedforward neural networks and analytical models are presented for a real data set.

  14. 77 FR 4582 - Certain Mobile Devices and Related Software Corrected Notice of Request for Statements on the...

    Science.gov (United States)

    2012-01-30

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-750] Certain Mobile Devices and Related Software Corrected Notice of Request for Statements on the Public Interest AGENCY: U.S. International Trade Commission. ACTION: Correction to Notice. SUMMARY: This Notice corrects the notice in the same matter...

  15. A software-based x-ray scatter correction method for breast tomosynthesis

    International Nuclear Information System (INIS)

    Jia Feng, Steve Si; Sechopoulos, Ioannis

    2011-01-01

Purpose: To develop a software-based scatter correction method for digital breast tomosynthesis (DBT) imaging and investigate its impact on the image quality of tomosynthesis reconstructions of both phantoms and patients. Methods: A Monte Carlo (MC) simulation of x-ray scatter, with geometry matching that of the cranio-caudal (CC) view of a DBT clinical prototype, was developed using the Geant4 toolkit and used to generate maps of the scatter-to-primary ratio (SPR) of a number of homogeneous standard-shaped breasts of varying sizes. Dimension-matched SPR maps were then deformed and registered to DBT acquisition projections, allowing for the estimation of the primary x-ray signal acquired by the imaging system. Noise filtering of the estimated projections was then performed to reduce the impact of the quantum noise of the x-ray scatter. Three dimensional (3D) reconstruction was then performed using the maximum likelihood-expectation maximization (MLEM) method. This process was tested on acquisitions of a heterogeneous 50/50 adipose/glandular tomosynthesis phantom with embedded masses, fibers, and microcalcifications and on acquisitions of patients. The image quality of the reconstructions of the scatter-corrected and uncorrected projections was analyzed by studying the signal-difference-to-noise ratio (SDNR), the integral of the signal in each mass lesion (integrated mass signal, IMS), and the modulation transfer function (MTF). Results: The reconstructions of the scatter-corrected projections demonstrated superior image quality. The SDNR of masses embedded in a 5 cm thick tomosynthesis phantom improved by 60%-66%, while the SDNR of the smallest mass in an 8 cm thick phantom improved by 59% (p < 0.01). The IMS of the masses in the 5 cm thick phantom also improved by 15%-29%, while the IMS of the masses in the 8 cm thick phantom improved by 26%-62% (p < 0.01). Some embedded microcalcifications in the tomosynthesis phantoms were visible only in the scatter-corrected…

  16. 'TrueCoinc' software utility for calculation of the true coincidence correction

    International Nuclear Information System (INIS)

    Sudar, S.

    2002-01-01

The true coincidence correction plays an important role in the overall accuracy of γ-ray spectrometry, especially in the case of present-day high-volume detectors. The calculation of true coincidence corrections needs detailed nuclear structure information. These data are now available in computerized form from the Nuclear Data Centers through the Internet or on a CD-ROM of the Table of Isotopes. The aim has been to develop software for this calculation, using available databases for the level data. The user has to supply only the parameters of the detector to be used. The new computer program runs under the Windows 95/98 operating system. In the framework of the project a new formula was prepared for calculating the summing-out correction and the intensity of alias lines (sum peaks). The file converter for reading ENSDF-2-type files was completed. Reading and converting the original ENSDF was added to the program. A computer-accessible database of X-ray energies and intensities was created. The X-ray emissions were taken into account in the 'summing out' calculation. Calculation of the true coincidence 'summing in' correction was done. The output was arranged to show the two types of corrections independently and to calculate the final correction as the product of the two. A minimal intensity threshold can be set to show the final list only for the strongest lines. The calculation takes into account all the transitions, independently of the threshold. The program calculates the intensity of X rays (K, L lines). The true coincidence corrections for X rays were calculated. The intensities of the alias γ lines were calculated. (author)

  17. Software-controlled, highly automated intrafraction prostate motion correction with intrafraction stereographic targeting: System description and clinical results

    International Nuclear Information System (INIS)

    Mutanga, Theodore F.; Boer, Hans C. J. de; Rajan, Vinayakrishnan; Dirkx, Maarten L. P.; Os, Marjolein J. H. van; Incrocci, Luca; Heijmen, Ben J. M.

    2012-01-01

Purpose: A new system for software-controlled, highly automated correction of intrafraction prostate motion, "intrafraction stereographic targeting" (iSGT), is described and evaluated. Methods: At our institute, daily prostate positioning is routinely performed at the start of treatment beam using stereographic targeting (SGT). iSGT was implemented by extension of the SGT software to facilitate fast and accurate intrafraction motion corrections with minimal user interaction. iSGT entails megavoltage (MV) image acquisitions with the first segment of selected IMRT beams, automatic registration of implanted markers, followed by remote couch repositioning to correct for intrafraction motion above a predefined threshold, prior to delivery of the remaining segments. For a group of 120 patients, iSGT with corrections for two nearly lateral beams was evaluated in terms of workload and impact on effective intrafraction displacements in the sagittal plane. Results: SDs of systematic (Σ) and random (σ) displacements relative to the planning CT, measured directly after the initial SGT setup correction, were reduced by iSGT to effective values below 0.7 mm, requiring corrections in 82.4% of the fractions. Because iSGT is highly automated, the extra time added by iSGT is <30 s if a correction is required. Conclusions: Without increasing imaging dose, iSGT successfully reduces intrafraction prostate motion with minimal workload and increase in fraction time. An action level of 2 mm is recommended.
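The decision step at the heart of iSGT can be sketched as a simple threshold rule: correct the couch only when the marker-derived displacement exceeds the action level (the abstract recommends 2 mm). The function name and data layout are invented for illustration:

```python
# Hedged sketch of the iSGT correct-if-above-threshold decision.
ACTION_LEVEL_MM = 2.0  # action level recommended in the abstract

def couch_correction(displacement_mm):
    """displacement_mm: (x, y, z) prostate displacement from MV imaging.

    Returns the remote couch shift to apply, or None if within tolerance.
    """
    if max(abs(d) for d in displacement_mm) <= ACTION_LEVEL_MM:
        return None                                 # deliver remaining segments as-is
    return tuple(-d for d in displacement_mm)       # shift couch to cancel the motion

print(couch_correction((0.5, 1.2, -0.8)))  # -> None
print(couch_correction((0.5, 3.1, -0.8)))  # -> (-0.5, -3.1, 0.8)
```

Keeping the no-correction path fast is what lets the system add under 30 s only in the fractions that actually need a shift.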

  18. Software Maintenance and Evolution: The Implication for Software ...

    African Journals Online (AJOL)

Software Maintenance and Evolution: The Implication for Software Development. ... Software maintenance is the process of modifying existing operational software by correcting errors, ...

  19. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect such that new faults may be introduced. In this paper, we first show how to incorporate testing effort function and fault introduction into FDP and then develop FCP as delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
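The paired-process idea can be illustrated with the simplest member of the family: detection follows a testing-effort-dependent exponential (GO-type) curve, and correction is the same curve delayed by a constant lag. The parameter values and the linear effort function are invented for illustration; the paper develops richer pairings with fault introduction:

```python
import math

# Hedged sketch of a paired fault detection / correction model.
def detected(t, a=100.0, b=0.05, effort=lambda t: t):
    """Expected faults detected by time t.

    a        : total fault content
    b        : detection rate per unit testing effort
    effort   : cumulative testing-effort function W(t)
    """
    return a * (1.0 - math.exp(-b * effort(t)))

def corrected(t, delay=5.0, **kw):
    """Correction modelled as detection delayed by a fixed correction lag."""
    return detected(t - delay, **kw) if t > delay else 0.0
```

Replacing the constant `delay` with an effort-dependent correction time is the kind of refinement the abstract describes.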

  20. A New Method to Detect and Correct the Critical Errors and Determine the Software-Reliability in Critical Software-System

    International Nuclear Information System (INIS)

    Krini, Ossmane; Börcsök, Josef

    2012-01-01

In order to use electronic systems comprising software and hardware components in safety-related and highly safety-related applications, it is necessary to meet the marginal risk numbers required by standards and legislative provisions. Existing processes and mathematical models are used to verify the risk numbers. On the hardware side, various accepted mathematical models, processes, and methods exist to provide the required proof. To this day, however, there are no closed models or mathematical procedures known that allow for a dependable prediction of software reliability. This work presents a method that makes a prognosis on the residual number of critical errors in software. Conventional models lack this ability, and right now there are no methods that forecast critical errors. The new method will show that an estimate of the residual number of critical errors in software systems is possible by using a combination of prediction models, a ratio of critical errors, and the total error number. Subsequently, the critical expected-value function at any point in time can be derived from the new solution method, provided the detection rate has been calculated using an appropriate estimation method. The presented method also makes it possible to estimate the critical failure rate. The approach is modelled on a real process and therefore describes two essential processes: detection and correction.
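The combination described above can be sketched in its simplest form: a standard growth model predicts the total residual error count, and an observed ratio of critical to total errors scales it down to a critical-error estimate. The model form and all numbers are invented for illustration:

```python
import math

# Hedged sketch: critical residual errors = critical ratio * total residual.
def residual_errors(t, a=200.0, b=0.02):
    """Total residual errors at time t under an exponential growth model
    (a = initial error content, b = detection rate)."""
    return a * math.exp(-b * t)

def critical_residual(t, critical_ratio=0.1, **kw):
    """Estimated residual *critical* errors: ratio times the total."""
    return critical_ratio * residual_errors(t, **kw)
```

The ratio would in practice be estimated from the error classification history of the project, not assumed constant as here.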

  1. Reliable software systems via chains of object models with provably correct behavior

    International Nuclear Information System (INIS)

    Yakhnis, A.; Yakhnis, V.

    1996-01-01

This work addresses specification and design of reliable safety-critical systems, such as nuclear reactor control systems. Reliability concerns are addressed in complementary fashion by different fields. Reliability engineers build software reliability models, etc. Safety engineers focus on prevention of potential harmful effects of systems on the environment. Software/hardware correctness engineers focus on production of reliable systems on the basis of mathematical proofs. The authors think that correctness may be a crucial guiding issue in the development of reliable safety-critical systems. However, purely formal approaches are not adequate for the task, because they neglect the connection with the informal customer requirements. They address this as follows. First, on the basis of the requirements, they build a model of the system's interactions with the environment, where the system is viewed as a black box. They will provide foundations for automated tools which will (a) demonstrate to the customer that all of the scenarios of system behavior are present in the model, (b) uncover scenarios not present in the requirements, and (c) uncover inconsistent scenarios. The developers will work with the customer until the black box model no longer possesses scenarios of types (b) and (c) above. Second, the authors will build a chain of several increasingly detailed models, where the first model is the black box model and the last model serves to automatically generate proven executable code. The behavior of each model will be proved to conform to the behavior of the previous one. They build each model as a cluster of interactive concurrent objects, thus allowing both top-down and bottom-up development.

  2. Graph-based software specification and verification

    NARCIS (Netherlands)

    Kastenberg, H.

    2008-01-01

    The (in)correct functioning of many software systems heavily influences the way we qualify our daily lives. Software companies as well as academic computer science research groups spend much effort on applying and developing techniques for improving the correctness of software systems. In this

  3. Plateletpheresis efficiency and mathematical correction of software-derived platelet yield prediction: A linear regression and ROC modeling approach.

    Science.gov (United States)

    Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David

    2017-10-01

Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed cell processor accuracy of predicted platelet (PLT) yields with the goal of better prediction of DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield and actual PLT yield were statistically evaluated. Software prediction was optimized by linear regression analysis and its optimal cut-off to obtain a DP assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; donors were men on 271 (89.7%) occasions and women on 31 (10.3%). Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486). Simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
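The correction step can be sketched as an ordinary least-squares fit of actual yield against software-predicted yield, after which the fitted line corrects new predictions. The four data points are invented; the study fit this relationship on 302 real procedures:

```python
# Hedged sketch of the linear-regression correction of predicted yields.
def fit_line(xs, ys):
    """Ordinary least squares: returns (intercept a, slope b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

def corrected_yield(predicted, a, b):
    return a + b * predicted

preds  = [3.0, 4.0, 5.0, 6.0]   # software-predicted yields (x 1e11 PLT)
actual = [3.4, 4.5, 5.6, 6.7]   # measured yields
a, b = fit_line(preds, actual)
print(round(corrected_yield(5.0, a, b), 2))  # -> 5.6
```

A ROC curve over the corrected values would then supply the cut-off above which a double product can be reliably predicted.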

  4. Selection and optimization of spectrometric amplifiers for gamma spectrometry: part II - linearity, live time correction factors and software

    International Nuclear Information System (INIS)

    Moraes, Marco Antonio Proenca Vieira de; Pugliesi, Reinaldo

    1996-01-01

The objective of the present work was to establish simple criteria for choosing the best combination of electronic modules to achieve an adequate high-resolution gamma spectrometer. Linearity, live time correction factors and software of a gamma spectrometric system composed of an HPGe detector have been studied using several kinds of spectrometric amplifiers: Canberra 2021, Canberra 2025, Ortec 673 and Tennelec 244, and the Ortec and Nucleus MCA cards. The results showed low values of integral non-linearity for all spectrometric amplifiers connected to the Ortec and Nucleus boards. The MCA card should be able to correct amplifier dead time at 17 kcps count rates. (author)

  5. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS, ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves, iii) interpolate correction factors from look-up tables, iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file, v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
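Steps iii) and iv) above can be sketched as follows: a correction factor is interpolated from a look-up table indexed by the MLC-shielded beam-on fraction, and the corrected image is the pixel-wise product of the original prediction and those factors. The look-up values are invented for illustration:

```python
# Hedged sketch of the look-up/interpolate/multiply correction step.
def interpolate(x, table):
    """Linear interpolation in a sorted (x, factor) look-up table."""
    if x <= table[0][0]:
        return table[0][1]
    for (x0, f0), (x1, f1) in zip(table, table[1:]):
        if x <= x1:
            return f0 + (f1 - f0) * (x - x0) / (x1 - x0)
    return table[-1][1]

def corrected_pdip(pdip, shielded_fraction, lut):
    """pdip and shielded_fraction are same-shaped 2-D lists (step iv)."""
    return [[p * interpolate(s, lut)
             for p, s in zip(prow, srow)]
            for prow, srow in zip(pdip, shielded_fraction)]

lut = [(0.0, 1.00), (0.5, 0.95), (1.0, 0.90)]   # shielded fraction -> factor
print(corrected_pdip([[100.0]], [[0.5]], lut))  # -> [[95.0]]
```

In the real tool the shielded fraction per pixel comes from the MLC file analysis of step ii), and the look-up tables are measured for the specific EPID.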

  6. Evaluation of three methods for retrospective correction of vignetting on medical microscopy images utilizing two open source software tools.

    Science.gov (United States)

    Babaloukas, Georgios; Tentolouris, Nicholas; Liatis, Stavros; Sklavounou, Alexandra; Perrea, Despoina

    2011-12-01

Correction of vignetting on images obtained by a digital camera mounted on a microscope is essential before applying image analysis. The aim of this study is to evaluate three methods for retrospective correction of vignetting on medical microscopy images and compare them with a prospective correction method. One digital image from each of four different tissues was used and a vignetting effect was applied to each of these images. The resulting vignetted image was replicated four times and in each replica a different method for vignetting correction was applied with the Fiji and GIMP software tools. The highest peak signal-to-noise ratio from the comparison of each method to the original image was obtained from the prospective method in all tissues. The morphological filtering method provided the highest peak signal-to-noise ratio value amongst the retrospective methods. The prospective method is suggested as the method of choice for correction of vignetting and, if it is not applicable, the morphological filtering may be suggested as the retrospective alternative method. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.
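The retrospective idea can be sketched generically: estimate the slowly varying shading field from the image itself and divide it out. A large mean filter is used below as a simplified stand-in for the morphological filtering the study found best among the retrospective methods:

```python
# Hedged sketch of retrospective vignetting correction (flat-field style).
def mean_filter(img, radius):
    """Box average with edge clamping; approximates the shading field."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = cnt = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += img[yy][xx]
                        cnt += 1
            out[y][x] = acc / cnt
    return out

def correct_vignetting(img, radius=15):
    shade = mean_filter(img, radius)
    peak = max(max(row) for row in shade)
    # divide out the shading, rescaled so the brightest region is unchanged
    return [[img[y][x] * peak / shade[y][x]
             for x in range(len(img[0]))]
            for y in range(len(img))]
```

A prospective correction would instead divide by a shading field measured from a blank reference slide, which is why it scores best in the study.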

  7. Achieving strategic surety for high consequence software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1996-09-01

A strategic surety roadmap for high consequence software systems under the High Integrity Software (HIS) Program at Sandia National Laboratories guides research in identifying methodologies to improve software surety. Selected research tracks within this roadmap are identified and described, detailing current technology and outlining advancements to be pursued over the coming decade to reach HIS goals. The tracks discussed herein focus on Correctness by Design and System Immunology™. Specific projects are discussed, with greater detail given on projects involving Correct Specification via Visualization, Synthesis, & Analysis; Visualization of Abstract Objects; and Correct Implementation of Components.

  8. Malavefes: A computational voice-enabled malaria fuzzy informatics software for correct dosage prescription of anti-malarial drugs

    Directory of Open Access Journals (Sweden)

    Olugbenga O. Oluwagbemi

    2018-04-01

Malaria is one of the infectious diseases consistently inherent in many Sub-Saharan African countries. Among the issues of concern are the consequences of wrong diagnosis and dosage administration of anti-malarial drugs on sick patients; these have resulted in various degrees of complications ranging from severe headaches, stomach and body discomfort, blurred vision, dizziness and hallucinations to, in extreme cases, death. Many expert systems have been developed to support different infectious disease diagnoses, but, to the authors' knowledge, none has been specifically designed as a voice-based application to diagnose and translate malaria patients' symptomatic data for pre-laboratory screening and correct prescription of the proper dosage of the appropriate medication. We developed Malavefes, a malaria voice-enabled computational fuzzy expert system for correct dosage prescription of anti-malarial drugs, using the Visual Basic .NET and Java programming languages. Data collation for this research was conducted by survey of existing literature and interviews with public health experts. The database for this malaria drug informatics system was implemented using Microsoft Access. The Root Sum Square (RSS) was implemented as the inference engine of Malavefes to make inferences from rules, while Centre of Gravity (CoG) was implemented as the defuzzification engine. The drug recommendation module is voice-enabled. Additional anti-malaria drug expiration validation software was developed using the Java programming language. We conducted a user evaluation of the performance and user experience of the Malavefes software. Keywords: Informatics, Bioinformatics, Fuzzy, Anti-malaria, Voice computing, Dosage prescription

  9. Reed-Solomon error-correction as a software patch mechanism.

    Energy Technology Data Exchange (ETDEWEB)

    Pendley, Kevin D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2013-11-01

    This report explores how error-correction data generated by a Reed-Solomon code may be used as a mechanism to apply changes to an existing installed codebase. Error-correction data generated from a changed or updated codebase can be applied to an existing installation to both validate it and introduce the changes or updates from the upstream source.
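    The scheme can be illustrated with a much simpler single-parity code: plain XOR parity with the changed block's position known. Real Reed-Solomon parity generalizes this to multiple parity symbols and unknown error positions; the block names and layout below are purely hypothetical.

```python
def parity(blocks):
    """XOR parity over equal-length byte blocks."""
    out = bytearray(len(blocks[0]))
    for b in blocks:
        for i, byte in enumerate(b):
            out[i] ^= byte
    return bytes(out)

# "Installed" codebase, split into fixed-size blocks.
old = [b"func A v1", b"func B v1", b"func C v1"]
# Upstream updates block 1 only.
new = [b"func A v1", b"func B v2", b"func C v1"]

# The upstream source ships only the parity of the *new* codebase
# plus the index of the changed block (the "patch").
patch = {"changed": 1, "parity": parity(new)}

# The installer recovers the updated block from its local (old) copy:
# XOR of the new parity with the unchanged local blocks yields the
# missing (updated) block, exactly as in erasure decoding.
recovered = parity([patch["parity"]]
                   + [b for i, b in enumerate(old) if i != patch["changed"]])
```

Because the unchanged blocks cancel out, the parity simultaneously validates the untouched blocks and carries the update, which is the property the report exploits.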

  10. Selecting and Buying Educational Software.

    Science.gov (United States)

    Ahl, David H.

    1983-01-01

    Guidelines for selecting/buying educational software are discussed under the following headings: educational soundness; appropriateness; challenge and progress; motivation and reward; correctness; compatibility with systems; instructions and handling. Includes several sources of software reviews. (JN)

  11. Validation of geotechnical software for repository performance assessment

    International Nuclear Information System (INIS)

    LeGore, T.; Hoover, J.D.; Khaleel, R.; Thornton, E.C.; Anantatmula, R.P.; Lanigan, D.C.

    1989-01-01

    An important step in the characterization of a high-level nuclear waste repository is to demonstrate that the geotechnical software used in performance assessment correctly models the physical processes involved; this is one type of validation. There is another type of validation, called software validation. It is based on meeting the requirements of specification documents (e.g., IEEE specifications) and does not directly address the correctness of the specifications. The process of comparing physical experimental results with the predicted results should incorporate an objective measure of the level of confidence regarding correctness. This paper reports on a methodology developed that allows the experimental uncertainties to be explicitly included in the comparison process. The methodology also allows objective confidence levels to be associated with the software. In the event of a poor comparison, the method also lays the foundation for improving the software.
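    A generic way to fold experimental uncertainties into such a prediction/measurement comparison (not necessarily the paper's exact methodology) is to weigh the discrepancy against the combined uncertainty and report the resulting chance probability:

```python
import math

def agreement_confidence(predicted, measured, sigma_pred, sigma_meas):
    """Two-sided probability that a discrepancy at least this large
    would occur by chance, given independent normal uncertainties.
    A generic comparison metric, not the paper's specific method."""
    combined = math.hypot(sigma_pred, sigma_meas)  # quadrature sum
    z = abs(predicted - measured) / combined
    # p-value from the standard normal distribution
    return math.erfc(z / math.sqrt(2))

# Example: the code predicts 10.0 +/- 0.5; the experiment measures 10.8 +/- 0.4
p = agreement_confidence(10.0, 10.8, 0.5, 0.4)
```

A small `p` flags a poor comparison that the software (or the experiment) must explain; a `p` near 1 means the prediction agrees within the stated uncertainties.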

  12. Toward Baseline Software Anomalies in NASA Missions

    Science.gov (United States)

    Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.

    2012-01-01

    In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide some baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.

  13. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for the development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software that is logically error-free and that, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES: a user can write in English and the system converts it to computer languages. It is employed by several large corporations.

  14. An application of the baseline correction technique for correcting distorted seismic acceleration time histories

    International Nuclear Information System (INIS)

    Lee, Gyu Mahn; Kim, Jong Wook; Jeoung, Kyeong Hoon; Kim, Tae Wan; Park, Keun Bae; Kim, Keung Koo

    2008-03-01

    Three baseline correction techniques, named 'Newmark', 'Zero-VD' and 'Newmark and Zero-VD', were introduced to correct the distorted physical characteristics of a seismic time-history accelerogram. The corrected seismic accelerations and the distorted raw acceleration showed identical response spectra in the frequency domain, but showed different time-history profiles in the velocity and displacement domains. The correction techniques were programmed in UNIX HP Fortran. The verification of the baseline-corrected seismic data in terms of the frequency response spectrum was performed with ANSYS, a commercial FEM software.
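    The idea can be sketched with the simplest possible baseline model: a constant acceleration offset, removed so that the final velocity vanishes. The Newmark and Zero-VD schemes of the report are more elaborate, so this is only an illustrative stand-in.

```python
import math

def cumtrapz(y, dt):
    """Cumulative trapezoidal integration."""
    out = [0.0]
    for i in range(1, len(y)):
        out.append(out[-1] + 0.5 * (y[i - 1] + y[i]) * dt)
    return out

def baseline_correct(acc, dt):
    """Remove a constant baseline offset so the final velocity is zero
    (the simplest member of the family of corrections discussed)."""
    vel = cumtrapz(acc, dt)
    offset = vel[-1] / ((len(acc) - 1) * dt)   # residual drift per second
    return [a - offset for a in acc]

# Synthetic 10 s accelerogram: 1 Hz sine distorted by a 0.05 m/s^2 offset
dt = 0.01
acc = [math.sin(2 * math.pi * i * dt) + 0.05 for i in range(1001)]
corrected = baseline_correct(acc, dt)
drift_free_velocity = cumtrapz(corrected, dt)  # now ends near zero
```

Note the property the abstract describes: the correction barely changes the acceleration trace (and hence its response spectrum) but removes the spurious drift that dominates the integrated velocity and displacement.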

  15. Aspects of the design and verification of software for a computerized reactor protection system

    International Nuclear Information System (INIS)

    Voges, U.

    1976-01-01

    In contrast to hardware, software lasts forever. If software is correct, it remains correct for all time (unless changes are made to it). Therefore failure rates, MTBF, MTTR, etc. cannot be used for software. The main effort has to be put on: 1) how to make reliable software, 2) how to prove software to be correct. The first part deals with the developmental stage, i.e. the specification, design and implementation of the software; the second part with the 'produced' software, its test and verification. (orig./RW) [de

  16. The effects of automatic spelling correction software on understanding and comprehension in compensated dyslexia: improved recall following dictation.

    Science.gov (United States)

    Hiscox, Lucy; Leonavičiūtė, Erika; Humby, Trevor

    2014-08-01

    Dyslexia is associated with difficulties in language-specific skills such as spelling, writing and reading; the difficulty in acquiring literacy skills is not a result of low intelligence or the absence of learning opportunity, but these issues will persist throughout life and could affect long-term education. Writing is a complex process involving many different functions, integrated by the working memory system; people with dyslexia have a working memory deficit, which means that concentration on writing quality may be detrimental to understanding. We confirm impaired working memory in a sample of university students with (compensated) dyslexia, and using a within-subject design with three test conditions, we show that these participants demonstrated better understanding of a piece of text if they had used automatic spelling correction software during a dictation/transcription task. We hypothesize that the use of the autocorrecting software reduced demand on working memory, by allowing word writing to be more automatic, thus enabling better processing and understanding of the content of the transcriptions and improved recall. Long-term and regular use of autocorrecting assistive software should be beneficial for people with and without dyslexia and may improve confidence, written work, academic achievement and self-esteem, which are all affected in dyslexia. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Software war stories case studies in software management

    CERN Document Server

    Reifer, Donald J

    2014-01-01

    This book provides readers with practical advice on how to handle the many issues that can arise as a software project unfolds. The book uses twelve case studies to communicate lessons learned addressing such issues as they occur in government, industrial and academic settings. These cases focus on addressing the things that can be done to establish and meet reasonable expectations.  Such corrective actions often involve more than just dealing with project issues.  For example, software practitioners may have to address obstacles placed in their way by procurement, organizational procedures an

  18. Ground-Based Correction of Remote-Sensing Spectral Imagery

    Science.gov (United States)

    Adler-Golden, Steven M.; Rochford, Peter; Matthew, Michael; Berk, Alexander

    2007-01-01

    Software has been developed for an improved method of correcting for the atmospheric optical effects (primarily, effects of aerosols and water vapor) in spectral images of the surface of the Earth acquired by airborne and spaceborne remote-sensing instruments. In this method, the variables needed for the corrections are extracted from the readings of a radiometer located on the ground in the vicinity of the scene of interest. The software includes algorithms that analyze measurement data acquired from a shadow-band radiometer. These algorithms are based on a prior radiation transport software model, called MODTRAN, that has been developed through several versions up to what are now known as MODTRAN4 and MODTRAN5. These components have been integrated with a user-friendly Interactive Data Language (IDL) front end and an advanced version of MODTRAN4. Software tools for handling general data formats, performing a Langley-type calibration, and generating an output file of retrieved atmospheric parameters for use in another atmospheric-correction computer program known as FLAASH have also been incorporated into the present software. Concomitantly with the software described thus far, a version of FLAASH has been developed that utilizes the retrieved atmospheric parameters to process spectral image data.
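    The Langley-type calibration mentioned in the abstract can be sketched as an ordinary least-squares fit of log-signal against airmass, ln(V) = ln(V0) - τ·m; this is the textbook formulation, not necessarily the product's exact algorithm, and the numbers below are synthetic.

```python
import math

def langley_fit(airmass, volts):
    """Least-squares Langley regression: ln(V) = ln(V0) - tau*m.
    Returns (V0, tau): the extrapolated top-of-atmosphere signal
    and the retrieved optical depth. Illustrative only."""
    x, y = airmass, [math.log(v) for v in volts]
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
             / sum((xi - xbar) ** 2 for xi in x))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope

# Synthetic clear-sky morning: true V0 = 1.5 V, optical depth 0.2
m = [1.0, 1.5, 2.0, 3.0, 4.0, 5.0]
v = [1.5 * math.exp(-0.2 * mi) for mi in m]
V0, tau = langley_fit(m, v)
```

The retrieved optical depth is the kind of atmospheric parameter the software writes out for use by FLAASH.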

  19. Impact on dose and image quality of a software-based scatter correction in mammography.

    Science.gov (United States)

    Monserrat, Teresa; Prieto, Elena; Barbés, Benigno; Pina, Luis; Elizalde, Arlette; Fernández, Belén

    2017-01-01

    Background In 2014, Siemens developed a new software-based scatter correction (Progressive Reconstruction Intelligently Minimizing Exposure [PRIME]), enabling grid-less digital mammography. Purpose To compare doses and image quality between PRIME (grid-less) and standard (with anti-scatter grid) modes. Material and Methods Contrast-to-noise ratio (CNR) was measured for various polymethylmethacrylate (PMMA) thicknesses, and the dose values provided by the mammography unit were recorded. CDMAM phantom images were acquired for various PMMA thicknesses and the inverse Image Quality Figure (IQF_inv) was calculated. Values of incident entrance surface air kerma (ESAK) and average glandular dose (AGD) were obtained from the DICOM header for a total of 1088 pairs of clinical cases. Two experienced radiologists subjectively compared the image quality of a total of 149 pairs of clinical cases. Results CNR values were higher and doses were lower in PRIME mode for all thicknesses. IQF_inv values in PRIME mode were lower for all thicknesses except for 40 mm of PMMA equivalent, for which IQF_inv was slightly greater in PRIME mode. A mean reduction of 10% in ESAK and 12% in AGD in PRIME mode with respect to standard mode was obtained. The clinical image quality of PRIME and standard acquisitions was similar in most of the cases (84% for the first radiologist and 67% for the second one). Conclusion The use of PRIME software reduces, on average, the radiation dose to the breast without affecting image quality. This reduction is greater for thinner and denser breasts.
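    The CNR figure used in such phantom comparisons is commonly computed from a signal region and a background region of the image. The study's exact ROI protocol is not given, so the definition and the pixel values below are one standard formulation with hypothetical data:

```python
import statistics

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: (mean_signal - mean_bg) / std_bg.
    One common definition; other variants pool both noise terms."""
    return ((statistics.mean(signal_roi) - statistics.mean(background_roi))
            / statistics.pstdev(background_roi))

# Hypothetical pixel values from a contrast object and its surround
obj = [210, 212, 208, 211, 209]
bg = [200, 202, 198, 201, 199]
value = cnr(obj, bg)
```

A scatter correction such as PRIME raises CNR mainly by reducing the background haze (and hence its noise) that scattered photons add.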

  20. Software as quality product

    International Nuclear Information System (INIS)

    Enders, A.

    1975-01-01

    In many discussions on the reliability of computer systems, software is presented as the weak link in the chain. This contribution attempts to identify the reasons for this situation as seen from software development. The concepts of correctness and reliability of programs are explained as they are understood in today's specialist discussion. Measures and methods are discussed which are particularly relevant to obtaining fault-free and reliable programs. Conclusions are drawn for the user of software so that they are in a position to judge what can justly be expected from the product software compared to other products. (orig./LH) [de

  1. Ballistic deficit correction

    International Nuclear Information System (INIS)

    Duchene, G.; Moszynski, M.; Curien, D.

    1991-01-01

    The EUROGAM data-acquisition has to handle a large number of events/s. Typical in-beam experiments using heavy-ion fusion reactions assume the production of about 50 000 compound nuclei per second deexciting via particle and γ-ray emissions. The very powerful γ-ray detection of EUROGAM is expected to produce high-fold event rates as large as 10^4 events/s. Such high count rates introduce, in a common dead time mode, large dead times for the whole system associated with the processing of the pulse, its digitization and its readout (from the preamplifier pulse up to the readout of the information). In order to minimize the dead time, the shaping time constant τ, usually about 3 μs for large-volume Ge detectors, has to be reduced. Smaller shaping times, however, will adversely affect the energy resolution due to ballistic deficit. One possible solution is to operate the linear amplifier with a somewhat smaller shaping time constant (in the present case we chose τ = 1.5 μs) in combination with a ballistic deficit compensator. The ballistic deficit can be corrected in different ways, using a Gated Integrator, a hardware correction or even a software correction. In this paper we present a comparative study of the software and hardware corrections as well as gated integration

  2. High-speed atmospheric correction for spectral image processing

    Science.gov (United States)

    Perkins, Timothy; Adler-Golden, Steven; Cappelaere, Patrice; Mandl, Daniel

    2012-06-01

    Land and ocean data product generation from visible-through-shortwave-infrared multispectral and hyperspectral imagery requires atmospheric correction or compensation, that is, the removal of atmospheric absorption and scattering effects that contaminate the measured spectra. We have recently developed a prototype software system for automated, low-latency, high-accuracy atmospheric correction based on a C++-language version of the Spectral Sciences, Inc. FLAASH™ code. In this system, pre-calculated look-up tables replace on-the-fly MODTRAN® radiative transfer calculations, while the portable C++ code enables parallel processing on multicore/multiprocessor computer systems. The initial software has been installed on the Sensor Web at NASA Goddard Space Flight Center, where it is currently atmospherically correcting new data from the EO-1 Hyperion and ALI sensors. Computation time is around 10 s per data cube per processor. Further development will be conducted to implement the new atmospheric correction software on board the upcoming HyspIRI mission's Intelligent Payload Module, where it would generate data products in near-real time for Direct Broadcast to the ground. The rapid turn-around of data products made possible by this software would benefit a broad range of applications in areas of emergency response, environmental monitoring and national defense.
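    The core speed-up, replacing on-the-fly MODTRAN runs with pre-calculated look-up tables, amounts to interpolating tabulated radiative-transfer results at query time. A one-dimensional sketch with hypothetical table values (real systems interpolate over several atmospheric dimensions at once):

```python
from bisect import bisect_right

def interp_lut(grid, values, q):
    """1-D linear interpolation in a precomputed look-up table,
    standing in for an on-the-fly radiative-transfer calculation."""
    i = min(max(bisect_right(grid, q) - 1, 0), len(grid) - 2)
    t = (q - grid[i]) / (grid[i + 1] - grid[i])
    return (1 - t) * values[i] + t * values[i + 1]

# Hypothetical table: transmittance vs. water-vapor column (g/cm^2)
wv_grid = [0.5, 1.0, 2.0, 4.0]
transmittance = [0.95, 0.91, 0.85, 0.76]
t = interp_lut(wv_grid, transmittance, 1.5)
```

Table lookups like this cost nanoseconds per pixel, which is what makes the 10 s per data cube (and eventual on-board, near-real-time use) plausible.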

  3. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature of a viable high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for the shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process differs significantly from a conventional process in terms of rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying the software requirements and code against the software design using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 figs., 10 refs. (Author)

  4. Software Quality Improvement in the OMC Team

    CERN Document Server

    Maier, Viktor

    Physicists use self-written software as a tool to fulfill their tasks, and often the developed software is used for several years or even decades. If a software product lives for a long time, it has to be changed and adapted to external influences. This implies that the source code has to be read, understood and modified. The same applies to the software of the Optics Measurements and Corrections (OMC) team at CERN. Their task is to track, analyze and correct the beams in the LHC and other accelerators. To solve this task, they rely on a self-written software base with more than 150,000 physical lines of code. The base is subject to continuous changes as well. Their software does its job and is effective, but regrettably does not run efficiently, because some parts of the source code are in bad shape and of low quality. The implementation could be faster and more memory-efficient. In addition, it is difficult to read and understand the code. Source code files and functions are too big and identifiers do not rev...

  5. V & V Within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission critical software. This paper describes the working group's success in identifying V&V tasks that could be performed in the domain engineering and transition levels of reuse-based software engineering. The primary motivation for V&V at the domain level is to provide assurance that the domain requirements are correct and that the domain artifacts correctly implement the domain requirements. A secondary motivation is the possible elimination of redundant V&V activities at the application level. The group also considered the criteria and motivation for performing V&V in domain engineering.

  6. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Applications of the model to the inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model's application to a software reliability analysis
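    The NHPP view of the failure process can be sketched with a standard exponential-type mean value function (the Goel-Okumoto model), which is related in spirit to the EARF approximation but is not the MERF model itself; the parameter values are illustrative.

```python
import math

def mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-bt}):
    expected cumulative failures by time t, for a total fault content a
    and per-fault detection rate b."""
    return a * (1.0 - math.exp(-b * t))

def reliability(t, dt, a, b):
    """Probability of no failure in (t, t+dt] for an NHPP:
    R = exp(-[m(t+dt) - m(t)])."""
    return math.exp(-(mean_failures(t + dt, a, b) - mean_failures(t, a, b)))

# Hypothetical figures: a = 100 expected total faults, b = 0.05 per test-hour
r = reliability(t=40.0, dt=10.0, a=100.0, b=0.05)
```

Because m(t) flattens as faults are found and corrected, the same 10-hour window becomes more reliable the later it starts, which is the reliability-growth behavior such models capture.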

  7. Examining the Use of the Apple Color Software for Color Grading in Post-Production

    Directory of Open Access Journals (Sweden)

    Ahmad Faisal Choiril Anam Fathoni

    2011-04-01

    Full Text Available In the post-production process there is one step that is not as well known as video editing, the addition of animation, the enrichment with special effects and motion graphics, or audio editing and mixing: an important but rarely recognized process called Color Correction or Color Grading. Various software tools have been made to handle this process, ranging from additional filters available for free in any editing software to high-end devices worth billions of dollars dedicated specifically to Color Correction. Apple Color is one of the software packages included in the purchase of Final Cut Studio, which also includes Final Cut Pro for video editing, Soundtrack Pro for sound editing and mixing, and Motion for compositing. Apple Color is specially designed for color correction tasks on material previously edited in Final Cut Pro. This paper is designed to introduce this Apple software as well as to analyze the feasibility of Apple Color as a professional device in the world of production, especially post-production. Some professional color correction software will be compared briefly with Apple Color to reach an objective conclusion.
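    Primary color grading of the kind these tools perform can be sketched with the classic lift/gamma/gain transform on normalized channel values. This is a generic textbook formulation, not Apple Color's internal math; the pixel values are hypothetical.

```python
def grade_channel(x, lift=0.0, gamma=1.0, gain=1.0):
    """Lift/gamma/gain primary correction on a [0,1] channel value.
    Lift raises shadows, gain scales highlights, gamma bends midtones."""
    y = gain * (x + lift * (1.0 - x))
    y = min(max(y, 0.0), 1.0)      # clamp to legal range
    return y ** (1.0 / gamma)

# Warm up a pixel: lift the red midtones, pull the blue down slightly
r, g, b = 0.40, 0.45, 0.50
graded = (grade_channel(r, gamma=1.2),
          grade_channel(g),
          grade_channel(b, gain=0.95))
```

With the default parameters the transform is the identity, which is the sanity check a colorist expects from a neutral grade.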

  8. Correction magnetic field in electromagnet of proton accelerator using CST software; Correcao do campo magnetico em um eletroima de um acelerador de protons usando o software CST

    Energy Technology Data Exchange (ETDEWEB)

    Rabelo, L.A.; Campos, T.P.R., E-mail: luisarabelo88@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte (Brazil). Dept. de Engenharia Nuclear

    2016-07-01

    The aim of this paper is to present the study and simulation of the uniform-magnetic-field electromagnets of a new circular accelerator model for protons with an energy range between 15 MeV and 64 MeV, and in addition to investigate materials and the changes induced by the presence of 'gaps' for synchronism correction. The predefined electromagnet simulations were made in the electromagnetic field simulation software CST EM Studio® 3D 2015. The results showed a regular distribution of the magnetic field in the compact electromagnet with the homogenization structures. In conclusion, the proposed electromagnet model is shown to be feasible for a circular accelerator and to comply with the synchronization requirements. (author)

  9. Verification of safety critical software

    International Nuclear Information System (INIS)

    Son, Ki Chang; Chun, Chong Son; Lee, Byeong Joo; Lee, Soon Sung; Lee, Byung Chai

    1996-01-01

    To assure the quality of safety-critical software, software should be developed in accordance with software development procedures, and rigorous software verification and validation should be performed. Software verification is the formal act of reviewing, testing, or checking, and documenting whether software components comply with the specified requirements for a particular stage of the development phase [1]. A new software verification methodology was developed and applied to the Shutdown System No. 1 and 2 (SDS1,2) for the Wolsung 2, 3 and 4 nuclear power plants by the Korea Atomic Energy Research Institute (KAERI) and Atomic Energy of Canada Limited (AECL) in order to satisfy the new regulatory requirements of the Atomic Energy Control Board (AECB). The software verification methodology applied to SDS1 for the Wolsung 2, 3 and 4 project is described in this paper. Some errors were found by this methodology during the software development for SDS1 and were corrected by the software designer. Outputs from the Wolsung 2, 3 and 4 project have demonstrated that the use of this methodology results in a high-quality, cost-effective product. 15 refs., 6 figs. (author)

  10. Example of software configuration management model

    International Nuclear Information System (INIS)

    Roth, P.

    2006-01-01

    Software configuration management is the mechanism used to track and control software changes, and may include the following actions: a tracking system should be established for any changes made to the existing software configuration. Requirements on the configuration management system are the following: - back up the different software configurations; - record the details (the date, the subject, the filenames, the supporting documents, the tests, ...) of the changes introduced in the new configuration; - document all the differences between the different versions. Configuration management allows simultaneous exploitation of one specific version and development of the next version. Minor corrections can be performed in the current exploitation version.

  11. On Model Based Synthesis of Embedded Control Software

    OpenAIRE

    Alimguzhin, Vadim; Mari, Federico; Melatti, Igor; Salvo, Ivano; Tronci, Enrico

    2012-01-01

    Many Embedded Systems are indeed Software Based Control Systems (SBCSs), that is control systems whose controller consists of control software running on a microcontroller device. This motivates investigation on Formal Model Based Design approaches for control software. Given the formal model of a plant as a Discrete Time Linear Hybrid System and the implementation specifications (that is, number of bits in the Analog-to-Digital (AD) conversion) correct-by-construction control software can be...

  12. Correct software in web applications and web services

    CERN Document Server

    Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno

    2015-01-01

    The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a

  13. Writing references and using citation management software.

    Science.gov (United States)

    Sungur, Mukadder Orhan; Seyhan, Tülay Özkan

    2013-09-01

    The correct citation of references is obligatory to gain scientific credibility, to honor the original ideas of previous authors and to avoid plagiarism. Currently, researchers can easily find, cite and store references using citation management software. In this review, two popular citation management software programs (EndNote and Mendeley) are summarized.

  14. Software management issues

    International Nuclear Information System (INIS)

    Kunz, P.F.

    1990-06-01

    The difficulty of managing the software in large HEP collaborations appears to become progressively worse with each new generation of detector. If one were to extrapolate to the SSC, it would become a major problem. This paper explores the possible causes of the difficulty and makes suggestions on what corrective actions should be taken

  15. Software quality assurance plans for safety-critical software

    International Nuclear Information System (INIS)

    Liddle, P.

    2006-01-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM XS (TXS) system is classified 1E, as defined in the Institute of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to its safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  16. Measurement, analysis and correction of the closed orbit distortion ...

    Indian Academy of Sciences (India)

    2013-02-01

    Feb 1, 2013 ... quency (RF), linear coupling are being carried out. ... Its guide tool is used to develop a GUI for the COD correction software. In addition ... rection software also has a feature to save all the parameters, such as predicted/measured.

  17. CMS software deployment on OSG

    International Nuclear Information System (INIS)

    Kim, B; Avery, P; Thomas, M; Wuerthwein, F

    2008-01-01

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools that are mainly targeted for the deployment on the OSG have the features of instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with Grid security infrastructure login mechanism. We have been deploying over 500 installations and found the tools are reliable and adaptable to cope with problems with changes in the Grid computing environment and the software releases. We present the design of the tools, statistics that we gathered during the operation of the tools, and our experience with the CMS software deployment on the OSG Grid computing environment

  18. CMS software deployment on OSG

    Energy Technology Data Exchange (ETDEWEB)

    Kim, B; Avery, P [University of Florida, Gainesville, FL 32611 (United States); Thomas, M [California Institute of Technology, Pasadena, CA 91125 (United States); Wuerthwein, F [University of California at San Diego, La Jolla, CA 92093 (United States)], E-mail: bockjoo@phys.ufl.edu, E-mail: thomas@hep.caltech.edu, E-mail: avery@phys.ufl.edu, E-mail: fkw@fnal.gov

    2008-07-15

    A set of software deployment tools has been developed for the installation, verification, and removal of a CMS software release. The tools that are mainly targeted for the deployment on the OSG have the features of instant release deployment, corrective resubmission of the initial installation job, and an independent web-based deployment portal with Grid security infrastructure login mechanism. We have been deploying over 500 installations and found the tools are reliable and adaptable to cope with problems with changes in the Grid computing environment and the software releases. We present the design of the tools, statistics that we gathered during the operation of the tools, and our experience with the CMS software deployment on the OSG Grid computing environment.

  19. Geological Corrections in Gravimetry

    Science.gov (United States)

    Mikuška, J.; Marušiak, I.

    2015-12-01

    Applying corrections for known geology to gravity data can be traced back to the first quarter of the 20th century. Later, mostly in areas with sedimentary cover at local and regional scales, the correction known as gravity stripping has been in use since the mid-1960s, provided that enough geological information was available. Stripping at regional to global scales became possible after the release of the CRUST 2.0 and CRUST 1.0 models in 2000 and 2013, respectively. Especially the latter model provides quite a new view of the relevant geometries, of the topographic and crustal densities, and of the crust/mantle density contrast. Thus, the isostatic corrections often used in the past can now be replaced by procedures working with independent information interpreted primarily from seismic studies. We have developed software for performing geological corrections in the space domain, based on a priori geometry and density grids, which can be of either rectangular or spherical/ellipsoidal type with cells shaped as rectangles, tesseroids or triangles. It enables us to calculate the required gravitational effects not only in the form of surface maps or profiles but also, for instance, along vertical lines, which can shed additional light on the nature of the geological correction. The software works at a variety of scales and considers the input information out to an optional distance from the calculation point, up to the antipodes. Our main objective is to treat the geological correction as an alternative to accounting for topography with varying densities, since the bottoms of the topographic masses, namely the geoid or ellipsoid, generally do not represent geological boundaries. We would also like to call attention to possible distortions of the corrected gravity anomalies. This work was supported by the Slovak Research and Development Agency under contract APVV-0827-12.
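
The stripping procedure described above subtracts the gravitational effect of known density contrasts from the observed gravity. As a minimal illustration (not the authors' software, which supports rectangular, tesseroid and triangular cells and exact cell effects), the sketch below approximates each cell by a point mass; all names are hypothetical:

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def point_mass_gz(dx, dy, dz, mass):
    # Vertical attraction (m/s^2) of a point mass offset by (dx, dy, dz)
    # from the station; dz is positive when the mass lies below it.
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    return G * mass * dz / r ** 3

def geological_correction(station, cells):
    # Each cell: (x, y, z, volume_m3, density_contrast_kg_m3); the correction
    # is the summed vertical effect of all density contrasts at the station.
    sx, sy, sz = station
    return sum(point_mass_gz(x - sx, y - sy, z - sz, vol * drho)
               for x, y, z, vol, drho in cells)
```

A real stripping code would integrate the effect over each cell volume; the point-mass form is only adequate when the cell is small compared with its distance to the station.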

  20. The achievement and assessment of safety in systems containing software

    International Nuclear Information System (INIS)

    Ball, A.; Dale, C.J.; Butterfield, M.H.

    1986-01-01

    In order to establish confidence in the safe operation of a reactor protection system, there is a need to establish, as far as possible, that: (i) the algorithms used are correct; (ii) the system is a correct implementation of the algorithms; and (iii) the hardware is sufficiently reliable. This paper concentrates principally on the second of these, as it applies to the software for the more accurate and complex trip functions performed by modern reactor protection systems. In order to engineer safety into software, there is a need to use a development strategy that stands a high chance of achieving a correct implementation of the trip algorithms. This paper describes three broad methodologies by which it is possible to enhance the integrity of software: fault avoidance, fault tolerance and fault removal. Fault avoidance is concerned with making the software as fault-free as possible through appropriate choice of specification, design and implementation methods. A fault-tolerant strategy may be advisable in many safety-critical applications, in order to guard against residual faults present in the software of the installed system. Fault detection and removal techniques are used to remove as many as possible of the faults introduced during software development. The paper also discusses safety and reliability assessment as it applies to software, outlining the various approaches available. Finally, there is an outline of a research project underway in the UKAEA which is intended to assess methods for developing and testing safety and protection systems involving software. (author)
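
The fault-tolerance strategy mentioned above is often realised as a recovery block: run the primary routine, check its result with an acceptance test, and fall back to an alternate variant on failure. A minimal sketch in Python (illustrative only; the function names and the square-root example are invented):

```python
def recovery_block(primary, alternates, acceptance_test, *args):
    # Run the primary routine; if its result fails the acceptance test
    # (or it raises), fall back to the alternates in order.
    for routine in [primary] + list(alternates):
        try:
            result = routine(*args)
        except Exception:
            continue  # treat a raised exception as a failed variant
        if acceptance_test(result):
            return result
    raise RuntimeError("all variants failed the acceptance test")

# Example: two independently written square-root routines.
def newton_sqrt(x, tol=1e-12):
    guess = x if x > 1 else 1.0
    while abs(guess * guess - x) > tol:
        guess = 0.5 * (guess + x / guess)
    return guess

def faulty_sqrt(x):
    return x / 2  # deliberately wrong variant, caught by the acceptance test
```

The key design point, as in the abstract, is that the acceptance test guards against residual faults in any single variant rather than trying to prove each variant correct.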

  1. The software and algorithms for hyperspectral data processing

    Science.gov (United States)

    Shyrayeva, Anhelina; Martinov, Anton; Ivanov, Victor; Katkovsky, Leonid

    2017-04-01

    Hyperspectral remote sensing is widely used for collecting and processing information about objects on the Earth's surface. Hyperspectral data are combined to form a three-dimensional (x, y, λ) data cube. The Department of Aerospace Research of the Institute of Applied Physical Problems of the Belarusian State University presents a general model of software for hyperspectral image data analysis and processing. The software runs in a Windows XP/7/8/8.1/10 environment on any personal computer. It has been written in C++ using the Qt framework, with OpenGL for graphical data visualization. The software has a flexible structure consisting of a set of independent plugins. Each plugin is compiled as a Qt plugin in the form of a Windows dynamic library (DLL). Plugins can be categorized by data reading type, data visualization (3D, 2D, 1D) and data processing. The software has various built-in functions for statistical and mathematical analysis and signal processing, such as direct smoothing by moving average, Savitzky-Golay smoothing, RGB correction, histogram transformation, and atmospheric correction. It provides two of the authors' engineering techniques for the solution of the atmospheric correction problem: an iterative method refining spectral albedo parameters using Libradtran, and an analytical least-squares method. The main advantages of these methods are a high processing rate (several minutes for 1 GB of data) and a low relative error in albedo retrieval (less than 15%). The software also supports work with spectral libraries, region-of-interest (ROI) selection, and spectral analysis such as cluster-type image classification and automatic comparison of hypercube spectra, by a similarity criterion, with entries from spectral libraries and vice versa. The software deals with different kinds of spectral information in order to identify and distinguish spectrally unique materials.
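
The moving-average and Savitzky-Golay smoothers mentioned in the record are easy to sketch. The fragment below is a Python illustration (not the C++/Qt code of the record); it assumes the standard 5-point quadratic Savitzky-Golay coefficients:

```python
def moving_average(y, window=3):
    # Centered moving average; the window shrinks at the edges.
    h = window // 2
    out = []
    for i in range(len(y)):
        seg = y[max(0, i - h):i + h + 1]
        out.append(sum(seg) / len(seg))
    return out

def savgol5(y):
    # 5-point quadratic Savitzky-Golay smoothing (coefficients
    # -3, 12, 17, 12, -3 over 35); the two samples at each end are
    # left unchanged in this simple version.
    c = (-3.0, 12.0, 17.0, 12.0, -3.0)
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(cj * y[i + j - 2] for j, cj in enumerate(c)) / 35.0
    return out
```

A useful property of the quadratic Savitzky-Golay filter, unlike the plain moving average, is that it reproduces polynomial spectra up to degree 3 exactly, so it smooths noise without flattening peaks as aggressively.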

  2. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  3. Correction of unrealizable service choreographies

    NARCIS (Netherlands)

    Mancioppi, M.

    2015-01-01

    This thesis is devoted to the detection and correction of design flaws affecting service choreographies. Service choreographies are models that specify how software services are composed in a decentralized, message-driven fashion. In particular, this work focuses on flaws that compromise the

  4. Software metrics to improve software quality in HEP

    International Nuclear Information System (INIS)

    Lancon, E.

    1996-01-01

    The maintainability of the ALEPH reconstruction program has been evaluated with a CASE tool implementing an ISO-standard methodology based on software metrics. It has been found that the overall quality of the program is good and has improved over the past five years. Frequently modified routines exhibit lower quality; most bugs were located in routines with particularly low quality. Implementing quality criteria from the beginning could have avoided time lost on bug corrections. (author)

  5. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    Science.gov (United States)

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with the introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  6. Strengthening Software Authentication with the ROSE Software Suite

    International Nuclear Information System (INIS)

    White, G

    2006-01-01

    Many recent nonproliferation and arms control software projects include a software authentication regime. These include U.S. Government-sponsored projects both in the United States and in the Russian Federation (RF). This trend toward requiring software authentication is only accelerating. Demonstrating assurance that software performs as expected without hidden ''backdoors'' is crucial to a project's success. In this context, ''authentication'' is defined as determining that a software package performs only its intended purpose and performs said purpose correctly and reliably over the planned duration of an agreement. In addition to visual inspections by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs, both to aid visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary and not extensible. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool has to be based on a complete language compiler. ROSE is precisely such a compiler infrastructure developed within the Department of Energy (DOE) and targeted at the optimization of scientific applications and user-defined libraries within large-scale applications (typically applications of a million lines of code). ROSE is a robust, source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C and C++ (handling the full C, C99, C++ languages and with current collaborations to support Fortran90). We propose to extend ROSE to address a number of security-specific requirements, and apply it to software authentication for nonproliferation and arms control projects
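
ROSE itself is a C/C++ source-to-source compiler infrastructure; as a language-neutral illustration of the underlying idea, flagging suspicious constructs for an authentication reviewer, the sketch below walks a Python AST with the standard `ast` module. The set of "suspicious" names is an arbitrary assumption for the example:

```python
import ast

SUSPICIOUS = {"eval", "exec", "system", "popen"}

def find_suspicious(source):
    # Flag calls that an authentication reviewer might want to inspect
    # by hand; returns sorted (line_number, callee_name) pairs.
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            func = node.func
            name = func.id if isinstance(func, ast.Name) else getattr(func, "attr", None)
            if name in SUSPICIOUS:
                hits.append((node.lineno, name))
    return sorted(hits)
```

An extensible tool, as the abstract argues, lets each project add its own rules on top of such a generic traversal, which is exactly what a proprietary scanner usually does not allow.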

  7. Korean WA-DGNSS User Segment Software Design

    Directory of Open Access Journals (Sweden)

    Sayed Chhattan Shah

    2013-03-01

    Korean WA-DGNSS is a large-scale research project funded by the Ministry of Land, Transport and Maritime Affairs, Korea. It aims to augment the Global Navigation Satellite System by broadcasting additional signals from geostationary satellites and providing differential correction messages and integrity data for the GNSS satellites. The project is being carried out by a consortium of universities and research institutes. The research team at the Electronics and Telecommunications Research Institute is involved in the design and development of data processing software for the wide area reference station and the user segment. This paper focuses on user segment software design. Korean WA-DGNSS user segment software is designed to perform several functions such as calculation of pseudorange, ionosphere and troposphere delays, application of fast and slow correction messages, and data verification. It is based on a layered architecture that provides a model to develop flexible and reusable software, and is divided into several independent, interchangeable and reusable components to reduce complexity and maintenance cost. The current version is designed to collect and process GPS and WA-DGNSS data; however, it is flexible enough to accommodate future GNSS systems such as GLONASS and Galileo.
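
The correction step described above reduces, per satellite, to combining the broadcast corrections with the modeled delays. The sign convention and message contents below are illustrative assumptions, not the actual WA-DGNSS message format:

```python
def corrected_pseudorange(raw_m, fast_corr_m, slow_sat_clock_m,
                          iono_delay_m, tropo_delay_m):
    # Illustrative convention: the fast and slow (satellite clock)
    # corrections are added, while the modeled ionospheric and
    # tropospheric delays are subtracted from the raw pseudorange.
    # All quantities are in meters.
    return raw_m + fast_corr_m + slow_sat_clock_m - iono_delay_m - tropo_delay_m
```

In the layered architecture the record describes, a function like this would sit in a correction-application component, with the delay models and message decoding kept in separate interchangeable components.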

  8. Design of a software for gamma detector efficiency

    International Nuclear Information System (INIS)

    Lopez, G.

    2011-01-01

    Gamma spectroscopy with a high-purity germanium detector is one of the most widely used methods for qualitative and quantitative analysis of samples. Nevertheless, gamma spectroscopy results must be corrected, first to take into account the self-shielding effect, which represents the absorption of photons by the sample itself, and second to correct for the fact that two photons emitted simultaneously with energies E 1 and E 2 are likely to be detected simultaneously and then counted as a single photon with energy E 1 +E 2 . This effect is called gamma-gamma coincidence. Software has been designed to simulate both effects and produce correction factors for cylindrical geometries. This software has been validated on Americium-241 for the self-shielding effect and on Cesium-134 for gamma-gamma coincidence. (A.C.)
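
For a uniform slab the self-shielding correction has a simple closed form: photons born uniformly in the slab are transmitted with mean probability (1 - e^(-μt))/(μt), so the measured counts are multiplied by its inverse. The sketch below uses that slab approximation; the record's software handles cylindrical geometries, which require numerical integration instead:

```python
import math

def self_shielding_factor(mu, thickness):
    # Slab-sample self-absorption correction factor: measured counts
    # times this factor gives the unattenuated counts. mu is the linear
    # attenuation coefficient (1/cm), thickness in cm.
    x = mu * thickness
    if x < 1e-9:       # thin-sample limit: no attenuation, no correction
        return 1.0
    return x / (1.0 - math.exp(-x))
```

The factor tends to 1 for thin or weakly absorbing samples and grows with μt, which matches the intuition that denser, thicker samples hide more of their own emission.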

  9. SOFTM: a software maintenance expert system in Prolog

    DEFF Research Database (Denmark)

    Pau, L.; Negret, J. M.

    1988-01-01

    A description is given of a knowledge-based system called SOFTM, serving the following purposes: (1) assisting a software programmer or analyst in application code maintenance tasks, (2) automatically generating and updating software correction documentation, (3) helping the end user register…, and on the interfacing capabilities of Prolog II to a variety of other languages…

  10. Iterative scatter correction for grid-less bedside chest radiography: performance for a chest phantom.

    Science.gov (United States)

    Mentrup, Detlef; Jockel, Sascha; Menser, Bernd; Neitzel, Ulrich

    2016-06-01

    The aim of this work was to experimentally compare the contrast improvement factors (CIFs) of a newly developed software-based scatter correction to the CIFs achieved by an antiscatter grid. To this end, three aluminium discs were placed in the lung, retrocardial and abdominal areas of a thorax phantom, and digital radiographs of the phantom were acquired both with and without a stationary grid. The contrast generated by the discs was measured in both images, and the CIFs achieved by grid usage were determined for each disc. Additionally, the non-grid images were processed with scatter correction software. The contrasts generated by the discs were determined in the scatter-corrected images, and the corresponding CIFs were calculated. The CIFs obtained with the grid and with the software were in good agreement. In conclusion, the experiment demonstrates quantitatively that software-based scatter correction can restore the image contrast of a non-grid image in a manner comparable with an antiscatter grid.
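
The quantities compared in this study can be reproduced in a few lines. The sketch below defines contrast and CIF as used above and shows, with made-up intensities, how additive scatter lowers contrast (the intensity values are purely illustrative):

```python
def contrast(i_background, i_disc):
    # Relative contrast of a test disc against its local background.
    return (i_background - i_disc) / i_background

def contrast_improvement_factor(c_corrected, c_uncorrected):
    # CIF: factor by which a grid, or a software scatter correction,
    # restores the contrast of the same detail.
    return c_corrected / c_uncorrected
```

With additive scatter S, the disc contrast drops from (B - D)/B to (B - D)/(B + S); subtracting an accurate scatter estimate, which is what the iterative software correction approximates, restores the original value.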

  11. Advanced hardware design for error correcting codes

    CERN Document Server

    Coussy, Philippe

    2015-01-01

    This book provides thorough coverage of error correcting techniques. It includes essential basic concepts and the latest advances on key topics in design, implementation, and optimization of hardware/software systems for error correction. The book’s chapters are written by internationally recognized experts in this field. Topics include the evolution of error correction techniques, industrial user needs, and architectures and design approaches for the most advanced error correcting codes (Polar Codes, Non-Binary LDPC, Product Codes, etc). This book provides access to recent results, and is suitable for graduate students and researchers of mathematics, computer science, and engineering. • Examines how to optimize the architecture of hardware design for error correcting codes; • Presents error correction codes from theory to optimized architecture for the current and the next generation standards; • Provides coverage of industrial user needs and advanced error correcting techniques.
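
As a concrete taste of the codes such a book builds on, here is a classic Hamming(7,4) encoder and single-error corrector, a textbook example far simpler than the Polar and LDPC codes discussed above:

```python
def hamming74_encode(d):
    # Encode 4 data bits as the 7-bit codeword [p1, p2, d1, p3, d2, d3, d4],
    # with each parity bit covering the standard Hamming positions.
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    # Recompute the three parity checks; the syndrome is the 1-based
    # position of a single flipped bit (0 means no error detected).
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]   # recovered data bits
```

The same encode/syndrome/correct structure reappears, with much larger parity-check matrices and iterative decoders, in the modern codes the book treats.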

  12. The need for scientific software engineering in the pharmaceutical industry.

    Science.gov (United States)

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  13. The need for scientific software engineering in the pharmaceutical industry

    Science.gov (United States)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  14. An analysis of motion correction for 99Tcm DMSA renal imaging in paediatrics

    International Nuclear Information System (INIS)

    Meadows, A.; Hogg, P.

    2007-01-01

    Movement artefact during paediatric 99Tcm DMSA renal imaging can reduce image quality and therefore render images non-diagnostic. This research assessed software used for the correction of movement artefact in children. The software comprised a count-rate-dependent dynamic acquisition with a 256 x 256 pixel frame-shift motion correction algorithm. A Williams' phantom was used to generate data during dynamic (experimental) and static (control) image acquisitions. During image acquisition, the Williams' phantom was moved to simulate seven typical paediatric patient movements; acquisitions also considered no movement (gold standard). Seven image data sets with motion artefact were corrected using the frame-shift software. The corrected, uncorrected, and static images were rated for quality by suitably qualified and experienced nuclear medicine professionals. The images were scored using an image quality assessment instrument based on a Likert rating scale. Inferential statistics were applied to these data. The image quality ratings demonstrated a statistically significant (P < …) improvement for motion-corrected 99Tcm DMSA renal scans.
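
A frame-shift correction of the kind evaluated here registers each short dynamic frame to a reference before summing. The toy sketch below does exhaustive integer-shift registration on small 2-D count arrays; the clinical 256 x 256 algorithm is of course more elaborate, and all names here are illustrative:

```python
def best_shift(ref, frame, max_shift=2):
    # Integer (dy, dx) that best aligns `frame` to `ref` (2-D lists),
    # found by exhaustive search of the correlation over +/- max_shift.
    rows, cols = len(ref), len(ref[0])
    best, best_score = (0, 0), float("-inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            score = 0.0
            for y in range(rows):
                for x in range(cols):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < rows and 0 <= sx < cols:
                        score += ref[y][x] * frame[sy][sx]
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best

def motion_corrected_sum(frames, max_shift=2):
    # Shift each frame onto the first one and accumulate the counts.
    ref = frames[0]
    rows, cols = len(ref), len(ref[0])
    total = [[0.0] * cols for _ in range(rows)]
    for frame in frames:
        dy, dx = best_shift(ref, frame, max_shift)
        for y in range(rows):
            for x in range(cols):
                sy, sx = y + dy, x + dx
                if 0 <= sy < rows and 0 <= sx < cols:
                    total[y][x] += frame[sy][sx]
    return total
```

Summing the registered frames rather than the raw ones is what prevents patient movement from smearing the renal outline across the final static image.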

  15. DFTBaby: A software package for non-adiabatic molecular dynamics simulations based on long-range corrected tight-binding TD-DFT(B)

    Science.gov (United States)

    Humeniuk, Alexander; Mitrić, Roland

    2017-12-01

    A software package, called DFTBaby, is published, which provides the electronic structure needed for running non-adiabatic molecular dynamics simulations at the level of tight-binding DFT. A long-range correction is incorporated to avoid spurious charge transfer states. Excited state energies, their analytic gradients and scalar non-adiabatic couplings are computed using tight-binding TD-DFT. These quantities are fed into a molecular dynamics code, which integrates Newton's equations of motion for the nuclei together with the electronic Schrödinger equation. Non-adiabatic effects are included by surface hopping. As an example, the program is applied to the optimization of excited states and non-adiabatic dynamics of polyfluorene. The python and Fortran source code is available at http://www.dftbaby.chemie.uni-wuerzburg.de.

  16. SWEPP Gamma-Ray Spectrometer System software design description

    International Nuclear Information System (INIS)

    Femec, D.A.; Killian, E.W.

    1994-08-01

    To assist in the characterization of the radiological contents of contract-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of-specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system

  17. SWEPP Gamma-Ray Spectrometer System software design description

    Energy Technology Data Exchange (ETDEWEB)

    Femec, D.A.; Killian, E.W.

    1994-08-01

    To assist in the characterization of the radiological contents of contract-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP), the SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory. The SGRS system software controls turntable and detector system activities. In addition to determining the concentrations of gamma-ray-emitting radionuclides, this software also calculates attenuation-corrected isotopic mass ratios of-specific interest. This document describes the software design for the data acquisition and analysis software associated with the SGRS system.
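
The attenuation-corrected ratios mentioned above reduce, per gamma line, to dividing net counts by detector efficiency and emission probability and multiplying by e^(μx) for the matrix attenuation. A hedged sketch, where the parameter names and the single-effective-path attenuation model are simplifying assumptions rather than the SGRS design:

```python
import math

def corrected_activity(counts, efficiency, gamma_yield, mu, path_cm):
    # Net peak counts corrected for detector efficiency, emission
    # probability, and attenuation exp(-mu * x) along an effective
    # path through the waste matrix.
    return counts / (efficiency * gamma_yield) * math.exp(mu * path_cm)

def activity_ratio(line_a, line_b):
    # Ratio of two corrected gamma lines; with decay constants and
    # atomic masses this would be converted to an isotopic mass ratio.
    return corrected_activity(**line_a) / corrected_activity(**line_b)
```

Because attenuation is energy dependent, a real code applies a different μ to each line, which is why the ratio must be formed from individually corrected activities rather than raw count ratios.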

  18. Attenuation correction for the HRRT PET-scanner using transmission scatter correction and total variation regularization

    DEFF Research Database (Denmark)

    Keller, Sune H; Svarer, Claus; Sibomana, Merence

    2013-01-01

    In the standard software for the Siemens high-resolution research tomograph (HRRT) positron emission tomography (PET) scanner, the most commonly used segmentation in the μ-map reconstruction for human brain scans is maximum a posteriori for transmission (MAP-TR). Bias in the lower cerebellum … scatter correction in the μ-map reconstruction and total variation filtering to the transmission processing. Results: Comparing MAP-TR and the new TXTV with gold-standard CT-based attenuation correction, we found that TXTV has less bias than MAP-TR. We also compared images acquired at the HRRT…

  19. Savannah River Plant Californium-252 Shuffler software manual

    International Nuclear Information System (INIS)

    Johnson, S.S.; Crane, T.W.; Eccleston, G.W.

    1979-03-01

    A software manual for operating the Savannah River Plant Shuffler nondestructive assay instrument is presented. The procedures for starting up the instrument, making assays, calibrating, and checking the performance of the hardware units are described. A list of the error messages is given, with an explanation of the circumstances prompting each message and possible corrective measures. A summary of the software package is included, showing the names and contents of the files and subroutines. The procedure for modifying the software package is outlined.

  20. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & validation techniques and technologies have a key role in ensuring that the onboard software is correct and error-free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the increasing number of software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of software FMEA and FTA with fault-injection techniques. The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA

  1. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non-safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  2. Using simplex method in verifying software safety

    Directory of Open Access Journals (Sweden)

    Vujošević-Janičić Milena

    2009-01-01

    In this paper we discuss the application of the Simplex method to checking software safety, namely to the automated detection of buffer overflows in C programs. This problem is important because buffer overflows are suitable targets for hackers' security attacks and sources of serious program misbehavior. We also describe our implementation, including a system for generating software correctness conditions and a Simplex-based theorem prover that resolves these conditions.
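
A decision procedure for linear correctness conditions of this kind can also be built from Fourier-Motzkin elimination rather than Simplex. The sketch below is an illustrative substitute, not the authors' prover: it checks whether an out-of-bounds index is consistent with the linear constraints collected for a buffer access:

```python
from fractions import Fraction

def eliminate(ineqs, var):
    # One Fourier-Motzkin step: drop `var` from a system of inequalities,
    # each written as ({variable: coefficient}, bound) meaning sum <= bound.
    lower, upper, rest = [], [], []
    for coeffs, b in ineqs:
        c = coeffs.get(var, 0)
        if c > 0:
            upper.append((coeffs, b, c))
        elif c < 0:
            lower.append((coeffs, b, c))
        else:
            rest.append((coeffs, b))
    for lc, lb, lcoef in lower:        # encodes var >= (l.x - lb) / (-lcoef)
        for uc, ub, ucoef in upper:    # encodes var <= (ub - u.x) / ucoef
            combo = {}
            for v in set(lc) | set(uc):
                if v == var:
                    continue           # this coefficient cancels exactly
                c = Fraction(lc.get(v, 0), -lcoef) + Fraction(uc.get(v, 0), ucoef)
                if c:
                    combo[v] = c
            rest.append((combo, Fraction(lb, -lcoef) + Fraction(ub, ucoef)))
    return rest

def satisfiable(ineqs, variables):
    for v in variables:
        ineqs = eliminate(ineqs, v)
    return all(b >= 0 for _, b in ineqs)   # only "0 <= b" facts remain

def overflow_possible(constraints, size, var="i"):
    # An access buf[i] can overflow iff the constraints admit i >= size,
    # i.e. the system plus (-i <= -size) is still satisfiable.
    return satisfiable(constraints + [({var: -1}, -size)], [var])
```

Fourier-Motzkin can blow up on large systems, which is one reason a Simplex-based prover, as in the paper, is the more practical engine; the safety condition being checked is the same either way.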

  3. Hologram production and representation for corrected image

    Science.gov (United States)

    Jiao, Gui Chao; Zhang, Rui; Su, Xue Mei

    2015-12-01

    In this paper, a CCD sensor is used to record distorted home-made grid images taken by a wide-angle camera. The distorted images are corrected using position calibration and gray-level correction methods implemented with VC++ 6.0 and the OpenCV software. Holograms of the corrected pictures are then produced. Clearly reproduced images are obtained by using the Fresnel algorithm in the processing, subtracting the object and reference light from the Fresnel diffraction to remove the zero-order part of the reproduced images. The investigation is useful in optical information processing and image encryption transmission.

  4. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  5. fatalityCMR: capture-recapture software to correct raw counts of wildlife fatalities using trial experiments for carcass detection probability and persistence time

    Science.gov (United States)

    Peron, Guillaume; Hines, James E.

    2014-01-01

    Many industrial and agricultural activities involve wildlife fatalities by collision, poisoning or other involuntary harvest: wind turbines, highway networks, utility networks, tall structures, pesticides, etc. Impacted wildlife may benefit from official protection, including the requirement to monitor the impact. Carcass counts can often be conducted to quantify the number of fatalities, but they need to be corrected for carcass persistence time (removal by scavengers and decay) and detection probability (searcher efficiency). In this article we introduce a new piece of software that fits a superpopulation capture-recapture model to raw count data. It uses trial data to estimate detection and daily persistence probabilities. A recurrent issue is that fatalities of rare, protected species are infrequent, in which case the software offers the option to switch to an 'evidence of absence' mode, i.e., to estimate the number of carcasses that may have been missed by field crews. The software allows distinguishing between different turbine types (e.g. different vegetation cover under turbines, or different technical properties), as well as between two carcass age-classes or states, with transitions between those classes (e.g., fresh and dry). There is a data simulation capacity that may be used at the planning stage to optimize sampling design. Resulting mortality estimates can be used 1) to quantify the required amount of compensation, 2) to inform mortality projections for proposed development sites, and 3) to inform decisions about management of existing sites.
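
The core correction is Horvitz-Thompson-like: divide the raw carcass count by the joint probability that a carcass persists until a search and is then detected. The sketch below is a deliberately naive version of that idea, with a uniform time of death between searches; the package itself fits a full superpopulation capture-recapture model instead:

```python
def fatality_estimate(carcass_count, detection_prob, daily_persistence,
                      search_interval_days):
    # Naive Horvitz-Thompson-style correction: divide the raw count by
    # the probability that a carcass persists until the next search
    # (averaged over a uniform day of death within the interval) and
    # is then found by the searchers.
    persist = sum(daily_persistence ** d
                  for d in range(1, search_interval_days + 1))
    persist /= search_interval_days
    return carcass_count / (detection_prob * persist)
```

Because both probabilities are below one in practice, the corrected estimate always exceeds the raw count, which is exactly why uncorrected carcass counts understate mortality.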

  6. Antenna Controller Replacement Software

    Science.gov (United States)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used primarily to track spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm performed every 0.02 seconds allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna pointing performance in windy conditions. The ACR software provides high-level commands that form a very easy user interface for the DSN operator. The operator only needs to enter two commands to start the antenna and subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and
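    The conical-scan step can be sketched as a first-harmonic fit (an illustrative toy, not the actual ACR flight code): while the beam traces a circle around the target, the received power varies approximately as A + B·cos(θ − φ), and the recovered phase φ gives the direction of the pointing error while B indicates its magnitude.

```python
import math

# Illustrative sketch of conical-scan pointing estimation (NOT the ACR
# flight code).  Signal samples taken at known scan angles are fitted to
# A + B*cos(theta - phi) via the first discrete Fourier coefficients.

def conical_scan_offset(angles, signals):
    """Return (phi, B): direction and strength of the pointing error."""
    n = len(signals)
    b_c = 2.0 / n * sum(s * math.cos(a) for a, s in zip(angles, signals))
    b_s = 2.0 / n * sum(s * math.sin(a) for a, s in zip(angles, signals))
    return math.atan2(b_s, b_c), math.hypot(b_c, b_s)
```

    For uniformly spaced scan angles the discrete orthogonality of sine and cosine makes this fit exact for a noiseless first-harmonic signal.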

  7. Software Certification - Coding, Code, and Coders

    Science.gov (United States)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  8. Semantics and correctness proofs for programs with partial functions

    International Nuclear Information System (INIS)

    Yakhnis, A.; Yakhnis, V.

    1996-01-01

    This paper presents a portion of the work on the specification, design, and implementation of safety-critical systems such as reactor control systems. A natural approach to this problem, once all the requirements are captured, would be to state the requirements formally and then either to prove (preferably via automated tools) that the system conforms to spec (program verification), or to try to simultaneously generate the system and a mathematical proof that the requirements are being met (program derivation). An obstacle to this is the frequent presence of partially defined operations within the software and its specifications. Indeed, the usual proofs via first-order logic presuppose everywhere-defined operations. Recognizing this problem, David Gries, in ''The Science of Programming,'' 1981, introduced the concept of partial functions into the mainstream of program correctness and gave hints as to how his treatment of partial functions could be formalized. Still, however, existing theorem provers and software verifiers have difficulties in checking software with partial functions, because of the absence of a uniform first-order treatment of partial functions within classical 2-valued logic. Several rigorous mechanisms that took partiality into account have been introduced [Wirsing 1990, Breu 1991, VDM 1986, 1990, etc.]. However, they either did not discuss correctness proofs or departed from first-order logic. To fill this gap, the authors provide a semantics for software correctness proofs with partial functions within classical 2-valued first-order logic. They formalize the Gries treatment of partial functions and also cover computations of functions whose argument lists may be only partially available. An example is nuclear reactor control relying on sensors which may fail to deliver sense data. This approach is sufficiently general to cover correctness proofs in various implementation languages.
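    One common first-order scheme for this (shown here as an illustration of the problem; not necessarily the authors' exact semantics) attaches a definedness predicate to each partial term, so every atomic formula carries the definedness conditions of its terms while the logic stays classical and 2-valued:

```latex
% A partial term t gets a definedness predicate D(t); an atomic formula
% containing t is read with D(t) as an explicit conjunct.
\[
  \frac{1}{x} > 0
  \quad\text{is read as}\quad
  x \neq 0 \;\wedge\; \frac{1}{x} > 0,
\]
\[
  \text{and for a sensor reading } \mathit{val}(s):\qquad
  P(\mathit{val}(s)) \quad\text{becomes}\quad
  D(\mathit{val}(s)) \;\wedge\; P(\mathit{val}(s)).
\]
```

    Under such a reading, a failed sensor (D false) makes the guarded claim false rather than meaningless, which is what lets a conventional 2-valued prover handle it.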

  9. Software design specification and analysis (NuFDS) approach for safety-critical software based on programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on a Programmable Logic Controller (PLC). Among the software development phases, the design phase plays the important role of connecting the requirements phase and the implementation phase, as a process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach is proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is presented in a straightforward manner. It consists of four major specifications: database, software architecture, system behavior, and PLC hardware configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are also suggested for formal design analysis in the NuFDS approach. In addition, for tool support, we are developing the NuSDS tool, which is based on the NuFDS approach and is intended especially for software design specification in nuclear fields.

  10. A Generic Software Safety Document Generator

    Science.gov (United States)

    Denney, Ewen; Venkatesan, Ram Prasad

    2004-01-01

    Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.

  11. Software life after in-service

    International Nuclear Information System (INIS)

    Tseng, M.; Eng, P.

    1993-01-01

    Software engineers and designers tend to conclude a software project at the in-service milestone of the software life cycle. But the reality is that the 'life after in-service' is significantly longer than the other phases of the life cycle, typically 20 years or more depending on the maintainability of the hardware platform and the designed life of the plant. During this period, the software asset (like other physical assets in the plant) continues to be upgraded to correct deficiencies, meet new requirements, cope with obsolescence of equipment and so on. The software life cycle ends with a migration of the software to a different platform. It is typical in a software development project to put a great deal of emphasis on design methodologies, techniques, tools, development environment, standard procedures, and project management to ensure that a quality product is delivered on schedule and within budget. More often than not, disproportionately little emphasis is placed on the issues and needs of the in-service phase. Once the software is in-service, the designers move on to other projects, while the maintenance and support staff must manage the software. This paper examines the issues in three steps. First, it presents a view of the software from the maintenance and support staff perspectives, including the complexity of the software, suitability of documentation, configuration management, training, difficulties and risks associated with making changes, and required skills and knowledge. Second, it identifies the concerns raised from these viewpoints, including the costs of maintaining the software, the ability to meet additional requirements, the availability of support tools, the length of time required to engineer and install changes, and a strategy for the migration of the software asset. Finally, it discusses some approaches to dealing with these concerns. (Author) 5 refs., fig

  12. The threshold contrast thickness evaluated with different CDMAM phantoms and software

    Directory of Open Access Journals (Sweden)

    Fabiszewska Ewa

    2016-03-01

    Full Text Available The image quality in digital mammography is described by specifying the thickness and diameter of disks with threshold visibility. The European Commission recommends the CDMAM phantom as a tool to evaluate threshold contrast visibility in digital mammography [1, 2]. Inaccuracy in the manufacturing process of CDMAM 3.4 phantoms (Artinis Medical System BV), as well as differences between the software used to analyze the images, may lead to discrepancies in the evaluation of threshold contrast visibility. The authors of this work used three CDMAM 3.4 phantoms with serial numbers 1669, 1840, and 1841 and two mammography systems of the same manufacturer with identical types of detectors. The images were analyzed with the EUREF software (version 1.5.5, with the CDCOM 1.6 exe file) and the Artinis software (version 1.2, with the CDCOM 1.6 exe file). The differences between the observed thicknesses of the threshold contrast structures, which were caused by differences between the CDMAM 3.4 phantoms, were not reproduced in the same way on the two mammography units of the same type. The thickness reported by the Artinis software (version 1.2) was generally greater than the one determined by the EUREF software (version 1.5.5), but the ratio of the results depended on the phantom and the diameter of the structure. It was not possible to establish correction factors which would allow correction of the differences between the results obtained for different CDMAM 3.4 phantoms, or to correct the differences between the software packages. Great care must be taken when results of tests performed with different CDMAM 3.4 phantoms and with different software applications are interpreted.

  13. 12: Assuring the quality of critical software

    International Nuclear Information System (INIS)

    Jacky, J.; Kalet, I.

    1987-01-01

    The authors recommend quality assurance procedures for radiation therapy software. Software quality assurance deals with preventing, detecting and repairing programming errors. Error detection difficulties are most severe in computer-based control systems, for example, therapy machine control systems, because it may be impossible for users to confirm correct operation while treatments are in progress, or to intervene if things go wrong. Software quality assurance techniques observed in other industries in which public safety is at risk are reviewed. In some of these industries software must be approved or certified before it can be used. Approval is subject to technical reviews and audits by experts other than the program authors. The main obstacles to the adoption of these techniques in the radiation therapy field are cost, lack of familiarity and doubts regarding efficacy. 18 refs

  14. Software for precise tracking of cell proliferation

    International Nuclear Information System (INIS)

    Kurokawa, Hiroshi; Noda, Hisayori; Sugiyama, Mayu; Sakaue-Sawano, Asako; Fukami, Kiyoko; Miyawaki, Atsushi

    2012-01-01

    Highlights: ► We developed software for analyzing cultured cells that divide as well as migrate. ► The active contour model (Snakes) was used as the core algorithm. ► The time backward analysis was also used for efficient detection of cell division. ► With user-interactive correction functions, the software enables precise tracking. ► The software was successfully applied to cells with fluorescently-labeled nuclei. -- Abstract: We have developed a multi-target cell tracking program TADOR, which we applied to a series of fluorescence images. TADOR is based on an active contour model that is modified in order to be free of the problem of locally optimal solutions, and thus is resistant to signal fluctuation and morphological changes. Due to adoption of backward tracing and addition of user-interactive correction functions, TADOR is used in an off-line and semi-automated mode, but enables precise tracking of cell division. By applying TADOR to the analysis of cultured cells whose nuclei had been fluorescently labeled, we tracked cell division and cell-cycle progression on coverslips over an extended period of time.

  15. Precise Documentation: The Key to Better Software

    Science.gov (United States)

    Parnas, David Lorge

    The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering much of the documentation is written before and during development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  16. PET motion correction using PRESTO with ITK motion estimation

    Energy Technology Data Exchange (ETDEWEB)

    Botelho, Melissa [Institute of Biophysics and Biomedical Engineering, Science Faculty of University of Lisbon (Portugal); Caldeira, Liliana; Scheins, Juergen [Institute of Neuroscience and Medicine (INM-4), Forschungszentrum Jülich (Germany); Matela, Nuno [Institute of Biophysics and Biomedical Engineering, Science Faculty of University of Lisbon (Portugal); Kops, Elena Rota; Shah, N Jon [Institute of Neuroscience and Medicine (INM-4), Forschungszentrum Jülich (Germany)

    2014-07-29

    The Siemens BrainPET scanner is a hybrid MRI/PET system. PET images are prone to motion artefacts, which degrade the image quality; therefore, motion correction is essential. The library PRESTO converts motion-corrected LORs into highly accurate generic projection data [1], providing high-resolution PET images. ITK is an open-source software toolkit used for registering multidimensional data. ITK provides the motion estimation necessary for PRESTO.

  17. PET motion correction using PRESTO with ITK motion estimation

    International Nuclear Information System (INIS)

    Botelho, Melissa; Caldeira, Liliana; Scheins, Juergen; Matela, Nuno; Kops, Elena Rota; Shah, N Jon

    2014-01-01

    The Siemens BrainPET scanner is a hybrid MRI/PET system. PET images are prone to motion artefacts, which degrade the image quality; therefore, motion correction is essential. The library PRESTO converts motion-corrected LORs into highly accurate generic projection data [1], providing high-resolution PET images. ITK is an open-source software toolkit used for registering multidimensional data. ITK provides the motion estimation necessary for PRESTO.
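    The correction step itself can be pictured with a minimal sketch (hypothetical code, not PRESTO's implementation): a line of response (LOR) is defined by its two detector endpoints, and applying the estimated rigid transform from the registration (e.g., as obtained via ITK) to both endpoints yields the motion-corrected LOR.

```python
import math

# Minimal, hypothetical sketch of LOR-based motion correction (NOT the
# PRESTO implementation): both endpoints of a line of response are mapped
# by the rigid transform estimated from image registration.

def rot_z(angle):
    """Rotation matrix about the z axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply_rigid(point, rotation, translation):
    """Apply p' = R p + t to a 3-D point."""
    x, y, z = point
    return tuple(
        rotation[i][0] * x + rotation[i][1] * y + rotation[i][2] * z + translation[i]
        for i in range(3)
    )

def correct_lor(p1, p2, rotation, translation):
    """Map both LOR endpoints into the motion-corrected frame.

    (rotation, translation) is assumed to already be the corrective
    transform, i.e. the inverse of the estimated head motion."""
    return (apply_rigid(p1, rotation, translation),
            apply_rigid(p2, rotation, translation))
```

    Rebinning these transformed endpoints into projection data, which PRESTO does with high geometric accuracy, is the part this sketch deliberately omits.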

  18. How much software verification and validation is adequate for nuclear safety?

    International Nuclear Information System (INIS)

    Fujii, R.U.

    1994-01-01

    For over 26 years, software verification and validation (V&V) has been applied to major DOD weapon systems, especially nuclear weapon systems, to ensure that the software is free of catastrophic software errors. Software V&V is a systems engineering discipline that evaluates the software as part of the entire system, including hardware, human operators, and other interfacing software. When applied from a systems perspective, software V&V has been proven to be an effective technique for the early detection and correction of errors. Several V&V cost/benefit case studies for the Rome Air Development Center have shown that the dollar savings from the early detection of errors clearly offset the cost of the software V&V.

  19. DEVELOPING EVALUATION INSTRUMENT FOR MATHEMATICS EDUCATIONAL SOFTWARE

    Directory of Open Access Journals (Sweden)

    Wahyu Setyaningrum

    2012-02-01

    Full Text Available The rapid increase and availability of mathematics software, either for classroom or individual learning activities, presents a challenge for teachers. It has been argued that many products are limited in quality. Some of the more commonly used software products have been criticized for poor content, activities which fail to address some learning issues, poor graphics presentation, inadequate documentation, and other technical problems. The challenge for schools is to ensure that the educational software used in classrooms is appropriate and effective in supporting intended outcomes and goals. This paper aims to develop an instrument for evaluating mathematics educational software in order to help teachers select appropriate software. The instrument considers educational aspects, including content, teaching and learning skills, interaction, and feedback and error correction; and technical aspects of educational software, including design, clarity, assessment and documentation, cost, and hardware and software interdependence. The instrument uses a checklist approach, one of the simplest and most effective methods for assessing the quality of educational software: the user only needs to put a tick against each criterion. The criteria in this instrument are adapted and extended from standard evaluation instruments in several references.   Keywords: mathematics educational software, educational aspect, technical aspect.

  20. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    Full Text Available The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today, (2) high-integrity systems through the C-by-C (Correctness-by-Construction) methodology, and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet and in publications and presentations at events. Among the research results, it was found that: (1) the dependence of nations, companies, and people on software systems is increasing, (2) there is a growing demand on software engineering to increase social trust in software systems, (3) methodologies exist, such as C-by-C, that can provide that level of trust, (4) formal methods constitute a principle of computer science that can be applied in software engineering to achieve a reliable software development process, (5) software users have the responsibility to demand reliable software products, and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods, (2) formal methods provide an unprecedented ability to increase trust in the exactness of software products, and (3) with the development of new methodologies and tools, cost is no longer a disadvantage for the application of formal methods.

  1. The MINERVA Software Development Process

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar A.; Dutle, Aaron M.

    2017-01-01

    This paper presents a software development process for safety-critical software components of cyber-physical systems. The process is called MINERVA, which stands for Mirrored Implementation Numerically Evaluated against Rigorously Verified Algorithms. The process relies on formal methods for rigorously validating code against its requirements. The software development process uses: (1) a formal specification language for describing the algorithms and their functional requirements, (2) an interactive theorem prover for formally verifying the correctness of the algorithms, (3) test cases that stress the code, and (4) numerical evaluation on these test cases of both the algorithm specifications and their implementations in code. The MINERVA process is illustrated in this paper with an application to geo-containment algorithms for unmanned aircraft systems. These algorithms ensure that the position of an aircraft never leaves a predetermined polygon region and provide recovery maneuvers when the region is inadvertently exited.
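    The geo-containment check at the heart of these algorithms can be illustrated with a plain ray-casting point-in-polygon test. This is a sketch only, not the formally verified MINERVA algorithm, which must additionally handle boundary and floating-point edge cases rigorously.

```python
# Hedged illustration: a ray-casting point-in-polygon test of the kind a
# geo-containment monitor might run on each position fix.  NOT the verified
# NASA/MINERVA algorithm; boundary points and numerical robustness are
# deliberately not treated here.

def inside_polygon(point, polygon):
    """Return True if point lies strictly inside the simple polygon,
    given as a list of (x, y) vertices in order."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray extending to the right.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

    In the MINERVA process, it is exactly this kind of implementation that would be numerically evaluated against the formally verified algorithm specification on stressing test cases.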

  2. Real-time distortion correction for visual inspection systems based on FPGA

    Science.gov (United States)

    Liang, Danhua; Zhang, Zhaoxia; Chen, Xiaodong; Yu, Daoyin

    2008-03-01

    Visual inspection is a new technology based on computer vision research, which focuses on the measurement of an object's geometry and location. It can be widely used in online measurement and other real-time measurement processes. Because of the defects of traditional visual inspection, a new visual detection mode, all-digital intelligent acquisition and transmission, is presented. The image processing, including filtering, image compression, binarization, edge detection and distortion correction, can be completed in programmable devices (FPGAs). As a wide-field-angle lens is adopted in the system, the output images have serious distortion. Limited by the calculating speed of a computer, software can only correct the distortion of static images, not dynamic images. To meet the real-time requirement, we design a distortion correction system based on FPGA. The hardware distortion correction method is to calculate the spatial correction data first in software, then convert them into hardware storage addresses and store them in a hardware look-up table, from which data can be read out to correct the gray levels. The major benefit of using FPGA is that the same circuit can be used for other circularly symmetric wide-angle lenses without being modified.
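    The offline-LUT scheme described above can be sketched in a few lines (hypothetical radial-distortion model and parameters, shown in Python rather than HDL): the spatial mapping is computed once, stored as a per-pixel source address, and each frame is then corrected by simple table reads, which is what makes the FPGA implementation fast.

```python
# Hypothetical sketch of LUT-based distortion correction (software model of
# the scheme, not the FPGA circuit).  A simple radial model r_d = r*(1+k1*r^2)
# stands in for the real lens calibration.

def build_lut(width, height, k1, cx, cy):
    """Offline step: for each corrected pixel, precompute the distorted
    source pixel address."""
    lut = {}
    for y in range(height):
        for x in range(width):
            dx, dy = x - cx, y - cy
            scale = 1.0 + k1 * (dx * dx + dy * dy)
            sx = int(round(cx + dx * scale))
            sy = int(round(cy + dy * scale))
            if 0 <= sx < width and 0 <= sy < height:
                lut[(x, y)] = (sx, sy)
    return lut

def correct_frame(frame, lut, width, height):
    """Per-frame step: pure table lookups, no arithmetic (pixels whose
    source falls outside the sensor are filled with 0)."""
    return [[frame[lut[(x, y)][1]][lut[(x, y)][0]] if (x, y) in lut else 0
             for x in range(width)] for y in range(height)]
```

    In hardware, `lut` becomes a block-RAM table of source addresses and `correct_frame` becomes a streaming address generator, so the per-frame cost is constant regardless of the distortion model.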

  3. An effort allocation model considering different budgetary constraint on fault detection process and fault correction process

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2016-01-01

    Full Text Available Fault detection process (FDP) and fault correction process (FCP) are important phases of the software development life cycle (SDLC). It is essential for software to undergo a testing phase, during which faults are detected and corrected. The main goal of this article is to allocate the testing resources in an optimal manner to minimize the cost during the testing phase using FDP and FCP under a dynamic environment. In this paper, we first assume that there is a time lag between fault detection and fault correction. Thus, removal of a fault is performed after the fault is detected. In addition, the detection process and correction process are taken to be independent simultaneous activities with different budgetary constraints. A structured optimal policy based on optimal control theory is proposed for software managers to optimize the allocation of the limited resources under reliability criteria. Furthermore, the release policy for the proposed model is also discussed. A numerical example is given in support of the theoretical results.
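    The detection/correction time-lag assumption can be made concrete with standard software-reliability forms (hypothetical parameters, and a fixed lag rather than the paper's optimal-control formulation): detection follows an exponential NHPP mean value function, and correction is the same curve delayed by the lag.

```python
import math

# Illustrative sketch using standard software-reliability forms (hypothetical
# parameters; NOT the paper's optimal-control model): exponential NHPP fault
# detection, with correction modeled as detection delayed by a constant lag.

def detected(t, a, b):
    """Expected faults detected by time t: m_d(t) = a * (1 - exp(-b*t))."""
    return a * (1.0 - math.exp(-b * t))

def corrected(t, a, b, tau):
    """Expected faults corrected by time t: detection delayed by lag tau."""
    return detected(t - tau, a, b) if t > tau else 0.0
```

    The gap detected(t) − corrected(t) is the backlog of detected-but-unremoved faults; the paper's contribution is choosing the resource allocation over time that controls this backlog at minimum cost.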

  4. Structuring Interactive Correctness Proofs by Formalizing Coding Idioms

    OpenAIRE

    Gast, Holger

    2012-01-01

    This paper examines a novel strategy for developing correctness proofs in interactive software verification for C programs. Rather than proceeding backwards from the generated verification conditions, we start by developing a library of the employed data structures and related coding idioms. The application of that library then leads to correctness proofs that reflect informal arguments about the idioms. We apply this strategy to the low-level memory allocator of the L4 microkernel, a case st...

  5. A research review of quality assessment for software

    Science.gov (United States)

    1991-01-01

    Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because the documentation of many factors about a software component, such as its efficiency, portability, and development history, constitutes a class of factors that are important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.

  6. Dedicated software for diffractive optics design and simulation

    International Nuclear Information System (INIS)

    Firsov, A; Brzhezinskaya, M; Erko, A; Firsov, A; Svintsov, A

    2013-01-01

    An efficient software package for the structure design and simulation of the imaging properties of diffractive optical elements has been developed. It operates with a point source and consists of: the ZON software, to calculate the structure of an optical element in transmission and reflection; the KRGF software, to simulate the diffraction properties of an ideal optical element with a point source; and the DS software, to calculate the diffraction properties taking into consideration material and shadowing effects. Optional software allows simulation with a real, non-point source. Zone plate thickness profile, source shape, as well as substrate curvature are considered in this calculation. This is especially important for diffractive focusing elements and gratings at total external reflection, given that the lateral size of the structure can be up to 1 m. The program package can be used in combination with the Nanomaker software to prepare data for ion- and e-beam surface modifications and corrections.

  7. Characterizing the contribution of quality requirements to software sustainability

    NARCIS (Netherlands)

    Condori-Fernandez, Nelly; Lago, Patricia

    2018-01-01

    Most respondents considered modifiability as relevant for addressing both technical and environmental sustainability. Functional correctness, availability, modifiability, interoperability and recoverability favor positively the endurability of software systems. This study has also identified

  8. Building quality into performance and safety assessment software

    International Nuclear Information System (INIS)

    Wojciechowski, L.C.

    2011-01-01

    Quality assurance is integrated throughout the development lifecycle for performance and safety assessment software. The software used in the performance and safety assessment of a Canadian deep geological repository (DGR) follows the CSA quality assurance standard CSA-N286.7 [1], Quality Assurance of Analytical, Scientific and Design Computer Programs for Nuclear Power Plants. Quality assurance activities in this standard include tasks such as verification and inspection; however, much more is involved in producing a quality software computer program. The types of errors found with different verification methods are described. The integrated quality process ensures that defects are found and corrected as early as possible. (author)

  9. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
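    The kind of tree induction used here can be illustrated with a minimal entropy-based split on fabricated metric data (a toy sketch, not the paper's actual tree generator or the NASA data): the tree builder repeatedly picks the metric threshold whose binary split best separates high-effort from low-effort modules.

```python
import math

# Toy sketch of decision-tree induction for software resource analysis
# (hypothetical data; NOT the paper's tool).  One step: choose the threshold
# on a metric that minimizes the weighted entropy of the resulting split.

def entropy(labels):
    """Shannon entropy of a list of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def best_split(values, labels):
    """Return (threshold, weighted_entropy) of the best binary split of the
    metric values against the high-effort labels."""
    best = (None, float("inf"))
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        w = (len(left) * entropy(left) + len(right) * entropy(right)) / len(labels)
        if w < best[1]:
            best = (t, w)
    return best
```

    Applying this step recursively to each resulting partition, over all 74 attributes rather than one, is essentially how such classification trees are grown automatically.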

  10. Software compensation in Particle Flow reconstruction

    CERN Document Server

    Lan Tran, Huong; Sefkow, Felix; Green, Steven; Marshall, John; Thomson, Mark; Simon, Frank

    2017-01-01

    The Particle Flow approach to calorimetry requires highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analog energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in Particle Flow reconstruct...
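    The re-weighting idea can be sketched schematically (hypothetical weight function and parameters; the real parameterizations are tuned per detector inside the Particle Flow software): hits in dense, electromagnetic-like regions of a hadron shower are weighted down, while hits in sparse, hadron-like regions are weighted up, compensating for the calorimeter's lower hadronic response.

```python
import math

# Schematic illustration of software compensation (hypothetical weight
# function and parameters, NOT an actual detector parameterization).

def compensated_energy(hits, weight):
    """hits: list of (hit_energy, local_energy_density); weight maps a
    density to a multiplicative correction factor."""
    return sum(e * weight(rho) for e, rho in hits)

def example_weight(rho, w_max=1.4, w_min=0.8, scale=5.0):
    """Hypothetical monotonically decreasing weight: low-density (hadron-like)
    deposits get factors near w_max, high-density (EM-like) near w_min."""
    return w_min + (w_max - w_min) * math.exp(-rho / scale)
```

    In practice the weights are obtained by fitting single-particle test-beam or simulation data so that the compensated response is flat in energy, which is what improves both single-particle and jet energy resolution.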

  11. ETICS meta-data software editing - from check out to commit operations

    International Nuclear Information System (INIS)

    Begin, M-E; Sancho, G D-A; Ronco, S D; Gentilini, M; Ronchieri, E; Selmi, M

    2008-01-01

    People involved in modular projects need to improve the build process, planning the correct execution order and detecting circular dependencies. The lack of suitable tools may cause delays in the development, deployment and maintenance of the software. Experience in such projects has shown that version control and build systems alone cannot support software development efficiently, because of the large number of errors, each of which breaks the build process. Common causes of errors include the adoption of new libraries, library incompatibilities, and the extension of the current project to support new software modules. In this paper, we describe a possible solution implemented in ETICS, an integrated infrastructure for the automated configuration, build and test of Grid and distributed software. ETICS has defined meta-data software abstractions from which it is possible to download, build and test software projects, setting for instance dependencies, environment variables and properties. Furthermore, the meta-data information is managed by ETICS following the philosophy of a version control system: there is a meta-data repository, and a list of operations, such as check out and commit, is handled. All the information related to a specific piece of software is stored in the repository only when it is considered to be correct. By means of this solution, we introduce flexibility into the ETICS system, allowing users to work according to their needs. Moreover, by introducing this functionality, ETICS will behave like a version control system for the management of the meta-data
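The build-order planning and circular-dependency detection that this record describes can be illustrated with a plain depth-first topological sort. The module names are hypothetical and the sketch is not the ETICS implementation.

```python
# Sketch of build-order planning with circular-dependency detection:
# each module lists the modules it must be built after.

def build_order(deps):
    """Return a build order via depth-first topological sort,
    raising ValueError on a circular dependency."""
    order, state = [], {}          # state: 1 = visiting, 2 = done
    def visit(mod):
        if state.get(mod) == 1:
            raise ValueError("circular dependency at " + mod)
        if state.get(mod) == 2:
            return
        state[mod] = 1
        for dep in deps.get(mod, []):
            visit(dep)
        state[mod] = 2
        order.append(mod)          # all prerequisites already emitted
    for mod in deps:
        visit(mod)
    return order

deps = {"app": ["libio", "libmath"], "libio": ["libcore"],
        "libmath": ["libcore"], "libcore": []}
order = build_order(deps)          # libcore first, app last
```

A cyclic dependency graph, e.g. `{"a": ["b"], "b": ["a"]}`, makes `build_order` raise instead of silently producing a broken build plan.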

  12. Evaluation of students and teachers concerning the "Vital Signs" software

    Directory of Open Access Journals (Sweden)

    Marcos Venícios de Oliveira Lopes

    2004-12-01

    Full Text Available This article aimed to gather the opinions of students and teachers about the "Vital Signs" ("Sinais Vitais") software. The study was carried out in the Nursing Department of the Federal University of Ceará. The sample consisted of six students and three teachers, who were interviewed after using the software. The interviews generated ten categories, which were grouped into two themes: features that encouraged the use of the "Vital Signs" software, and educationally correct software. The results showed that the teachers valued the correctness of the content, while the students focused more on the dynamics of the program.

  13. Energy efficiency of error correction on wireless systems

    NARCIS (Netherlands)

    Havinga, Paul J.M.

    1999-01-01

    Since high error rates are inevitable in the wireless environment, energy-efficient error control is an important issue for mobile computing systems. We have studied the energy efficiency of two different error correction mechanisms and have measured the efficiency of an implementation in software.
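As a concrete example of a software error correction mechanism of the kind whose energy cost such a study measures, here is a Hamming(7,4) encoder/decoder that corrects any single-bit error. The abstract does not name its two mechanisms; this is only an illustrative one.

```python
# Hamming(7,4): 4 data bits protected by 3 parity bits. Codeword layout
# (1-based positions): p1 p2 d1 p3 d2 d3 d4, where parity bit at
# position 2**k covers every position whose k-th bit is set.

def encode(d):
    """d: list of 4 data bits -> 7-bit Hamming(7,4) codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct up to one flipped bit and return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3     # syndrome = 1-based error position
    if pos:
        c = list(c)
        c[pos - 1] ^= 1            # correct the flipped bit
    return [c[2], c[4], c[5], c[6]]
```

The energy trade-off such a paper examines is exactly this kind of scheme: every codeword costs extra bits to transmit (3 per 4 data bits here), but avoids the far more expensive retransmission of a whole packet.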

  14. Protection of Mobile Agents Execution Using a Modified Self-Validating Branch-Based Software Watermarking with External Sentinel

    Science.gov (United States)

    Tomàs-Buliart, Joan; Fernández, Marcel; Soriano, Miguel

    Critical infrastructures are usually controlled by software entities. To monitor the correct functioning of these entities, a solution based on the use of mobile agents is proposed. Some proposals to detect modifications of mobile agents, such as digital signatures of code, exist, but they are oriented towards protecting software against modification or verifying that an agent has been executed correctly. The aim of our proposal is to guarantee that the software is being executed correctly by a non-trusted host. We achieve this objective by improving the Self-Validating Branch-Based Software Watermarking of Myles et al. The proposed modification is the incorporation of an external element, called a sentinel, which controls branch targets. Applied to mobile agents, this technique can guarantee the correct operation of an agent or, at least, detect suspicious behaviour of a malicious host during the execution of the agent, instead of detecting it only after the execution of the agent has finished.
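The sentinel idea can be sketched as an external party that holds a digest of the agent's expected branch targets and re-checks it during execution. The function names and the digest scheme below are hypothetical illustrations, not the actual construction of Myles et al. or of this paper.

```python
import hashlib

# Hypothetical sketch: the agent carries a table of expected branch
# targets; an external sentinel holds a digest of that table and
# re-verifies it while the agent runs on the untrusted host.

def table_digest(branch_table):
    blob = ",".join(f"{src}->{dst}" for src, dst in sorted(branch_table.items()))
    return hashlib.sha256(blob.encode()).hexdigest()

class Sentinel:
    def __init__(self, branch_table):
        self.expected = table_digest(branch_table)

    def verify(self, branch_table):
        # A malicious host that rewrites a branch target changes the digest.
        return table_digest(branch_table) == self.expected

table = {"check_license": "run_payload", "run_payload": "report_home"}
sentinel = Sentinel(table)
ok_before = sentinel.verify(table)
table["check_license"] = "skip_payload"   # tampering by the host
ok_after = sentinel.verify(table)
```

The point of making the sentinel external is that the host cannot simply patch the verification logic along with the branch targets, since the reference digest never resides on the untrusted machine.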

  15. A NEW EXHAUST VENTILATION SYSTEM DESIGN SOFTWARE

    Directory of Open Access Journals (Sweden)

    H. Asilian Mahabady

    2007-09-01

    Full Text Available A Microsoft Windows based ventilation software package is developed to reduce the time-consuming and tedious procedure of exhaust ventilation system design. The program assures accurate and reliable air-pollution-control related calculations. The package is tentatively named Exhaust Ventilation Design Software and is developed in the VB6 programming environment. The most important features of Exhaust Ventilation Design Software that are ignored in formerly developed packages are collector design and fan dimension data calculations. Automatic system balancing is another feature of this package. The design algorithm of Exhaust Ventilation Design Software is based on two methods: balance by design (static pressure balance) and design by blast gate. The most important section of the software is a spreadsheet designed according to the American Conference of Governmental Industrial Hygienists calculation sheets. Exhaust Ventilation Design Software is developed so that engineers familiar with the American Conference of Governmental Industrial Hygienists datasheet can easily employ it for ventilation system design. Other sections include a collector design section (settling chamber, cyclone, and packed tower), a fan geometry and dimension data section, a unit converter section (which helps engineers deal with units), a hood design section and a Persian HTML help. Psychrometric correction is also considered in Exhaust Ventilation Design Software. In the design process, efforts are focused on improving the GUI (graphical user interface) and the use of programming standards in software design. The reliability of the software has been evaluated and the results show acceptable accuracy.
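One elementary step such spreadsheet-style design tools perform is computing duct velocity and velocity pressure. The sketch below uses the common ACGIH relation for standard air in imperial units, VP = (V / 4005)^2 with V in ft/min and VP in inches w.g.; the duct size and flow rate are illustrative, and this is not the package's own code.

```python
import math

# Duct velocity and velocity pressure for standard air, imperial units.
# ACGIH convention: VP [in. w.g.] = (V [fpm] / 4005) ** 2.

def duct_velocity(q_cfm, diameter_in):
    """Air velocity (ft/min) for flow q_cfm through a round duct."""
    area_ft2 = math.pi * (diameter_in / 12.0) ** 2 / 4.0
    return q_cfm / area_ft2

def velocity_pressure(v_fpm):
    """Velocity pressure (inches w.g.) for standard air."""
    return (v_fpm / 4005.0) ** 2

v = duct_velocity(1000.0, 8.0)    # 1000 cfm through an 8-inch duct
vp = velocity_pressure(v)
```

Static-pressure balancing then proceeds by summing such velocity-pressure-based losses along each branch and adjusting duct sizes until the branches entering a junction carry equal static pressure.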

  16. Software inspections at Fermilab -- Use and experience

    International Nuclear Information System (INIS)

    Berman, E.F.

    1998-01-01

    Because of the critical nature of DA/Online software it is important to commission software which is correct, usable, reliable, and maintainable, i.e., has the highest quality possible. In order to help meet these goals, Fermi National Accelerator Laboratory (Fermilab) has begun implementing a formal software inspection process. Formal inspections are used to reduce the number of defects in software at as early a stage as possible. These inspections, in use at a wide variety of institutions (e.g., NASA, Motorola), implement a well-defined procedure that can be used to improve the quality of many different types of deliverables. The inspection process, initially designed by Michael Fagan, will be described as it was developed and as it is currently implemented at Fermilab, where it has been used to improve the quality of a variety of different experiment DA/Online software. Benefits of applying inspections at many points in the software life-cycle and benefits to the people involved will be investigated. Experience with many different types of inspections and the lessons learned about the inspection process itself will be detailed. Finally, the future of inspections at Fermilab will be given.

  17. SU-E-P-43: A Knowledge Based Approach to Guidelines for Software Safety

    International Nuclear Information System (INIS)

    Salomons, G; Kelly, D

    2015-01-01

    Purpose: In the fall of 2012, a survey was distributed to medical physicists across Canada. The survey asked the respondents to comment on various aspects of software development and use in their clinics. The survey revealed that most centers employ locally produced (in-house) software of some kind. The respondents also indicated an interest in having software guidelines, but cautioned that the realities of cancer clinics include variations that preclude a simple solution. Traditional guidelines typically involve periodically repeating a set of prescribed tests with defined tolerance limits. However, applying a similar formula to software is problematic, since it assumes that the users have a perfect knowledge of how and when to apply the software, and that if the software operates correctly under one set of conditions it will operate correctly under all conditions. Methods: In the approach presented here, the personnel involved with the software are included as an integral part of the system. Activities performed to improve the safety of the software are done with both software and people in mind. A learning-oriented approach is taken, following the premise that the best approach to safety is increasing the understanding of those associated with the use or development of the software. Results: The software guidance document is organized by areas of knowledge related to the use and development of software. The categories include: knowledge of the underlying algorithm and its limitations; knowledge of the operation of the software, such as input values, parameters, error messages, and interpretation of output; and knowledge of the environment of the software, including both data and users. Conclusion: We propose a new approach to developing guidelines which is based on acquiring knowledge rather than performing tests. The ultimate goal is to provide robust software guidelines which will be practical and effective

  18. SU-E-P-43: A Knowledge Based Approach to Guidelines for Software Safety

    Energy Technology Data Exchange (ETDEWEB)

    Salomons, G [Cancer Center of Southeastern Ontario & Queen’s University, Kingston, ON (Canada); Kelly, D [Royal Military College of Canada, Kingston, ON, CA (Canada)

    2015-06-15

    Purpose: In the fall of 2012, a survey was distributed to medical physicists across Canada. The survey asked the respondents to comment on various aspects of software development and use in their clinics. The survey revealed that most centers employ locally produced (in-house) software of some kind. The respondents also indicated an interest in having software guidelines, but cautioned that the realities of cancer clinics include variations that preclude a simple solution. Traditional guidelines typically involve periodically repeating a set of prescribed tests with defined tolerance limits. However, applying a similar formula to software is problematic, since it assumes that the users have a perfect knowledge of how and when to apply the software, and that if the software operates correctly under one set of conditions it will operate correctly under all conditions. Methods: In the approach presented here, the personnel involved with the software are included as an integral part of the system. Activities performed to improve the safety of the software are done with both software and people in mind. A learning-oriented approach is taken, following the premise that the best approach to safety is increasing the understanding of those associated with the use or development of the software. Results: The software guidance document is organized by areas of knowledge related to the use and development of software. The categories include: knowledge of the underlying algorithm and its limitations; knowledge of the operation of the software, such as input values, parameters, error messages, and interpretation of output; and knowledge of the environment of the software, including both data and users. Conclusion: We propose a new approach to developing guidelines which is based on acquiring knowledge rather than performing tests. The ultimate goal is to provide robust software guidelines which will be practical and effective.

  19. Technique for unit testing of safety software verification and validation

    International Nuclear Information System (INIS)

    Li Duo; Zhang Liangju; Feng Junting

    2008-01-01

    The key issue arising from the digitalization of the reactor protection system for nuclear power plants is how to carry out verification and validation (V and V) to demonstrate and confirm that the software that performs reactor safety functions is safe and reliable. One of the most important processes in software V and V is unit testing, which verifies and validates the software coding against the conceptual design for consistency, correctness and completeness during software development. This paper presents a preliminary study on techniques for unit testing in safety software V and V, focusing on such aspects as how to confirm test completeness, how to establish a test platform, how to develop test cases and how to carry out unit testing. The technique discussed here was successfully used in the unit testing of the safety software of a digital reactor protection system. (authors)
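A minimal illustration of the kind of unit test case such V and V work develops: verifying a small coded function against its design, including the boundary condition. The trip logic and its setpoint are hypothetical, not taken from the actual protection-system software.

```python
import unittest

def trip_signal(pressure, setpoint=15.5):
    """Return True when measured pressure exceeds the trip setpoint.
    (Illustrative logic; the setpoint value is made up.)"""
    return pressure > setpoint

class TripSignalTest(unittest.TestCase):
    def test_below_setpoint_does_not_trip(self):
        self.assertFalse(trip_signal(10.0))

    def test_at_setpoint_does_not_trip(self):
        # Boundary case taken straight from the design specification.
        self.assertFalse(trip_signal(15.5))

    def test_above_setpoint_trips(self):
        self.assertTrue(trip_signal(15.6))

# Run the suite programmatically rather than via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TripSignalTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Test completeness in this setting means covering each requirement clause and its boundaries, which is why the at-setpoint case is a separate test rather than being folded into the below-setpoint one.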

  20. Performance assessment of the commercial CFD software for the prediction of the PWR internal flow - Corrected version

    International Nuclear Information System (INIS)

    Lee, Gong Hee; Bang, Young Seok; Woo, Sweng Woong; Cheong, Ae Ju; Kim, Do Hyeong; Kang, Min Ku

    2013-01-01

    As computer hardware technology develops, license applicants for nuclear power plants use commercial CFD software with the aim of reducing the excessive conservatism associated with simplified and conservative analysis tools. Even if some CFD software developers and users think that state-of-the-art CFD software can reasonably solve at least single-phase nuclear reactor safety problems, there are still limitations and uncertainties in the calculated results. From a regulatory perspective, the Korea Institute of Nuclear Safety (KINS) is presently conducting a performance assessment of commercial CFD software for nuclear reactor safety problems. In this study, in order to examine the prediction performance of commercial CFD software with a porous model in the analysis of the scaled-down APR+ (Advanced Power Reactor Plus) internal flow, simulations were conducted with the built-in numerical models of ANSYS CFX R.14 and FLUENT R.14. It was concluded that, depending on the CFD software, the internal flow distribution of the scaled-down APR+ was locally somewhat different. Although there was a limitation in estimating the prediction performance of the commercial CFD software due to the limited number of measured data, CFX R.14 showed more reasonable predicted results than FLUENT R.14. Meanwhile, due to the difference in discretization methodology, FLUENT R.14 required more computational memory than CFX R.14 for the same grid system. Therefore the CFD software suitable for the available computational resources should be selected for massive parallel computations. (authors)

  1. Software compensation in particle flow reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Tran, Huong Lan; Krueger, Katja; Sefkow, Felix [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Green, Steven; Marshall, John; Thomson, Mark [Cavendish Laboratory, Cambridge (United Kingdom); Simon, Frank [Max-Planck-Institut fuer Physik, Muenchen (Germany)

    2017-10-15

    The particle flow approach to calorimetry benefits from highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analogue energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in particle flow reconstruction with the Pandora Particle Flow Algorithm (PandoraPFA). The impact of software compensation on the choice of optimal transverse granularity for the analogue hadronic calorimeter option of the International Large Detector (ILD) concept is also discussed.
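The core of the technique described above is reweighting each calorimeter hit by its local energy density, so that dense (electromagnetic-like) deposits are weighted down relative to sparse hadronic ones in a non-compensating calorimeter. The density bins and weights below are purely illustrative, not PandoraPFA's actual parametrisation.

```python
# Minimal sketch of energy-density-based software compensation.
# WEIGHTS maps a local-energy-density threshold to a hit weight:
# low-density (hadronic-like) hits are weighted up, high-density
# (electromagnetic-like) hits are weighted down. Values are made up.

WEIGHTS = [(0.0, 1.15), (2.0, 1.0), (10.0, 0.85)]

def hit_weight(density):
    """Weight for a hit with the given local energy density."""
    w = WEIGHTS[0][1]
    for threshold, weight in WEIGHTS:
        if density >= threshold:
            w = weight
    return w

def compensated_energy(hits):
    """hits: list of (hit energy, local energy density) pairs."""
    return sum(e * hit_weight(rho) for e, rho in hits)

hits = [(1.0, 0.5), (2.0, 5.0), (3.0, 20.0)]
raw = sum(e for e, _ in hits)
corrected = compensated_energy(hits)
```

In a real reconstruction the weights are energy-dependent functions tuned on test-beam or simulated showers; the sketch only shows the structural idea that the correction is applied hit by hit using local density.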

  2. Software compensation in particle flow reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Tran, Huong Lan; Krueger, Katja; Sefkow, Felix [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Green, Steven; Marshall, John; Thomson, Mark [Cavendish Laboratory, Cambridge (United Kingdom); Simon, Frank [Max-Planck-Institut fuer Physik, Muenchen (Germany)

    2017-10-15

    The particle flow approach to calorimetry benefits from highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analogue energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in particle flow reconstruction with the Pandora Particle Flow Algorithm (PandoraPFA). The impact of software compensation on the choice of optimal transverse granularity for the analogue hadronic calorimeter option of the International Large Detector (ILD) concept is also discussed. (orig.)

  3. Software compensation in particle flow reconstruction

    International Nuclear Information System (INIS)

    Tran, Huong Lan; Krueger, Katja; Sefkow, Felix; Green, Steven; Marshall, John; Thomson, Mark; Simon, Frank

    2017-10-01

    The particle flow approach to calorimetry benefits from highly granular calorimeters and sophisticated software algorithms in order to reconstruct and identify individual particles in complex event topologies. The high spatial granularity, together with analogue energy information, can be further exploited in software compensation. In this approach, the local energy density is used to discriminate electromagnetic and purely hadronic sub-showers within hadron showers in the detector to improve the energy resolution for single particles by correcting for the intrinsic non-compensation of the calorimeter system. This improvement in the single particle energy resolution also results in a better overall jet energy resolution by improving the energy measurement of identified neutral hadrons and improvements in the pattern recognition stage by a more accurate matching of calorimeter energies to tracker measurements. This paper describes the software compensation technique and its implementation in particle flow reconstruction with the Pandora Particle Flow Algorithm (PandoraPFA). The impact of software compensation on the choice of optimal transverse granularity for the analogue hadronic calorimeter option of the International Large Detector (ILD) concept is also discussed.

  4. Software safety analysis on the model specified by NuSCR and SMV input language at requirements phase of software development life cycle using SMV

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2005-01-01

    A safety-critical software process is composed of a development process, a verification and validation (V and V) process and a safety analysis process. The safety analysis process has often been treated as an additional process and is not found in a conventional software process. But software safety analysis (SSA) is required if software is applied to a safety system, and the SSA shall be performed independently for the safety software throughout the software development life cycle (SDLC). Of all the phases in software development, requirements engineering is generally considered to play the most critical role in determining the overall software quality. NASA data demonstrate that nearly 75% of failures found in operational software were caused by errors in the requirements. The verification process in the requirements phase checks the correctness of the software requirements specification, and the safety analysis process analyzes the safety-related properties in detail. In this paper, a method for safety analysis at the requirements phase of the software development life cycle using the symbolic model verifier (SMV) is proposed. Hazards are discovered by hazard analysis and, in order to use SMV for the safety analysis, the safety-related properties are expressed in computation tree logic (CTL)
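What a model checker does with a CTL safety property of the form AG safe ("on all paths, globally, safe holds") can be shown with a toy explicit-state check: enumerate the reachable states of a small transition system and test the invariant on each. SMV does this symbolically; the states and transitions below are illustrative, not from the paper's requirements model.

```python
# Toy explicit-state check of the CTL invariant AG safe.

def check_ag(initial, transitions, safe):
    """Return (holds, counterexample_state) for AG safe
    over the states reachable from `initial`."""
    seen, stack = set(), [initial]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        if not safe(s):
            return False, s            # reachable unsafe state found
        stack.extend(transitions.get(s, []))
    return True, None

# A tiny hypothetical trip channel; the trip state latches.
transitions = {
    "idle": ["measuring"],
    "measuring": ["idle", "tripped"],
    "tripped": ["tripped"],
}
holds, bad = check_ag("idle", transitions, safe=lambda s: s != "error")
```

When the property fails, a model checker additionally reports the path leading to the counterexample state, which is what makes it useful for hazard analysis; this sketch returns only the offending state.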

  5. Comparison of computed tomography dose reporting software

    International Nuclear Information System (INIS)

    Abdullah, A.; Sun, Z.; Pongnapang, N.; Ng, K. H.

    2008-01-01

    Computed tomography (CT) dose reporting software facilitates the estimation of doses to patients undergoing CT examinations. In this study, a comparison of three software packages, i.e. CT-Expo (version 1.5, Medizinische Hochschule, Hannover (Germany)), ImPACT CT Patient Dosimetry Calculator (version 0.99x, Imaging Performance Assessment on Computed Tomography, www.impactscan.org) and WinDose (version 2.1a, Wellhofer Dosimetry, Schwarzenbruck (Germany)), has been made in terms of their calculation algorithms and the calculated doses. Estimations were performed for head, chest, abdominal and pelvic examinations based on the protocols recommended by European guidelines, using single-slice CT (SSCT) (Siemens Somatom Plus 4, Erlangen (Germany)) and multi-slice CT (MSCT) (Siemens Sensation 16, Erlangen (Germany)) for software-based female and male phantoms. The results showed some differences in the final dose reports provided by these software packages, with deviations in the effective doses they produce. The coefficient of variation ranges from 3.3 to 23.4 % for SSCT and from 10.6 to 43.8 % for MSCT. It is important that researchers state the name of the software used to estimate the various CT dose quantities. Users must also understand the equivalent terminologies between the information obtained from the CT console and the software packages in order to use the software correctly. (authors)
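The coefficient-of-variation figures quoted in such comparisons are computed as CV% = 100 × (sample standard deviation / mean) of the effective doses reported by the different packages for the same examination. The dose values below are illustrative, not the study's data.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation, in percent, of a sample of values."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical effective doses (mSv) reported by three packages
# for the same chest protocol and phantom.
effective_doses_mSv = [1.8, 2.1, 2.4]
cv = cv_percent(effective_doses_mSv)
```

A CV of roughly 14 % for these made-up numbers sits inside the SSCT range quoted above, which is why the study stresses naming the package used: the same scan can yield noticeably different reported doses.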

  6. A Behavior-Preserving Translation From FBD Design to C Implementation for Reactor Protection System Software

    International Nuclear Information System (INIS)

    Yoo, Junbeom; Kim, Euisub; Lee, Jangsoo

    2013-01-01

    Software safety for nuclear reactor protection systems (RPSs) is the most important requirement for the obtainment of permission for operation and export from government authorities, which is why it should be managed with well-experienced software development processes. The RPS software is typically modeled with function block diagrams (FBDs) in the design phase, and then mechanically translated into C programs in the implementation phase, which is finally compiled into executable machine codes and loaded on RPS hardware - PLC (Programmable Logic Controller). Whereas C Compilers are fully-verified COTS (Commercial Off-The-Shelf) software, translators from FBDs to C programs are provided by PLC vendors. Long-term experience, experiments and simulations have validated their correctness and function safety. This paper proposes a behavior-preserving translation from FBD design to C implementation for RPS software. It includes two sets of translation algorithms and rules as well as a prototype translator. We used an example of RPS software in a Korean nuclear power plant to demonstrate the correctness and effectiveness of the proposed translation
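The flavor of a mechanical FBD-to-C translation can be shown with a tiny sketch: each function block becomes one C assignment, emitted in execution order. The block set, naming scheme and rules below are hypothetical illustrations, not the vendor translator's or the paper's actual algorithms.

```python
# Hypothetical sketch of FBD-to-C translation: each function block
# (output, operator, inputs) becomes one C statement.

C_OPS = {"AND": "&&", "OR": "||"}

def fbd_to_c(blocks):
    """blocks: list of (output, op, [inputs]) in execution order."""
    lines = []
    for out, op, ins in blocks:
        if op == "NOT":
            lines.append(f"{out} = !{ins[0]};")
        else:
            expr = f" {C_OPS[op]} ".join(ins)
            lines.append(f"{out} = {expr};")
    return "\n".join(lines)

# A toy trip diagram: trip when pressure AND temperature are high,
# unless a manual bypass is active.
diagram = [("n1", "AND", ["hi_pressure", "hi_temp"]),
           ("n2", "NOT", ["manual_bypass"]),
           ("trip", "AND", ["n1", "n2"])]
c_code = fbd_to_c(diagram)
```

Behavior preservation, the paper's subject, means proving that for every input valuation the generated C statements compute the same outputs as the FBD semantics, including the intermediate-signal ordering the translator chooses.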

  7. A Behavior-Preserving Translation From FBD Design to C Implementation for Reactor Protection System Software

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Junbeom; Kim, Euisub [Konkuk Univ., Seoul (Korea, Republic of); Lee, Jangsoo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-08-15

    Software safety for nuclear reactor protection systems (RPSs) is the most important requirement for the obtainment of permission for operation and export from government authorities, which is why it should be managed with well-experienced software development processes. The RPS software is typically modeled with function block diagrams (FBDs) in the design phase, and then mechanically translated into C programs in the implementation phase, which is finally compiled into executable machine codes and loaded on RPS hardware - PLC (Programmable Logic Controller). Whereas C Compilers are fully-verified COTS (Commercial Off-The-Shelf) software, translators from FBDs to C programs are provided by PLC vendors. Long-term experience, experiments and simulations have validated their correctness and function safety. This paper proposes a behavior-preserving translation from FBD design to C implementation for RPS software. It includes two sets of translation algorithms and rules as well as a prototype translator. We used an example of RPS software in a Korean nuclear power plant to demonstrate the correctness and effectiveness of the proposed translation.

  8. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: The esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After loading records into the software, the operator should mark the points and lines displayed in the system's guide and also define the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress.

  9. A Reference Architecture for Distributed Software Deployment

    NARCIS (Netherlands)

    Van der Burg, S.

    2013-01-01

    Nowadays, software systems are bigger and more complicated than people may think. Apart from the fact that a system has to be correctly constructed and should meet the client's wishes, it also has to be made ready for use by end-users or in an isolated test environment. This process is known as

  10. Software Support of Modelling using Ergonomic Tools in Engineering

    Directory of Open Access Journals (Sweden)

    Darina Dupláková

    2017-08-01

    Full Text Available One of the preconditions for the correct development of industrial production is the continuous interconnection of virtual reality and the real world by computer software. Computer software is used for product modelling, creation of technical documentation, scheduling, management and optimization of manufacturing processes, and increasing the efficiency of human work in manufacturing plants. This article describes frequently used ergonomic software packages which help to improve human work by reducing error rates, risk factors of the working environment and workplace injuries, and by eliminating emerging occupational diseases. They fall in the field of micro-ergonomics and are applicable at the manufacturing level, with a flexible approach to solving the established problems.

  11. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.
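The abstract does not give the STDCM's equations, so as an illustration of residual-defect estimation here is a classical reliability-growth-style model (the Goel-Okumoto form): with `a` total expected defects and detection rate `b`, the defects remaining after testing effort `t` is a·exp(-b·t). The parameter values are illustrative and this is not the STDCM itself.

```python
import math

def residual_defects(a, b, t):
    """Expected defects remaining after testing effort t,
    under an exponential (Goel-Okumoto style) detection model:
    a = total expected defects, b = per-unit detection rate."""
    return a * math.exp(-b * t)

# E.g. 120 expected defects, 5% detected per test-day, 40 test-days.
remaining = residual_defects(a=120.0, b=0.05, t=40.0)
```

Models of this shape are fitted from the observed defect-discovery curve during testing; the residual estimate then feeds release decisions of the kind COQUALMO and the STDCM address.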

  12. SEffEst: Effort estimation in software projects using fuzzy logic and neural networks

    Directory of Open Access Journals (Sweden)

    Israel

    2012-08-01

    Full Text Available Academia and practitioners confirm that software project effort prediction is crucial for an accurate software project management. However, software development effort estimation is uncertain by nature. Literature has developed methods to improve estimation correctness, using artificial intelligence techniques in many cases. Following this path, this paper presents SEffEst, a framework based on fuzzy logic and neural networks designed to increase effort estimation accuracy on software development projects. Trained using ISBSG data, SEffEst presents remarkable results in terms of prediction accuracy.
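SEffEst's internals are not given in the abstract; as a minimal illustration of its fuzzy-logic ingredient, here is a triangular membership function turning a crisp input (say, a team-experience score) into degrees of membership in fuzzy sets, the typical first stage before a neural network or rule base consumes the fuzzified values. The set boundaries and the input scale are made up.

```python
def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set with feet a, c
    and peak b (a < b < c)."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)   # rising edge
    return (c - x) / (c - b)       # falling edge

experience = 6.0                               # crisp input on a 0-10 scale
low = triangular(experience, 0.0, 2.5, 5.0)    # "low experience" set
high = triangular(experience, 5.0, 7.5, 10.0)  # "high experience" set
```

Fuzzification lets inherently uncertain project attributes (experience, requirement volatility) enter an effort model as graded values rather than brittle categories, which is the motivation the abstract cites.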

  13. SWEPP gamma-ray spectrometer system software user's guide

    International Nuclear Information System (INIS)

    Femec, D.A.

    1994-08-01

    The SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurement and Development Unit of the Idaho National Engineering Laboratory to assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP). In addition to determining the concentrations of gamma-ray-emitting radionuclides, the software also calculates attenuation-corrected isotopic mass ratios of specific interest, and provides controls for SGRS hardware as required. This document serves as a user's guide for the data acquisition and analysis software associated with the SGRS system

  14. Improving software quality - The use of formal inspections at the Jet Propulsion Laboratory

    Science.gov (United States)

    Bush, Marilyn

    1990-01-01

    The introduction of software formal inspections (Fagan inspections) at JPL for finding and fixing defects early in the software development life cycle is reviewed. It is estimated that, by the year 2000, software will account for as much as 80 percent of the total effort on some projects. Software problems are especially important at NASA, as critical flight software must be error-free. It is shown that formal inspections are particularly effective at finding and removing defects having to do with clarity, correctness, consistency, and completeness. A very significant discovery was that code audits were not as effective at finding defects as code inspections.

  15. Command and Control Software Development Memory Management

    Science.gov (United States)

    Joseph, Austin Pope

    2017-01-01

    This internship was initially meant to cover the implementation of unit test automation for a NASA ground control project. As is often the case with large development projects, the scope and breadth of the internship changed. Instead, the internship focused on finding and correcting memory leaks and errors as reported by a COTS software product meant to track such issues. Memory leaks come in many different flavors and some of them are more benign than others. On the extreme end a program might be dynamically allocating memory and not correctly deallocating it when it is no longer in use. This is called a direct memory leak and in the worst case can use all the available memory and crash the program. If the leaks are small they may simply slow the program down which, in a safety critical system (a system for which a failure or design error can cause a risk to human life), is still unacceptable. The ground control system is managed in smaller sub-teams, referred to as CSCIs. The CSCI that this internship focused on is responsible for monitoring the health and status of the system. This team's software had several methods/modules that were leaking significant amounts of memory. Since most of the code in this system is safety-critical, correcting memory leaks is a necessity.
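
    The COTS leak tracker used on the project is not named, but the workflow it supports — take a snapshot, diff, and locate the allocation site of growing memory — can be sketched with Python's built-in tracemalloc module (the cache and message handler below are hypothetical):

```python
import tracemalloc

_cache = []  # hypothetical module-level cache that is never cleared

def handle_status_message(msg: bytes):
    # Bug: every message is retained forever -- unbounded growth, the
    # Python analogue of a direct leak (allocated, never released).
    _cache.append(bytearray(msg) * 100)

tracemalloc.start()
before = tracemalloc.take_snapshot()
for i in range(1000):
    handle_status_message(b"health-status-%d" % i)
after = tracemalloc.take_snapshot()

# Diff the snapshots to find where the retained memory was allocated.
top = after.compare_to(before, "lineno")[0]
print(top)   # the biggest growth is the bytearray allocation above
print(f"retained ~{top.size_diff // 1024} KiB in {top.count_diff} new blocks")
# Fix: bound the cache (e.g. collections.deque(maxlen=...)) so entries
# are released once they are no longer needed.
```

    In a C/C++ ground system the same triage would be done with the leak tracker's allocation-site reports, but the principle — attribute growth to a code location, then bound or free it — is identical.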

  16. Control rod position fault diagnosis and its software realization of pressurized water reactor

    International Nuclear Information System (INIS)

    Chang Zhengke; Shao Dinghong

    2004-11-01

    PLC software is adopted in the Rod Position Monitoring System of QS2NPS. With this software, the position of the control rods can be monitored in real time, abnormal phenomena can be identified immediately, and the correctness and timeliness of fault diagnosis are improved remarkably. The identification and recording of rod position faults and the performance validation of the measurement channels are also realized. The function and effect of this software are introduced. (authors)

  17. Color reproduction software for a digital still camera

    Science.gov (United States)

    Lee, Bong S.; Park, Du-Sik; Nam, Byung D.

    1998-04-01

    We have developed color reproduction software for a digital still camera. The image taken by the camera was colorimetrically reproduced on the monitor after characterizing the camera and the monitor and color matching between the two devices. The reproduction was performed at three levels: level processing, gamma correction, and color transformation. The image contrast was increased by the level processing, which adjusts the levels of the dark and bright portions of the image. The relationship between the level-processed digital values and the measured luminance values of test gray samples was calculated, and the gamma of the camera was obtained. A method for obtaining the unknown monitor gamma was also proposed. As a result, the level-processed values were adjusted by a look-up table created from the camera and monitor gamma corrections. For the camera's color transformation, a 3 by 3 or 3 by 4 matrix was used, calculated by regression between the gamma-corrected values and the measured tristimulus values of each test color sample. The various reproduced images, generated according to four illuminations for the camera and three color temperatures for the monitor, were displayed in a dialogue box implemented in our software. A user can easily choose the best reproduced image by comparing them.
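
    The gamma values and chart data below are hypothetical, but they illustrate the two computational steps the abstract names: a gamma-correction look-up table, and a 3 by 3 color transformation matrix fitted by least-squares regression:

```python
import numpy as np

# Hypothetical gammas; the paper derives these from measured gray samples.
camera_gamma, monitor_gamma = 2.2, 2.4

# Gamma-correction look-up table: linearize camera values, re-encode for
# the monitor, one entry per 8-bit code value.
levels = np.arange(256) / 255.0
lut = np.clip(levels ** (camera_gamma / monitor_gamma) * 255, 0, 255).astype(np.uint8)

# 3x3 color transformation matrix fitted by least squares, mapping camera
# RGB to measured tristimulus values (XYZ). Data are synthetic: a 24-patch
# chart and an illustrative sRGB-like "true" matrix plus measurement noise.
rng = np.random.default_rng(1)
cam_rgb = rng.uniform(0, 1, (24, 3))
true_M = np.array([[0.41, 0.36, 0.18],
                   [0.21, 0.72, 0.07],
                   [0.02, 0.12, 0.95]])
xyz = cam_rgb @ true_M.T + rng.normal(0, 0.005, (24, 3))

M, *_ = np.linalg.lstsq(cam_rgb, xyz, rcond=None)  # solves cam_rgb @ M ~ xyz
print(np.round(M.T, 2))                            # recovers true_M closely
```

    A 3 by 4 matrix, as also mentioned in the abstract, simply adds a constant offset column to the regression.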

  18. Software engineering and Ada (Trademark) training: An implementation model for NASA

    Science.gov (United States)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  19. LHC Orbit Correction Reproducibility and Related Machine Protection

    CERN Document Server

    Baer, T; Schmidt, R; Wenninger, J

    2012-01-01

    The Large Hadron Collider (LHC) has an unprecedented nominal stored beam energy of up to 362 MJ per beam. In order to ensure an adequate machine protection by the collimation system, a high reproducibility of the beam position at collimators and special elements like the final focus quadrupoles is essential. This is realized by a combination of manual orbit corrections, feed forward and real time feedback. In order to protect the LHC against inconsistent orbit corrections, which could put the machine in a vulnerable state, a novel software-based interlock system for orbit corrector currents was developed. In this paper, the principle of the new interlock system is described and the reproducibility of the LHC orbit correction is discussed against the background of this system.

  20. Correction of the wavefront using the irradiance transport equation

    Science.gov (United States)

    García, M.; Granados, F.; Cornejo, A.

    2008-07-01

    The correction of the wavefront in optical systems implies the use of wavefront sensors, software, and auxiliary optical systems. We propose evaluating the wavefront using the fact that the wavefront and its intensity are related by a mathematical expression, the irradiance transport equation (ITE).
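
    For reference, the ITE is commonly written in its transport-of-intensity form (the abstract does not fix a notation; the one below is assumed), relating the axial derivative of the irradiance I to the wavefront phase \phi:

```latex
% Transport-of-intensity form of the ITE: k is the wavenumber 2\pi/\lambda,
% \nabla_\perp the gradient transverse to the propagation axis z.
k \,\frac{\partial I(x,y;z)}{\partial z}
  = -\,\nabla_{\perp} \cdot \left( I(x,y;z)\, \nabla_{\perp}\,\phi(x,y;z) \right)
```

    In the standard approach, the left-hand side is estimated from two slightly defocused intensity measurements and the equation is solved numerically for \phi, which is what lets intensity data stand in for a dedicated wavefront sensor.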

  1. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Full Text Available Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.
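
    PhysioNoise's actual algorithms are not reproduced here; the sketch below only shows the kind of automated processing involved — detecting beats in a synthetic cardiac trace so their timing can later be turned into noise-correction covariates (signal shape, rates, and thresholds are all invented):

```python
import numpy as np

# Synthetic "cardiac" waveform: a 60 bpm pulse train sampled at 100 Hz plus
# noise, standing in for a pulse trace recorded during an fMRI session.
fs, dur, hr_hz = 100, 30, 1.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(2)
signal = np.clip(np.cos(2 * np.pi * hr_hz * t), 0, None) ** 20 \
         + rng.normal(0, 0.05, t.size)

# Simple automated peak picking: local maxima above a threshold, with a
# refractory period so one beat is never counted twice.
def find_peaks(x, thresh=0.5, refractory=30):
    peaks, last = [], -refractory
    for i in range(1, len(x) - 1):
        if x[i] > thresh and x[i] >= x[i - 1] and x[i] > x[i + 1] \
                and i - last >= refractory:
            peaks.append(i); last = i
    return np.array(peaks)

peaks = find_peaks(signal)
rate_bpm = 60 * fs / np.median(np.diff(peaks))
print(f"{len(peaks)} beats detected, ~{rate_bpm:.0f} bpm")
# The beat times can then be converted into cardiac-phase covariates
# (RETROICOR-style) and regressed out of the fMRI time series.
```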

  2. Model reliability and software quality assurance in simulation of nuclear fuel waste management systems

    International Nuclear Information System (INIS)

    Oeren, T.I.; Elzas, M.S.; Sheng, G.; Wageningen Agricultural Univ., Netherlands; McMaster Univ., Hamilton, Ontario)

    1985-01-01

    As is the case with all scientific simulation studies, computerized simulation of nuclear fuel waste management systems can introduce and hide various types of errors. Frameworks to clarify issues of model reliability and software quality assurance are offered. Potential problems in the main areas of concern for reliability and quality are discussed; e.g., experimental issues, decomposition, scope, fidelity, verification, requirements, testing, correctness, and robustness are treated with reference to experience gained in the past. A list comprising over 80 of the most common computerization errors is provided. Software tools and techniques used to detect and correct computerization errors are discussed.

  3. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this: stability and predictability are of paramount importance, and, in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  4. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this: stability and predictability are of paramount importance, and, in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we concluded by suggesting directions for further studies.

  5. Motion detection and correction for dynamic 15O-water myocardial perfusion PET studies

    International Nuclear Information System (INIS)

    Naum, Alexandru; Laaksonen, Marko S.; Oikonen, Vesa; Teraes, Mika; Jaervisalo, Mikko J.; Knuuti, Juhani; Tuunanen, Helena; Nuutila, Pirjo; Kemppainen, Jukka

    2005-01-01

    Patient motion during dynamic PET studies is a well-documented source of errors. The purpose of this study was to investigate the incidence of frame-to-frame motion in dynamic 15 O-water myocardial perfusion PET studies, to test the efficacy of motion correction methods and to study whether implementation of motion correction would have an impact on the perfusion results. We developed a motion detection procedure using external radioactive skin markers and frame-to-frame alignment. To evaluate motion, marker coordinates inside the field of view were determined in each frame of each study. The largest set of frames with identical spatial coordinates during the study was defined as ''non-moved''. Movement was considered present if even one marker changed position by one pixel/frame, compared with the reference, in one axis; such frames were defined as ''moved''. We tested manual, in-house-developed motion correction software and an automatic motion correction using a rigid body point model implemented in MIPAV (Medical Image Processing, Analysis and Visualisation) software. After motion correction, the remaining motion was re-analysed. Myocardial blood flow (MBF) values were calculated for both non-corrected and motion-corrected datasets. At rest, patient motion was found in 18% of the frames, but during pharmacological stress the fraction increased to 45%, and during physical exercise it rose to 80%. Both motion correction algorithms significantly decreased (p<0.006) the number of moved frames and the amplitude of motion (p<0.04). Motion correction significantly increased MBF results during bicycle exercise (p<0.02). At rest or during adenosine infusion, motion correction had no significant effects on MBF values. Significant motion is a common phenomenon in dynamic cardiac studies during adenosine infusion, but especially during exercise. Applying motion correction to the data acquired during exercise clearly changed the MBF results, indicating that motion correction should be applied in such studies.

  6. Accurate and fiducial-marker-free correction for three-dimensional chromatic shift in biological fluorescence microscopy.

    Science.gov (United States)

    Matsuda, Atsushi; Schermelleh, Lothar; Hirano, Yasuhiro; Haraguchi, Tokuko; Hiraoka, Yasushi

    2018-05-15

    Correction of chromatic shift is necessary for precise registration of multicolor fluorescence images of biological specimens. New emerging technologies in fluorescence microscopy with increasing spatial resolution and penetration depth have prompted the need for more accurate methods to correct chromatic aberration. However, the amount of chromatic shift of the region of interest in biological samples often deviates from the theoretical prediction because of unknown dispersion in the biological samples. To measure and correct chromatic shift in biological samples, we developed a quadrisection phase correlation approach to computationally calculate translation, rotation, and magnification from reference images. Furthermore, to account for local chromatic shifts, images are split into smaller elements, for which the phase correlation between channels is measured individually and corrected accordingly. We implemented this method in an easy-to-use open-source software package, called Chromagnon, that is able to correct shifts with a 3D accuracy of approximately 15 nm. Applying this software, we quantified the level of uncertainty in chromatic shift correction, depending on the imaging modality used, and for different existing calibration methods, along with the proposed one. Finally, we provide guidelines to choose the optimal chromatic shift registration method for any given situation.
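
    Chromagnon itself additionally handles rotation, magnification, and local shifts, but the phase-correlation step such registration builds on can be sketched as follows (synthetic images, integer translation only):

```python
import numpy as np

# Phase correlation between two channels of a multicolor image: the
# normalized cross-power spectrum transforms back to a correlation surface
# whose peak sits at the translation between the channels.
def phase_correlation_shift(a, b):
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    r = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(r), r.shape)
    # Wrap indices so shifts can be negative.
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(3)
ref = rng.random((128, 128))                      # "red" channel
shifted = np.roll(ref, (5, -3), axis=(0, 1))      # "green" channel, shifted
print(phase_correlation_shift(shifted, ref))      # -> (5, -3)
```

    Splitting the images into quadrants and repeating this measurement per quadrant, as the abstract's quadrisection approach does, is what lets rotation and magnification be separated from pure translation.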

  7. True coincidence summing correction and mathematical efficiency modeling of a well detector

    Energy Technology Data Exchange (ETDEWEB)

    Jäderström, H., E-mail: henrik.jaderstrom@canberra.com [CANBERRA Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States); Mueller, W.F. [CANBERRA Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States); Atrashkevich, V. [Stroitely St 4-4-52, Moscow (Russian Federation); Adekola, A.S. [CANBERRA Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States)

    2015-06-01

    True coincidence summing (TCS) occurs when two or more photons are emitted from the same decay of a radioactive nuclide and are detected within the resolving time of the gamma-ray detector. TCS changes the net peak areas of the affected full-energy peaks in the spectrum, and the nuclide activity is rendered inaccurate if no correction is performed. TCS is independent of the count rate, but it is strongly dependent on the peak and total efficiency, as well as on the characteristics of a given nuclear decay. The TCS effects are very prominent for well detectors because of the high efficiencies, which makes accounting for TCS a necessity. For CANBERRA's recently released Small Anode Germanium (SAGe) well detector, an extension to CANBERRA's mathematical efficiency calibration method (In Situ Object Calibration Software, ISOCS, and Laboratory SOurceless Calibration Software, LabSOCS) has been developed that allows calculation of peak and total efficiencies for SAGe well detectors. The extension also makes it possible to calculate TCS corrections for well detectors using the standard algorithm provided with CANBERRA's spectroscopy software Genie 2000. The peak and total efficiencies from ISOCS/LabSOCS have been compared to MCNP, with agreement within 3% for peak efficiencies and 10% for total efficiencies for energies above 30 keV. A sample containing Ra-226 daughters has been measured within the well and analyzed with and without TCS correction; applying the correction factor shows a significant improvement in the activity determination over the energy range 46–2447 keV. The implementation of ISOCS/LabSOCS for well detectors offers a powerful tool for efficiency calibration of these detectors. The automated algorithm to correct for TCS effects in well detectors makes nuclide-specific calibration unnecessary and offers flexibility in carrying out gamma spectral analysis.

  8. Conceptual study of calibration software for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Yasu, Kan-ichi; Watanabe, Yuichi; Matsuda, Yuji; Kawai, Akio; Tamura, Toshiyuki; Shimizu, Hidehiko.

    1996-01-01

    Demonstration experiments for a large scale input accountancy tank are going to be under way by the Nuclear Material Control Center. Development of calibration software for an accountancy system with a dip-tube manometer is an important task in the experiments. A conceptual study of the software has been carried out to construct a high precision accountancy system; the study was based on ANSI N15.19-1989. Items of the study are the overall configuration, the correction method for the influence of bubble formation, the function model of calibration, and the fitting method for the calibration curve. The following remarks are the results of this study. 1) The overall configuration of the software was constructed. 2) It was shown by numerical solution that the influence of bubble formation can be corrected using the period of the pressure wave. 3) Two function models of calibration, for well capacity and for inner structure volume, were prepared from the tank design, and good fitness of the model for net capacity (the balance of both models) was confirmed by fitting to the designed shape of the tank. 4) The necessity of further consideration of the both-variables-in-error model and the cumulative-error model was recognized. We are going to develop practical software on the basis of these results, and to verify it in the demonstration experiments. (author)

  9. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    Energy Technology Data Exchange (ETDEWEB)

    Ashraf, H.; Bach, K.S.; Hansen, H. [Copenhagen University, Department of Radiology, Gentofte Hospital, Hellerup (Denmark); Hoop, B. de [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Shaker, S.B.; Dirksen, A. [Copenhagen University, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark); Prokop, M. [University Medical Centre Utrecht, Department of Radiology, Utrecht (Netherlands); Radboud University Nijmegen, Department of Radiology, Nijmegen (Netherlands); Pedersen, J.H. [Copenhagen University, Department of Cardiothoracic Surgery RT, Rigshospitalet, Copenhagen (Denmark)

    2010-08-15

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  10. Lung nodule volumetry: segmentation algorithms within the same software package cannot be used interchangeably

    International Nuclear Information System (INIS)

    Ashraf, H.; Bach, K.S.; Hansen, H.; Hoop, B. de; Shaker, S.B.; Dirksen, A.; Prokop, M.; Pedersen, J.H.

    2010-01-01

    We examined the reproducibility of lung nodule volumetry software that offers three different volumetry algorithms. In a lung cancer screening trial, 188 baseline nodules >5 mm were identified. Including follow-ups, these nodules formed a study-set of 545 nodules. Nodules were independently double read by two readers using commercially available volumetry software. The software offers readers three different analysing algorithms. We compared the inter-observer variability of nodule volumetry when the readers used the same and different algorithms. Both readers were able to correctly segment and measure 72% of nodules. In 80% of these cases, the readers chose the same algorithm. When readers used the same algorithm, exactly the same volume was measured in 50% of readings and a difference of >25% was observed in 4%. When the readers used different algorithms, 83% of measurements showed a difference of >25%. Modern volumetric software failed to correctly segment a high number of screen detected nodules. While choosing a different algorithm can yield better segmentation of a lung nodule, reproducibility of volumetric measurements deteriorates substantially when different algorithms were used. It is crucial even in the same software package to choose identical parameters for follow-up. (orig.)

  11. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.
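
    The Map-Reduce structure and JSON data format the abstract mentions can be illustrated with a toy example (the spill fields below are invented; real MAUS mappers and reducers are far richer, and the map step is what gets parallelized):

```python
import json
from functools import reduce

# Toy map-reduce over "spills" of detector data, each serialized as JSON.
def mapper(spill_json):
    spill = json.loads(spill_json)                 # JSON as the data structure
    spill["n_hits"] = len(spill.get("hits", []))   # per-spill transformation
    return spill

def reducer(summary, spill):
    summary["total_hits"] += spill["n_hits"]       # aggregation across spills
    summary["n_spills"] += 1
    return summary

spills = [json.dumps({"hits": [1] * k}) for k in (3, 0, 5)]
mapped = map(mapper, spills)                       # independent per spill
summary = reduce(reducer, mapped, {"total_hits": 0, "n_spills": 0})
print(summary)   # {'total_hits': 8, 'n_spills': 3}
```

    Because each mapper call touches only its own spill, the map stage can be farmed out to worker processes in the control room or on a batch farm, while the reduce stage serializes the aggregation.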

  12. Jagiellonian University Development of the LHCb VELO monitoring software platform

    CERN Document Server

    Majewski, Maciej

    2017-01-01

    One of the most important parts of the LHCb spectrometer is the VErtex LOcator (VELO), dedicated to the precise tracking close to the proton–proton interaction point. The quality of data produced by the VELO depends on the calibration process, which must be monitored to ensure its correctness. This work presents details on how the calibration monitoring is conducted and how it could be improved. It also includes information on monitoring software and data flow in the LHCb software framework.

  13. Software Design of Mobile Antenna for Auto Satellite Tracking Using Modem Correction and Elevation Azimuth Method

    Directory of Open Access Journals (Sweden)

    Djamhari Sirat

    2010-10-01

    Full Text Available Pointing accuracy is important in satellite communication. Because the distance from the satellite to the earth's surface is so large, a pointing error of just 1 degree will prevent the antenna from sending data to the satellite. To overcome this, an auto-tracking satellite controller was built. This system uses a microcontroller as the controller, with a GPS receiver to indicate the antenna's location, a digital compass for the initial antenna pointing direction, rotary encoders as azimuth and elevation sensors, and a modem to observe the Eb/No signal. The microcontroller uses serial communication to read the inputs, so the programming focuses on the UART and the serial communication software. The controller uses two phases in the process of tracking satellites. The first phase is the elevation-azimuth method: using the input from the GPS and digital compass together with the satellite position (coordinates and height) stored in the microcontroller, the controller calculates the elevation and azimuth angles and moves the antenna accordingly. The second phase is modem correction: the controller uses only the modem as input, and the antenna movement is adjusted to obtain the largest Eb/No value. In operation, the controller improved the input level from -81.7 dB to -30.2 dB, with a final Eb/No value reaching 5.7 dB.
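
    The elevation-azimuth phase computes pointing angles from the station and satellite positions. A hedged sketch of that geometry (spherical Earth, geostationary satellite, example coordinates invented; the paper's own algorithm and constants are not given in the abstract):

```python
import math

RE, RGEO = 6378.137, 42164.0   # km: Earth radius, geostationary orbit radius

def look_angles(gs_lat, gs_lon, sat_lon):
    """Azimuth/elevation (degrees) from a ground station at (gs_lat, gs_lon)
    to a geostationary satellite at longitude sat_lon, spherical Earth."""
    lat, lon, slon = map(math.radians, (gs_lat, gs_lon, sat_lon))
    # ECEF positions of the station and the satellite (over the equator).
    gs = (RE * math.cos(lat) * math.cos(lon),
          RE * math.cos(lat) * math.sin(lon),
          RE * math.sin(lat))
    sat = (RGEO * math.cos(slon), RGEO * math.sin(slon), 0.0)
    d = [s - g for s, g in zip(sat, gs)]
    # Rotate the line-of-sight vector into local east/north/up coordinates.
    east = -math.sin(lon) * d[0] + math.cos(lon) * d[1]
    north = (-math.sin(lat) * math.cos(lon) * d[0]
             - math.sin(lat) * math.sin(lon) * d[1] + math.cos(lat) * d[2])
    up = (math.cos(lat) * math.cos(lon) * d[0]
          + math.cos(lat) * math.sin(lon) * d[1] + math.sin(lat) * d[2])
    az = math.degrees(math.atan2(east, north)) % 360
    el = math.degrees(math.asin(up / math.dist(sat, gs)))
    return az, el

# Example (coordinates invented): a station near Depok, Indonesia
# (6.4 S, 106.8 E) pointing at a satellite parked at 108 E.
az, el = look_angles(-6.4, 106.8, 108.0)
print(f"azimuth {az:.1f} deg, elevation {el:.1f} deg")
```

    In the described system the GPS supplies gs_lat/gs_lon and the digital compass supplies the reference for the azimuth axis before the modem-correction phase refines the pointing.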

  14. Quality assurance of nuclear medicine computer software

    International Nuclear Information System (INIS)

    Cradduck, T.D.

    1986-01-01

    Although quality assurance activities have become well established for the hardware found in nuclear medicine, little attention has been paid to computer software. This paper outlines some of the problems that exist and indicates some of the solutions presently under development. The major thrust has been towards the establishment of programming standards and comprehensive documentation. Some manufacturers have developed installation verification procedures, which programmers are urged to use as models for their own programs. Items that tend to cause erroneous results are discussed, with the emphasis for error detection and correction being placed on proper education and training of the computer operator. The concept of interchangeable data files or 'software phantoms' for purposes of quality assurance is discussed. (Author)

  15. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distribution for a software component's failure probability influence the number of tests required to obtain adequate confidence in the software component. In the evaluation, both the effect of the shape of the prior distribution and one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas are given on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components. One of the main challenges when assessing such systems is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components and pre-developed software components (e.g. COTS, SOUP, etc.). (Author)

  16. SOFTWARE OPEN SOURCE, SOFTWARE GRATIS?

    Directory of Open Access Journals (Sweden)

    Nur Aini Rakhmawati

    2006-01-01

    Full Text Available The enactment of the Intellectual Property Rights Law (HAKI) has given rise to a new alternative: using open source software. The use of open source software is spreading along with current global issues in Information and Communication Technology (ICT). Several organizations and companies have begun to take open source software into consideration. There are many conceptions of what open source software is, ranging from software that is free of charge to software that is unlicensed. Not all of these notions are correct, so it is necessary to introduce the concept of open source software, covering its history, its licenses and how to choose a license, as well as the considerations in choosing among the available open source software. Keywords: license, open source, HAKI

  17. Characterization of Morphology using MAMA Software

    Energy Technology Data Exchange (ETDEWEB)

    Gravelle, Julie [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-02

    The MAMA (Morphological Analysis for Material Attribution) software was developed at Los Alamos National Laboratory, funded through the National Technical Nuclear Forensics Center in the Department of Homeland Security. The software allows images to be analysed and quantified. The largest project I worked on was to quantify images of plutonium oxides and ammonium diuranates prepared by the group and to provide analyses of the particles in each sample. Images were quantified through MAMA, together with a color analysis, a lexicon description and powder x-ray diffraction. Through this we were able to see a visual difference between some of the syntheses. An additional project was to revise the manual for MAMA to help streamline training and provide useful tips so that users can more quickly become acclimated to the software. The third project investigated expanding the scope of MAMA and finding a statistically relevant baseline for the particulates through the analysis of maps in the software, using known measurements to compare the error associated with the software. During this internship, I worked on several different projects dealing with the MAMA software. The revision of the user manual for the MAMA software was the first project I was able to work and collaborate on. I first learned how to use the software through instruction from a skilled user at the laboratory, Dan Schwartz, and by using the existing user manual and examples. After becoming accustomed to the program, I went over the manual to correct and change items that were not as useful or descriptive as they could have been. I also added tips that I learned as I explored the software. The updated manual was also worked on by several others who have been developing the program. The goal of these revisions was to ensure the most concise and simple directions to the software were available to future users. By incorporating tricks and shortcuts that I discovered and picked up

  18. CSE database: extended annotations and new recommendations for ECG software testing.

    Science.gov (United States)

    Smíšek, Radovan; Maršánová, Lucie; Němcová, Andrea; Vítek, Martin; Kozumplík, Jiří; Nováková, Marie

    2017-08-01

    Nowadays, cardiovascular diseases represent the most common cause of death in western countries. Among various examination techniques, electrocardiography (ECG) is still a highly valuable tool used for the diagnosis of many cardiovascular disorders. In order to diagnose a person based on ECG, cardiologists can use automatic diagnostic algorithms. Research in this area is still necessary. In order to compare various algorithms correctly, it is necessary to test them on standard annotated databases, such as the Common Standards for Quantitative Electrocardiography (CSE) database. According to Scopus, the CSE database is the second most cited standard database. There were two main objectives in this work. First, new diagnoses were added to the CSE database, which extended its original annotations. Second, new recommendations for diagnostic software quality estimation were established. The ECG recordings were diagnosed by five new cardiologists independently, and in total, 59 different diagnoses were found. Such a large number of diagnoses is unique, even in terms of standard databases. Based on the cardiologists' diagnoses, a four-round consensus (4R consensus) was established. Such a 4R consensus means a correct final diagnosis, which should ideally be the output of any tested classification software. The accuracy of the cardiologists' diagnoses compared with the 4R consensus was the basis for the establishment of accuracy recommendations. The accuracy was determined in terms of sensitivity = 79.20-86.81%, positive predictive value = 79.10-87.11%, and the Jaccard coefficient = 72.21-81.14%, respectively. Within these ranges, the accuracy of the software is comparable with the accuracy of cardiologists. The accuracy quantification of the correct classification is unique. Diagnostic software developers can objectively evaluate the success of their algorithm and promote its further development. The annotations and recommendations proposed in this work will allow

  19. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    Science.gov (United States)

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    The study aimed to compare two different monitor unit (MU) or dose verification software packages in volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS-4D phantom and measured using the 729 detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre in a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet based MUVC program and Diamond SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93 ± 1.59)% and (1.37 ± 2.72)% for the in-house Excel spreadsheet based MUVC program and Diamond SCS, respectively. For 26 clinically approved VMAT plans with the isocentre in a region below -350 HU, both the in-house spreadsheet based MUVC program and Diamond SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet based MUVC program and Diamond SCS can be used as a simple and fast complement to measurement-based verification for plans with the isocentre in a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
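
    The plan-by-plan comparison reduces to a signed percentage deviation between the independent check and the TPS value; a minimal sketch (the 3% action level shown is an assumed illustrative tolerance, not taken from the paper):

```python
def percent_deviation(mu_tps, mu_check):
    # Signed percentage deviation of the independent check relative to
    # the TPS-calculated monitor units (sign convention is an assumption).
    return 100.0 * (mu_check - mu_tps) / mu_tps

def within_tolerance(mu_tps, mu_check, tol_percent=3.0):
    # Flag a plan against a patient-specific QA action level (value assumed).
    return abs(percent_deviation(mu_tps, mu_check)) <= tol_percent
```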

  20. Methods of Software Verification

    Directory of Open Access Journals (Sweden)

    R. E. Gurin

    2015-01-01

    Full Text Available This article is devoted to the problem of software (SW) verification. Methods of software verification are designed to check the software for compliance with the stated requirements, such as correctness, system security, adaptability to small changes in the environment, portability, compatibility, etc. The methods vary both in how they operate and in how they achieve their results. The article describes static and dynamic methods of software verification and pays attention to the method of symbolic execution. In the review of static analysis, the deductive method and model checking methods are discussed and described, and the pros and cons of each particular method are emphasized. The article considers a classification of test techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained along different execution paths or while working with multiple object values. Dependences connect various types of software objects: single variables, the elements of composite variables (structure fields, array elements), the size of heap areas, the length of strings, and the number of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of the inference tools. Methods of dynamic analysis such as testing, monitoring and profiling are presented and analyzed, and some kinds of tools are considered which can be applied to the software when using the methods of dynamic analysis.
    Based on this work a conclusion is drawn, which describes the most relevant problems of the analysis techniques, methods of their solution and

  1. ATLAS software stack on ARM64

    CERN Document Server

    Smith, Joshua Wyatt; The ATLAS collaboration

    2016-01-01

    The ATLAS experiment explores new hardware and software platforms that, in the future, may be more suited to its data intensive workloads. One such alternative hardware platform is the ARM architecture, which is designed to be extremely power efficient and is found in most smartphones and tablets. CERN openlab recently installed a small cluster of ARM 64-bit evaluation prototype servers. Each server is based on a single-socket ARM 64-bit system on a chip, with 32 Cortex-A57 cores. In total, each server has 128 GB RAM connected with four fast memory channels. This paper reports on the port of the ATLAS software stack onto these new prototype ARM64 servers. This included building the "external" packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adj...

  2. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    Science.gov (United States)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.

  3. Review of Bruce A reactor regulating system software

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-12-01

    Each of the four reactor units at the Ontario Hydro Bruce A Nuclear Generating Station is controlled by the Reactor Regulating System (RRS) software running on digital computers. This research report presents an assessment of the quality and reliability of the RRS software based on a review of the RRS design documentation, an analysis of certain significant Event Reports (SERs), and an examination of selected software changes. We found that the RRS software requirements (i.e., what the software should do) were never clearly documented, and that design documents, which should describe how the requirements are implemented, are incomplete and inaccurate. Some RRS-related SERs (i.e., reports on unexpected incidents relating to the reactor control) implied that there were faults in the RRS, or that RRS changes should be made to help prevent certain unexpected events. The follow-up investigations were generally poorly documented, and so it could not usually be determined that problems were properly resolved. The Bruce A software change control procedures require improvement. For the software changes examined, there was insufficient evidence provided by Ontario Hydro that the required procedures regarding change approval, independent review, documentation updates, and testing were followed. Ontario Hydro relies on the expertise of their technical staff to modify the RRS software correctly; they have confidence in the software code itself, even if the documentation is not up-to-date. Ontario Hydro did not produce the documentation required for an independent formal assessment of the reliability of the RRS. (author). 37 refs., 3 figs.

  4. Review of Bruce A reactor regulating system software

    International Nuclear Information System (INIS)

    1995-12-01

    Each of the four reactor units at the Ontario Hydro Bruce A Nuclear Generating Station is controlled by the Reactor Regulating System (RRS) software running on digital computers. This research report presents an assessment of the quality and reliability of the RRS software based on a review of the RRS design documentation, an analysis of certain significant Event Reports (SERs), and an examination of selected software changes. We found that the RRS software requirements (i.e., what the software should do) were never clearly documented, and that design documents, which should describe how the requirements are implemented, are incomplete and inaccurate. Some RRS-related SERs (i.e., reports on unexpected incidents relating to the reactor control) implied that there were faults in the RRS, or that RRS changes should be made to help prevent certain unexpected events. The follow-up investigations were generally poorly documented, and so it could not usually be determined that problems were properly resolved. The Bruce A software change control procedures require improvement. For the software changes examined, there was insufficient evidence provided by Ontario Hydro that the required procedures regarding change approval, independent review, documentation updates, and testing were followed. Ontario Hydro relies on the expertise of their technical staff to modify the RRS software correctly; they have confidence in the software code itself, even if the documentation is not up-to-date. Ontario Hydro did not produce the documentation required for an independent formal assessment of the reliability of the RRS. (author). 37 refs., 3 figs

  5. Repeat-aware modeling and correction of short read errors.

    Science.gov (United States)

    Yang, Xiao; Aluru, Srinivas; Dorman, Karin S

    2011-02-15

    High-throughput short read sequencing is revolutionizing genomics and systems biology research by enabling cost-effective deep coverage sequencing of genomes and transcriptomes. Error detection and correction are crucial to many short read sequencing applications including de novo genome sequencing, genome resequencing, and digital gene expression analysis. Short read error detection is typically carried out by counting the observed frequencies of kmers in reads and validating those with frequencies exceeding a threshold. In case of genomes with high repeat content, an erroneous kmer may be frequently observed if it has few nucleotide differences with valid kmers with multiple occurrences in the genome. Error detection and correction were mostly applied to genomes with low repeat content and this remains a challenging problem for genomes with high repeat content. We develop a statistical model and a computational method for error detection and correction in the presence of genomic repeats. We propose a method to infer genomic frequencies of kmers from their observed frequencies by analyzing the misread relationships among observed kmers. We also propose a method to estimate the threshold useful for validating kmers whose estimated genomic frequency exceeds the threshold. We demonstrate that superior error detection is achieved using these methods. Furthermore, we break away from the common assumption of uniformly distributed errors within a read, and provide a framework to model position-dependent error occurrence frequencies common to many short read platforms. Lastly, we achieve better error correction in genomes with high repeat content. The software is implemented in C++ and is freely available under the GNU GPL3 license and the Boost Software V1.0 license at "http://aluru-sun.ece.iastate.edu/doku.php?id=redeem".
    We introduce a statistical framework to model sequencing errors in next-generation reads, which led to promising results in detecting and correcting errors.
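
    The count-and-threshold baseline that the paper improves on can be sketched in a few lines (a simplified illustration; the paper's repeat-aware model of misread relationships among k-mers goes well beyond this):

```python
from collections import Counter

def kmer_counts(reads, k):
    # Count every length-k substring across all reads.
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

def flag_errors(reads, k, threshold):
    # k-mers observed fewer than `threshold` times are suspected
    # sequencing errors; in repeat-rich genomes this simple rule
    # misclassifies erroneous k-mers that resemble frequent valid ones.
    counts = kmer_counts(reads, k)
    return {kmer for kmer, c in counts.items() if c < threshold}
```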

  6. A new controller for the JET error field correction coils

    International Nuclear Information System (INIS)

    Zanotto, L.; Sartori, F.; Bigi, M.; Piccolo, F.; De Benedetti, M.

    2005-01-01

    This paper describes the hardware and the software structure of a new controller for the JET error field correction coils (EFCC) system, a set of ex-vessel coils that recently replaced the internal saddle coils. The EFCC controller has been developed on a conventional VME hardware platform using a new software framework, recently designed for real-time applications at JET, and replaces the old disruption feedback controller, increasing the flexibility and optimization of the system. The use of conventional hardware has required a particular effort in designing the software in order to meet the specifications. The peculiarities of the new controller are highlighted, such as its very useful trigger logic interface, which in principle allows various error field experiment scenarios to be explored.

  7. Formal methods in software development: A road less travelled

    Directory of Open Access Journals (Sweden)

    John A van der Poll

    2010-08-01

    Full Text Available An integration of traditional verification techniques and formal specifications in software engineering is presented. Advocates of such techniques claim that mathematical formalisms allow them to produce quality, verifiably correct, or at least highly dependable software, and that the testing and maintenance phases are shortened. Critics on the other hand maintain that software formalisms are hard to master, tedious to use and not well suited to the fast turnaround times demanded by industry. In this paper some popular formalisms and the advantages of using them during the early phases of the software development life cycle are presented. Employing the Floyd-Hoare verification principles during the formal specification phase facilitates reasoning about the properties of a specification. Some observations that may help to alleviate the formal-methods controversy are established, and a number of formal-methods successes are presented. Possible conditions for an increased acceptance of formalisms in software development are discussed.

  8. VAT-69, a software system for gamma spectroscopy

    International Nuclear Information System (INIS)

    Furr, A.K.; Roscoe, B.A.; Parkinson, T.F.

    1979-01-01

    The software system was originally developed solely for neutron activation analysis. Its usefulness has been enhanced by adding modules that allow processing of gamma spectra from natural radioisotopes and from fission products. It allows: (1) separation of overlapping peaks, allowing retrieval of a peak of interest in the presence of an interfering peak, (2) calibration of each gamma spectrum for energy and peak width, using criteria based on gamma peak data internal to the individual spectrum, (3) correction for errors due to rapidly changing dead times during the counting interval, permitting accurate count data for samples containing mixed short-, medium-, and long-lived isotopes. One disadvantage of the original software was that it produced more output information than desired. The modifications that have been implemented to produce final concentration values include: (1) computation of a weighted-average concentration of the ith element where two or more gamma peaks are available, (2) rejection of gamma peaks when the difference in energies of the located peak and library peak exceeds a preset value, (3) rejection of concentration values based on gamma peaks which do not satisfy preselected criteria for irradiation time and wait time, (4) computation of the error in concentration of the ith element, and (5) correction of sample concentration for trace elements in the irradiation vials. Overall performance of the software system is checked periodically by analyzing standards. Several thousand spectra are processed each year with VAT-69, with typically 25 to 40 elements quantitatively determined
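
    One reasonable reading of modification (1), computing a weighted-average concentration from two or more gamma peaks, is inverse-variance weighting (an assumption for illustration; VAT-69's exact weighting scheme is not given in the abstract):

```python
def weighted_concentration(values, errors):
    # Inverse-variance weighted average of concentrations determined
    # from several gamma peaks, with the propagated uncertainty of the
    # mean (covers modifications (1) and (4) described above).
    weights = [1.0 / e ** 2 for e in errors]
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    return mean, (1.0 / total) ** 0.5
```

A more precise peak (smaller error) dominates the average, which is the usual rationale for this weighting choice.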

  9. Patch Transporter: Incentivized, Decentralized Software Patch System for WSN and IoT Environments

    Science.gov (United States)

    Lee, JongHyup

    2018-01-01

    In the complicated settings of WSN (Wireless Sensor Networks) and IoT (Internet of Things) environments, keeping a number of heterogeneous devices updated is a challenging job, especially with respect to effectively discovering target devices and rapidly delivering the software updates. In this paper, we convert the traditional software update process to a distributed service. We set an incentive system for faithfully transporting the patches to the recipient devices. The incentive system motivates independent, self-interested transporters for helping the devices to be updated. To ensure the system correctly operates, we employ the blockchain system that enforces the commitment in a decentralized manner. We also present a detailed specification for the proposed protocol and validate it by model checking and simulations for correctness. PMID:29438337

  10. Patch Transporter: Incentivized, Decentralized Software Patch System for WSN and IoT Environments.

    Science.gov (United States)

    Lee, JongHyup

    2018-02-13

    In the complicated settings of WSN (Wireless Sensor Networks) and IoT (Internet of Things) environments, keeping a number of heterogeneous devices updated is a challenging job, especially with respect to effectively discovering target devices and rapidly delivering the software updates. In this paper, we convert the traditional software update process to a distributed service. We set an incentive system for faithfully transporting the patches to the recipient devices. The incentive system motivates independent, self-interested transporters for helping the devices to be updated. To ensure the system correctly operates, we employ the blockchain system that enforces the commitment in a decentralized manner. We also present a detailed specification for the proposed protocol and validate it by model checking and simulations for correctness.

  11. Patch Transporter: Incentivized, Decentralized Software Patch System for WSN and IoT Environments

    Directory of Open Access Journals (Sweden)

    JongHyup Lee

    2018-02-01

    Full Text Available In the complicated settings of WSN (Wireless Sensor Networks) and IoT (Internet of Things) environments, keeping a number of heterogeneous devices updated is a challenging job, especially with respect to effectively discovering target devices and rapidly delivering the software updates. In this paper, we convert the traditional software update process to a distributed service. We set an incentive system for faithfully transporting the patches to the recipient devices. The incentive system motivates independent, self-interested transporters for helping the devices to be updated. To ensure the system correctly operates, we employ the blockchain system that enforces the commitment in a decentralized manner. We also present a detailed specification for the proposed protocol and validate it by model checking and simulations for correctness.

  12. Empirically Determined Response Matrices for On-Line Orbit and Energy Correction at Jefferson Lab

    International Nuclear Information System (INIS)

    Leigh Harwood; Alicia Hofler; Michele Joyce; Valeri Lebedev; David Bryan

    2001-01-01

    Jefferson Lab uses feedback loops (less than 1 hertz update rate) to correct drifts in CEBAF's electron beam orbit and energy. Previous incarnations of these loops used response matrices that were computed by a numerical model of the machine. Jefferson Lab is transitioning this feedback system to use empirically determined response matrices whereby the software introduces small orbit or energy deviations using the loop's actuators and measures the system response with the loop's sensors. This method is in routine use for orbit correction. This paper will describe the orbit correction system and future plans to extend this method to energy correction
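
    The empirical measurement described above — kick each actuator slightly, record the sensor response, and assemble the response matrix column by column — can be sketched as follows (FakeMachine is a hypothetical linear stand-in for the real accelerator, not Jefferson Lab's software):

```python
class FakeMachine:
    # Hypothetical linear machine: sensor readings are M @ actuator settings.
    def __init__(self, M):
        self.M = M
        self.act = [0.0] * len(M[0])

    def apply(self, kick):
        self.act = [a + k for a, k in zip(self.act, kick)]

    def read(self):
        return [sum(m * a for m, a in zip(row, self.act)) for row in self.M]

def measure_response_matrix(machine, n_act, delta=1e-3):
    # Kick each actuator by `delta`, record the sensor response, undo the
    # kick, and build R[i][j] = d(sensor_i) / d(actuator_j).
    base = machine.read()
    n_sen = len(base)
    R = [[0.0] * n_act for _ in range(n_sen)]
    for j in range(n_act):
        kick = [delta if a == j else 0.0 for a in range(n_act)]
        machine.apply(kick)
        resp = machine.read()
        machine.apply([-k for k in kick])  # restore the original settings
        for i in range(n_sen):
            R[i][j] = (resp[i] - base[i]) / delta
    return R
```

Once R is measured, a correction step applies actuator changes that (via the pseudo-inverse of R) cancel the observed orbit or energy error; the empirical matrix automatically absorbs modelling errors that a computed matrix would miss.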

  13. Real-time perspective correction in video stream

    Directory of Open Access Journals (Sweden)

    Glagolev Vladislav

    2018-01-01

    Full Text Available The paper describes an algorithm used for software perspective correction. The algorithm uses the camera's orientation angles and transforms the coordinates of pixels on a source image to coordinates on a virtual image from a camera whose focal plane is perpendicular to the gravity vector. This algorithm can be used as a low-cost replacement for a gyrostabilizer in applications that preclude moving parts or heavy and expensive equipment.
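
    For a pure camera rotation, the pixel remapping described above is a homography H = K R K⁻¹ built from the orientation angles; a minimal sketch (the axis conventions, roll/pitch factorization and pinhole intrinsics here are assumptions for illustration, not taken from the paper):

```python
from math import cos, sin

def mat_mul(A, B):
    # 3x3 matrix product.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def correct_pixel(u, v, roll, pitch, f, cx, cy):
    # Map a source pixel (u, v) to the virtual "level" camera via the
    # pure-rotation homography H = K R K^-1, with focal length f and
    # principal point (cx, cy).
    K = [[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]]
    Kinv = [[1.0 / f, 0.0, -cx / f], [0.0, 1.0 / f, -cy / f], [0.0, 0.0, 1.0]]
    Rx = [[1.0, 0.0, 0.0],                       # pitch about the x axis
          [0.0, cos(pitch), -sin(pitch)],
          [0.0, sin(pitch), cos(pitch)]]
    Rz = [[cos(roll), -sin(roll), 0.0],          # roll about the optical axis
          [sin(roll), cos(roll), 0.0],
          [0.0, 0.0, 1.0]]
    H = mat_mul(K, mat_mul(mat_mul(Rx, Rz), Kinv))
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w  # homogeneous divide gives the warped pixel
```

With zero angles the mapping is the identity; a non-zero pitch shifts pixels along the vertical image axis, which is the tilt the algorithm removes.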

  14. ATLAS software stack on ARM64

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00529764; The ATLAS collaboration; Stewart, Graeme; Seuster, Rolf; Quadt, Arnulf

    2017-01-01

    This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustments. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS7. Selected results from the validation of the physics outputs on these ARM 64-bit servers will be shown. CPU, memory and IO intensive benchmarks using ATLAS specific environment and infrastructure have been performed, with a particular emphasis on the performance vs. energy consumption.

  15. ATLAS software stack on ARM64

    Science.gov (United States)

    Smith, Joshua Wyatt; Stewart, Graeme A.; Seuster, Rolf; Quadt, Arnulf; ATLAS Collaboration

    2017-10-01

    This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustments. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS7. Selected results from the validation of the physics outputs on these ARM 64-bit servers will be shown. CPU, memory and IO intensive benchmarks using ATLAS specific environment and infrastructure have been performed, with a particular emphasis on the performance vs. energy consumption.

  16. Survey of Verification and Validation Techniques for Small Satellite Software Development

    Science.gov (United States)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.

  17. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    Directory of Open Access Journals (Sweden)

    Xiaoquan Kong

    Full Text Available It is important to easily and efficiently obtain high-quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL licensed), allowing anyone to access and manipulate the source code. SDMdata is available online free of charge from .
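
    The coordinate-accuracy check can be illustrated with a simple range screen (a hypothetical sketch; SDMdata's actual checks also cover species-name matching against GBIF, which is not reproduced here):

```python
def valid_coordinate(lat, lon):
    # Latitude must lie in [-90, 90] and longitude in [-180, 180];
    # anything outside is a data-entry or parsing error.
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

def screen_records(records):
    # Split (name, lat, lon) occurrence records into clean and suspect sets.
    clean, suspect = [], []
    for name, lat, lon in records:
        (clean if valid_coordinate(lat, lon) else suspect).append((name, lat, lon))
    return clean, suspect
```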

  18. Software Dependability and Safety Evaluations ESA's Initiative

    Science.gov (United States)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate Dependability and Safety methods of Software. The objectives of this initiative are: · More extensive validation of Safety and Dependability techniques for Software · Provide valuable results to improve the quality of the Software thus promoting the application of Dependability and Safety methods and techniques. ESA space systems are being developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and by doing that the critical sub-systems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of the software development requires the development of a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements. These non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is more and more used in critical functions. Also the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system and to verify/validate that the implemented software systems comply with these requirements [R1].

  19. Software analysis by simulation for nuclear plant availability and safety goals

    International Nuclear Information System (INIS)

    Lapassat, A.M.; Segalard, J.; Salichon, M.; Le Meur, M.; Boulc'h, J.

    1988-01-01

    The use of microprocessors for the monitoring, protection and safety of nuclear reactors became a reality in the eighties. The authorities responsible for reactor safety systems have recognised the necessity of correct functioning of reactor control systems. The problems raised by the analysis of software first led us to develop an entirely software-based tool for the verification and validation of programs and specifications. The CEA (French Atomic Energy Commission), responsible for reliable distributed techniques in nuclear plants, discusses in this paper the software test and simulation tools used to analyse real-time software. The O.S.T. tool is part of a larger programme of support for the design and evaluation of system fault tolerance, of which the European ESPRIT project SMART no. 1609 (System Measurement and Architecture Technique) will be the kernel.

  20. Correction tool for Active Shape Model based lumbar muscle segmentation.

    Science.gov (United States)

    Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio

    2015-08-01

    In the clinical environment, the accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomic image segmentation, time-effective correction tools are commonly needed to improve segmentation results. These tools must therefore provide fast corrections with a low number of interactions and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented into a software tool and has been evaluated for the task of lumbar muscle segmentation from Magnetic Resonance Images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result with an average Dice coefficient of 0.92±0.03.
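The Dice coefficient used above to score segmentation quality can be sketched as follows (the masks are hypothetical toy data, not the paper's lumbar muscle labels):

```python
def dice(a, b):
    """Dice similarity coefficient between two binary masks given as voxel sets."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

manual    = {(0, 0), (0, 1), (1, 0), (1, 1)}   # reference segmentation
corrected = {(0, 1), (1, 0), (1, 1), (2, 1)}   # tool-assisted segmentation
print(round(dice(manual, corrected), 2))  # 0.75
```

A value of 1.0 means perfect overlap; the paper's reported 0.92±0.03 indicates the corrected contours stay very close to the manual reference.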

  1. Prediction of software operational reliability using testing environment factor

    International Nuclear Information System (INIS)

    Jung, Hoan Sung

    1995-02-01

    Software reliability is especially important to customers these days. The need to quantify the software reliability of safety-critical systems has received special attention, and reliability is rated as one of software's most important attributes. Since software is an intellectual product of human activity and is logically complex, failures are inevitable. No standard models have been established to prove correctness and to estimate the reliability of software systems by analysis and/or testing. For many years research has focused on the quantification of software reliability, and many models have been developed for this purpose. Most software reliability models estimate reliability from failure data collected during testing, assuming that the test environment well represents the operational profile. However, the user's interest is in operational reliability rather than test reliability, and experience shows that operational reliability is higher than test reliability. Under the assumption that this difference results from the change of environment, a testing environment factor comprising an aging factor and a coverage factor is defined in this work to predict the ultimate operational reliability from the failure data, by incorporating test environments applied beyond the operational profile into the testing environment factor. Test reliability can also be estimated with this approach without any model change. The application results are close to the actual data. The approach used in this thesis is expected to be applicable to ultra-high-reliability software systems used in nuclear power plants, airplanes, and other safety-critical applications.
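The idea of scaling test-phase failure data by a testing environment factor can be illustrated with a simple exponential reliability model (the failure rate and factor value below are illustrative, not the thesis's data):

```python
import math

def reliability(failure_rate, mission_time):
    # Exponential model: probability of failure-free operation over mission_time
    return math.exp(-failure_rate * mission_time)

lam_test = 0.01    # failures/hour observed during testing (illustrative)
env_factor = 4.0   # testing environment factor: aging x coverage; > 1 when
                   # tests stress the software beyond the operational profile
lam_op = lam_test / env_factor

print(reliability(lam_test, 100.0))  # test reliability over 100 h
print(reliability(lam_op, 100.0))    # predicted operational reliability (higher)
```

Because the factor is greater than one, the predicted operational reliability exceeds the test reliability, matching the observation quoted in the abstract.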

  2. Bringing Model Checking Closer to Practical Software Engineering

    CERN Document Server

    AUTHOR|(CDS)2079681; Templon, J A; Willemse, T.A.C.

    Software grows in size and complexity, making it increasingly challenging to ensure that it behaves correctly. This is especially true for distributed systems, where a multitude of components run concurrently, making it difficult to anticipate all the possible behaviors emerging in the system as a whole. Certain design errors, such as deadlocks and race conditions, can often go unnoticed when testing is the only form of verification employed in the software engineering life-cycle. Even when bugs are detected in running software, revealing the root cause and reproducing the behavior can be time consuming (and even impossible), given the lack of control the engineer has over the execution of the concurrent components, as well as the number of possible scenarios that could have produced the problem. This is especially pronounced for large-scale distributed systems such as the Worldwide Large Hadron Collider Computing Grid. Formal verification methods offer more rigorous means of determining whether a system sat...

  3. Optimal structure of fault-tolerant software systems

    International Nuclear Information System (INIS)

    Levitin, Gregory

    2005-01-01

    This paper considers software systems consisting of fault-tolerant components. These components are built from functionally equivalent but independently developed versions characterized by different reliability and execution time. Because of hardware resource constraints, the number of versions that can run simultaneously is limited. The expected system execution time and its reliability (defined as the probability of obtaining the correct output within a specified time) strictly depend on the parameters of the software versions and the sequence of their execution. A system structure optimization problem is formulated in which one has to choose software versions for each component and find the sequence of their execution in order to achieve the greatest system reliability subject to cost constraints. The versions are to be chosen from a list of available products, each characterized by its reliability, execution time and cost. The suggested optimization procedure is based on an algorithm for determining the system execution time distribution that uses the moment generating function approach, and on a genetic algorithm. Both N-version programming and the recovery block scheme are considered within a universal model. An illustrative example is presented.
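A brute-force version of the execution-order problem for a single recovery-block component can be sketched as follows (the version parameters are invented; the paper's actual procedure uses moment generating functions and a genetic algorithm rather than enumeration):

```python
from itertools import permutations

# Each version: (reliability, execution time, cost) -- illustrative values
versions = [(0.90, 2.0, 5), (0.95, 4.0, 9), (0.80, 1.0, 2)]

def recovery_block(sequence, deadline):
    """P(correct output within the deadline) when versions run one after
    another until the first success (recovery-block scheme)."""
    p_ok, elapsed, p_all_failed = 0.0, 0.0, 1.0
    for rel, exec_time, _cost in sequence:
        elapsed += exec_time
        if elapsed > deadline:
            break
        p_ok += p_all_failed * rel
        p_all_failed *= 1.0 - rel
    return p_ok

best = max(permutations(versions), key=lambda s: recovery_block(s, 5.0))
print(round(recovery_block(best, 5.0), 4))  # 0.99
```

Even in this tiny instance the ordering matters: running the fastest version first leaves time for a second attempt before the deadline, which is exactly the trade-off the paper optimizes.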

  4. SWEPP gamma-ray spectrometer system software test plan and report

    International Nuclear Information System (INIS)

    Femec, D.A.

    1994-09-01

    The SWEPP Gamma-Ray Spectrometer (SGRS) System has been developed by the Radiation Measurements and Development Unit of the Idaho National Engineering Laboratory to assist in the characterization of the radiological contents of contact-handled waste containers at the Stored Waste Examination Pilot Plant (SWEPP). In addition to determining the concentrations of gamma-ray-emitting radionuclides, the software also calculates attenuation-corrected isotopic mass ratios of specific interest, and provides controls for SGRS hardware as required. This document presents the test plan and report for the data acquisition and analysis software associated with the SGRS system
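The attenuation correction underlying such isotopic measurements follows the Beer-Lambert law; a minimal sketch (with invented coefficients, not SGRS calibration data):

```python
import math

def attenuation_corrected(observed_rate, mu, path_length):
    """Recover the unattenuated count rate from an observed gamma count rate.

    Beer-Lambert: I = I0 * exp(-mu * x)  =>  I0 = I * exp(mu * x)
    mu: linear attenuation coefficient (1/cm); path_length in cm.
    """
    return observed_rate * math.exp(mu * path_length)

# 120 counts/s seen through 4 cm of waste matrix with mu = 0.15 /cm (illustrative)
print(round(attenuation_corrected(120.0, 0.15, 4.0), 1))
```

Ratios of corrected rates for two gamma lines then give attenuation-corrected isotopic mass ratios of the kind the SGRS software reports.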

  5. A quantitative comparison of corrective and perfective maintenance

    Science.gov (United States)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  6. Software maintenance in scientific and engineering environments: An introduction and guide

    Science.gov (United States)

    Wright, David

    1986-01-01

    The purpose of software maintenance techniques is addressed. The aims of perfective, adaptive and corrective software maintenance are defined and discussed, especially in the NASA research environment. Areas requiring maintenance are identified, available tools are described, and suggestions for their use are made. Stress is placed on the organizational aspect of maintenance at both the individual and group level. Particular emphasis is placed on the use of various forms of documentation as the basis around which to organize. Finally, suggestions are given on how to proceed in the partial or complete absence of such documentation.

  7. Efficient color correction method for smartphone camera-based health monitoring application.

    Science.gov (United States)

    Duc Dang; Chae Ho Cho; Daeik Kim; Oh Seok Kwon; Jo Woon Chong

    2017-07-01

    Smartphone health monitoring applications have recently been highlighted due to the rapid development of smartphone hardware and software performance. However, the color characteristics of images captured by different smartphone models differ from one another, and this difference may give non-identical health monitoring results when such applications monitor physiological information using their embedded cameras. In this paper, we investigate the differences in color properties of the captured images from different smartphone models and apply a color correction method to adjust the dissimilar color values obtained from different smartphone cameras. Experimental results show that the color-corrected images obtained using the correction method have much smaller color intensity errors compared to the images without correction. These results can be applied to enhance the consistency of smartphone camera-based health monitoring applications by reducing color intensity errors among the images obtained from different smartphones.
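One simple form of such a correction is a per-channel least-squares fit of gain and offset against a reference chart; the channel values below are invented for illustration, not the paper's measurements:

```python
def fit_channel_correction(measured, reference):
    """Least-squares gain/offset for one colour channel: ref ~ gain*meas + offset."""
    n = len(measured)
    mx, my = sum(measured) / n, sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in measured)
    sxy = sum((x - mx) * (y - my) for x, y in zip(measured, reference))
    gain = sxy / sxx
    return gain, my - gain * mx

phone_red = [52, 110, 168, 231]   # red channel as captured by one phone model
chart_red = [60, 120, 180, 240]   # known reference chart values
g, o = fit_channel_correction(phone_red, chart_red)
corrected = [g * v + o for v in phone_red]
print(round(g, 3), round(o, 2))
```

Applying the same fit to each channel of each phone model maps all devices onto a common reference, which is the consistency goal described in the abstract.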

  8. The NOvA software testing framework

    International Nuclear Information System (INIS)

    Tamsett, M; Group, C

    2015-01-01

    The NOvA experiment at Fermilab is a long-baseline neutrino experiment designed to study νe appearance in a νμ beam. NOvA has already produced more than one million Monte Carlo and detector generated files amounting to more than 1 PB in size. This data is divided between a number of parallel streams such as far and near detector beam spills, cosmic ray backgrounds, a number of data-driven triggers and over 20 different Monte Carlo configurations. Each of these data streams must be processed through the appropriate steps of the rapidly evolving, multi-tiered, interdependent NOvA software framework. In total there are more than 12 individual software tiers, each of which performs a different function and can be configured differently depending on the input stream. In order to regularly test and validate that all of these software stages are working correctly, NOvA has designed a powerful, modular testing framework that enables detailed validation and benchmarking to be performed in a fast, efficient and accessible way with minimal expert knowledge. The core of this system is a novel series of Python modules which wrap, monitor and handle the underlying C++ software framework and then report the results to a slick front-end web-based interface. This interface utilises modern, cross-platform visualisation libraries to render the test results in a meaningful way. They are fast and flexible, allowing for the easy addition of new tests and datasets. In total, upwards of 14 individual streams are regularly tested, amounting to over 70 individual software processes producing over 25 GB of output files. The rigour enforced through this flexible testing framework enables NOvA to rapidly verify configurations, results and software, and thus ensure that data is available for physics analysis in a timely and robust manner. (paper)
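The wrap-and-monitor pattern described here can be sketched in miniature; this is a hypothetical harness in the spirit of the framework, not NOvA's actual code:

```python
import subprocess
import sys
import time

def run_tier(name, cmd):
    """Run one software tier as a subprocess and collect a simple test record."""
    start = time.time()
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return {"tier": name,
            "ok": proc.returncode == 0,
            "seconds": round(time.time() - start, 2),
            "stdout": proc.stdout}

# A stand-in "tier": any command-line processing stage can be monitored this way
results = [run_tier("demo-tier", [sys.executable, "-c", "print('processed')"])]
print(all(r["ok"] for r in results))  # True
```

Collecting such records per tier and per stream is what lets a front end render pass/fail status and timing without expert knowledge of the underlying C++ framework.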

  9. E-COCOMO: The Extended COst Constructive MOdel for Cleanroom Software Engineering

    Directory of Open Access Journals (Sweden)

    Hitesh KUMAR SHARMA

    2014-02-01

    Mistakes create rework. Rework takes time and increases costs. The traditional software engineering methodology defines the ratio of Design:Code:Test as 40:20:40. Since 40% of the time and effort in the traditional approach is spent in the testing phase, rework must be performed whenever bugs are found during testing. This rework comes after the design and code phases and increases the cost exponentially. The cleanroom software engineering methodology controls this exponential growth in cost by removing the rework: it says "do the work correctly in the first attempt and move to the next phase only after obtaining a proof of correctness". This approach minimizes rework and reduces cost accordingly. Due to the removal of the testing phase, the COCOMO (COst COnstructive MOdel) used in traditional engineering is not directly applicable to cleanroom software engineering, and the traditional cost drivers used in COCOMO need to be revised. We have proposed an extended version of COCOMO (E-COCOMO) incorporating new cost drivers. This paper explains the proposed E-COCOMO and gives a detailed description of the proposed new cost drivers.
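The multiplicative structure that E-COCOMO extends can be sketched with the intermediate-COCOMO effort equation; the driver named PCOR below is a hypothetical cleanroom-style driver for illustration, not one of the paper's actual values:

```python
def cocomo_effort(kloc, drivers, a=3.0, b=1.12):
    """Intermediate-COCOMO style estimate: effort = a * KLOC^b * EAF,
    where EAF is the product of the cost-driver multipliers."""
    eaf = 1.0
    for multiplier in drivers.values():
        eaf *= multiplier
    return a * kloc ** b * eaf   # person-months

# 32 KLOC, semi-detached mode constants, with an invented cleanroom driver PCOR
effort = cocomo_effort(32, {"RELY": 1.15, "ACAP": 0.86, "PCOR": 0.90})
print(round(effort, 1))
```

Extending the model, as the paper proposes, amounts to adding new multipliers to the `drivers` product rather than changing the nominal-effort equation.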

  10. An integrated software testing framework for FPGA-based controllers in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eun Sub; Yoo, Jun Beom; Lee, Young Jun; Choi, Jong Gyun

    2016-01-01

    Field-programmable gate arrays (FPGAs) have received much attention from the nuclear industry as an alternative platform to programmable logic controllers for digital instrumentation and control. The software aspect of FPGA development consists of several steps of synthesis and refinement, and also requires verification activities, such as simulations that are performed individually at each step. This study proposed an integrated software-testing framework for simulating all artifacts of the FPGA software development simultaneously and evaluating whether all artifacts work correctly using common oracle programs. This method also generates a massive number of meaningful simulation scenarios that reflect reactor shutdown logics. The experiment, which was performed on two FPGA software implementations, showed that it can dramatically save both time and costs
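The common-oracle idea, running every development artifact on the same scenarios and comparing outputs, can be sketched as follows (the shutdown logic, limits, and the seeded boundary bug are all invented):

```python
import random

def oracle(temp, pressure):
    """Reference shutdown logic: trip when either parameter exceeds its limit."""
    return temp > 350 or pressure > 150

def netlist_sim(temp, pressure):        # stands in for the synthesized netlist
    return temp > 350 or pressure > 150

def behavioural_sim(temp, pressure):    # stands in for the HDL source simulation
    return temp >= 350 or pressure > 150   # seeded off-by-one boundary bug

random.seed(1)
scenarios = [(random.randint(300, 400), random.randint(100, 200))
             for _ in range(1000)]
for impl in (netlist_sim, behavioural_sim):
    mismatches = sum(impl(t, p) != oracle(t, p) for t, p in scenarios)
    print(impl.__name__, mismatches)
```

Running all refinement stages against one oracle on a large, shutdown-logic-shaped scenario set is what lets boundary discrepancies between stages surface automatically.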

  11. Motion correction of PET brain images through deconvolution: I. Theoretical development and analysis in software simulations

    Science.gov (United States)

    Faber, T. L.; Raghunath, N.; Tudorascu, D.; Votaw, J. R.

    2009-02-01

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. Existing correction methods that use known patient motion obtained from tracking devices either require multi-frame acquisitions, detailed knowledge of the scanner, or specialized reconstruction algorithms. A deconvolution algorithm has been developed that alleviates these drawbacks by using the reconstructed image to estimate the original non-blurred image using maximum-likelihood expectation maximization (MLEM) techniques. A high-resolution digital phantom was created by shape-based interpolation of the digital Hoffman brain phantom. Three different sets of 20 movements were applied to the phantom. For each frame of the motion, sinograms with attenuation and three levels of noise were simulated and then reconstructed using filtered backprojection. The average of the 20 frames was considered the motion-blurred image, which was restored with the deconvolution algorithm. After correction, contrast increased from a mean of 2.0, 1.8 and 1.4 in the motion-blurred images, for the three increasing amounts of movement, to a mean of 2.5, 2.4 and 2.2. Mean error was reduced by an average of 55% with motion correction. In conclusion, deconvolution can be used for correction of motion blur when subject motion is known.
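The deconvolution idea can be illustrated in one dimension with a Richardson-Lucy (MLEM-style) iteration and a known uniform motion kernel; the toy profile below is not the Hoffman phantom data:

```python
def convolve(signal, kernel):
    """Truncated 1-D convolution (same length as the input signal)."""
    n, half = len(signal), len(kernel) // 2
    out = [0.0] * n
    for i in range(n):
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < n:
                out[i] += signal[idx] * k
    return out

def richardson_lucy(blurred, kernel, iterations=200):
    """MLEM-style restoration of a non-negative signal with a known blur kernel."""
    estimate = [1.0] * len(blurred)
    mirrored = kernel[::-1]
    for _ in range(iterations):
        predicted = convolve(estimate, kernel)
        ratio = [b / p if p > 1e-12 else 0.0 for b, p in zip(blurred, predicted)]
        correction = convolve(ratio, mirrored)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

truth = [0, 0, 0, 10, 10, 10, 0, 0, 0]   # sharp 1-D activity profile
psf = [1 / 3, 1 / 3, 1 / 3]              # uniform 3-sample motion blur
blurred = convolve(truth, psf)
restored = richardson_lucy(blurred, psf)
print([round(x, 1) for x in restored])
```

Averaging motion frames plays the role of `convolve(truth, psf)` here; because the motion (and hence the kernel) is known from tracking, the iteration can undo the blur after reconstruction instead of inside it.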

  12. Motion correction of PET brain images through deconvolution: I. Theoretical development and analysis in software simulations

    Energy Technology Data Exchange (ETDEWEB)

    Faber, T L; Raghunath, N; Tudorascu, D; Votaw, J R [Department of Radiology, Emory University Hospital, 1364 Clifton Road, N.E. Atlanta, GA 30322 (United States)], E-mail: tfaber@emory.edu

    2009-02-07

    Image quality is significantly degraded even by small amounts of patient motion in very high-resolution PET scanners. Existing correction methods that use known patient motion obtained from tracking devices either require multi-frame acquisitions, detailed knowledge of the scanner, or specialized reconstruction algorithms. A deconvolution algorithm has been developed that alleviates these drawbacks by using the reconstructed image to estimate the original non-blurred image using maximum-likelihood expectation maximization (MLEM) techniques. A high-resolution digital phantom was created by shape-based interpolation of the digital Hoffman brain phantom. Three different sets of 20 movements were applied to the phantom. For each frame of the motion, sinograms with attenuation and three levels of noise were simulated and then reconstructed using filtered backprojection. The average of the 20 frames was considered the motion-blurred image, which was restored with the deconvolution algorithm. After correction, contrast increased from a mean of 2.0, 1.8 and 1.4 in the motion-blurred images, for the three increasing amounts of movement, to a mean of 2.5, 2.4 and 2.2. Mean error was reduced by an average of 55% with motion correction. In conclusion, deconvolution can be used for correction of motion blur when subject motion is known.

  13. Nurturing reliable and robust open-source scientific software

    Science.gov (United States)

    Uieda, L.; Wessel, P.

    2017-12-01

    Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo.

  14. Global Software Engineering: A Software Process Approach

    Science.gov (United States)

    Richardson, Ita; Casey, Valentine; Burton, John; McCaffery, Fergal

    Our research has shown that many companies are struggling with the successful implementation of global software engineering, due to temporal, cultural and geographical distance, which causes a range of factors to come into play. For example, cultural, project management and communication difficulties continually cause problems for software engineers and project managers. While the implementation of efficient software processes can be used to improve the quality of the software product, published software process models do not cater explicitly for the recent growth in global software engineering. Our thesis is that global software engineering factors should be included in software process models to ensure their continued usefulness in global organisations. Based on extensive global software engineering research, we have developed a software process, Global Teaming, which includes specific practices and sub-practices. The purpose is to ensure that requirements for successful global software engineering are stipulated so that organisations can ensure successful implementation of global software engineering.

  15. An experience of qualified preventive screening: shiraz smart screening software.

    Science.gov (United States)

    Islami Parkoohi, Parisa; Zare, Hashem; Abdollahifard, Gholamreza

    2015-01-01

    Computerized preventive screening software is a cost-effective intervention tool to address non-communicable chronic diseases. Shiraz Smart Screening Software (SSSS) was developed as an innovative tool for qualified screening. It allows simultaneous smart screening of several high-burden chronic diseases and supports reminder notification functionality. The extent to which SSSS affects screening quality is also described. Following software development, preventive screening and annual health examinations of 261 school staff (Medical School of Shiraz, Iran) were carried out in a software-assisted manner. To evaluate the quality of the software-assisted screening, we used a quasi-experimental study design and determined coverage, irregular attendance and inappropriateness proportions for the manual and software-assisted screening, as well as the corresponding number of requested tests. With the manual screening method, 27% of employees were covered (with 94% irregular attendance), while with software-assisted screening the coverage proportion was 79% (attendance status will become clear after the specified time). The frequency of inappropriate screening test requests, before the software implementation, was 41.37% for fasting plasma glucose, 41.37% for lipid profile, 0.84% for occult blood, 0.19% for flexible sigmoidoscopy/colonoscopy, 35.29% for Pap smear, 19.20% for mammography and 11.2% for prostate specific antigen. All of the above were corrected by the software application. In total, 366 manual screening and 334 software-assisted screening tests were requested. SSSS is an innovative tool to improve the quality of preventive screening plans in terms of increased screening coverage and reduction in inappropriateness and in the total number of requested tests.

  16. Motion correction in thoracic positron emission tomography

    CERN Document Server

    Gigengack, Fabian; Dawood, Mohammad; Schäfers, Klaus P

    2015-01-01

    Respiratory and cardiac motion leads to image degradation in Positron Emission Tomography (PET), which impairs quantification. In this book, the authors present approaches to motion estimation and motion correction in thoracic PET. The approaches for motion estimation are based on dual gating and mass-preserving image registration (VAMPIRE) and mass-preserving optical flow (MPOF). With mass-preservation, image intensity modulations caused by highly non-rigid cardiac motion are accounted for. Within the image registration framework, different data terms, different variants of regularization, and parametric and non-parametric motion models are examined. Within the optical flow framework, different data terms and further non-quadratic penalization are also discussed. The approaches for motion correction particularly focus on pipelines in dual gated PET. A quantitative evaluation of the proposed approaches is performed on software phantom data with accompanying ground-truth motion information. Further, clinical appl...

  17. Monitoring the software quality in FairRoot

    Energy Technology Data Exchange (ETDEWEB)

    Uhlig, Florian; Al-Turany, Mohammad [GSI, Darmstadt (Germany)

    2010-07-01

    Up-to-date information about a software project helps to find problems as early as possible. This includes, for example, information on whether a software project can be built on all supported platforms without errors, or whether specified tests can be executed and deliver the correct results. We present the scheme which is used within the FairRoot framework to continuously monitor the status of the project. The tools used for these tasks are based on the open-source tools CMake and CDash. CMake is used to generate standard build files for the different operating systems/compilers out of simple configuration files and to steer the build and test processes. The generated information is sent to a central CDash server. From the generated web pages, information about the status of the project at any given time can be obtained.

  18. Software engineering architecture-driven software development

    CERN Document Server

    Schmidt, Richard F

    2013-01-01

    Software Engineering: Architecture-driven Software Development is the first comprehensive guide to the underlying skills embodied in the IEEE's Software Engineering Body of Knowledge (SWEBOK) standard. Standards expert Richard Schmidt explains the traditional software engineering practices recognized for developing projects for government or corporate systems. Software engineering education often lacks standardization, with many institutions focusing on implementation rather than design as it impacts product architecture. Many graduates join the workforce with incomplete skil

  19. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    International Nuclear Information System (INIS)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-01-01

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  20. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer.

    Science.gov (United States)

    La Macchia, Mariangela; Fellin, Francesco; Amichetti, Maurizio; Cianchetti, Marco; Gianolini, Stefano; Paola, Vitali; Lomax, Antony J; Widesott, Lamberto

    2012-09-18

    To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  1. Systematic evaluation of three different commercial software solutions for automatic segmentation for adaptive therapy in head-and-neck, prostate and pleural cancer

    Directory of Open Access Journals (Sweden)

    La Macchia Mariangela

    2012-09-01

    Purpose: To validate, in the context of adaptive radiotherapy, three commercial software solutions for atlas-based segmentation. Methods and materials: Fifteen patients, five for each group, with cancer of the Head&Neck, pleura, and prostate were enrolled in the study. In addition to the treatment planning CT (pCT) images, one replanning CT (rCT) image set was acquired for each patient during the RT course. Three experienced physicians outlined on the pCT and rCT all the volumes of interest (VOIs). We used three software solutions (VelocityAI 2.6.2 (V), MIM 5.1.1 (M) by MIMVista and ABAS 2.0 (A) by CMS-Elekta) to generate the automatic contouring on the repeated CT. All the VOIs obtained with automatic contouring (AC) were successively corrected manually. We recorded the time needed for: 1) ex novo ROIs definition on rCT; 2) generation of AC by the three software solutions; 3) manual correction of AC. To compare the quality of the volumes obtained automatically by the software and manually corrected with those drawn from scratch on rCT, we used the following indexes: overlap coefficient (DICE), sensitivity, inclusiveness index, difference in volume, and displacement differences on three axes (x, y, z) from the isocenter. Results: The time saved by the three software solutions for all the sites, compared to the manual contouring from scratch, is statistically significant and similar for all three software solutions. The time saved for each site is as follows: about an hour for Head&Neck, about 40 minutes for prostate, and about 20 minutes for mesothelioma. The best DICE similarity coefficient index was obtained with the manual correction for: A (contours for prostate), A and M (contours for H&N), and M (contours for mesothelioma). Conclusions: From a clinical point of view, the automated contouring workflow was shown to be significantly shorter than the manual contouring process, even though manual correction of the VOIs is always needed.

  2. Designing a Signal Conditioning System with Software Calibration for Resistor-feedback Patch Clamp Amplifier.

    Science.gov (United States)

    Hu, Gang; Zhu, Quanhui; Qu, Anlian

    2005-01-01

    In this paper, a programmable signal conditioning system based on software calibration for a resistor-feedback patch clamp amplifier (PCA) is described. The system is mainly composed of frequency correction, programmable gain, and filter stages whose parameters are configured automatically by software to minimize errors. A lab-designed data acquisition (DAQ) system is used to collect data and communicate with a PC. Laboratory test results show good agreement with the design specifications.

  3. Determining spherical lens correction for astronaut training underwater.

    Science.gov (United States)

    Porter, Jason; Gibson, C Robert; Strauss, Samuel

    2011-09-01

    To develop a model that will accurately predict the distance spherical lens correction to be worn by National Aeronautics and Space Administration astronauts while training underwater. The replica space suit's helmet contains curved visors that induce refractive power when submersed in water. Anterior surface powers and thicknesses were measured for the helmet's protective and inside visors. The impact of each visor on the helmet's refractive power in water was analyzed using thick-lens calculations and Zemax optical design software. Using geometrical optics approximations, a model was developed to determine the optimal distance spherical power to be worn underwater, based on the helmet's total induced spherical power underwater and the astronaut's manifest spectacle-plane correction in air. The validity of the model was tested using data from both eyes of 10 astronauts who trained underwater. The helmet's visors induced a total power of -2.737 D when placed underwater. The required underwater spherical correction (FW) was linearly related to the spectacle-plane spherical correction in air (FAir): FW = FAir + 2.356 D. The mean magnitude of the difference between the actual correction worn underwater and the calculated underwater correction was 0.20 ± 0.11 D, and the actual and calculated values were highly correlated (r = 0.971), with the difference in magnitude being small for 70% of eyes. The model accurately predicts the actual values worn underwater and can be applied more generally to determine a suitable spectacle lens correction to be worn behind other types of masks when submerged underwater.
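
The study's linear model is simple enough to apply directly; a minimal sketch (function and constant names invented) of the FW = FAir + 2.356 D relation:

```python
HELMET_INDUCED_POWER_OFFSET_D = 2.356  # from the study's linear model

def underwater_correction(spectacle_power_air_d):
    """Distance spherical correction (diopters) to wear inside the helmet
    underwater, given the manifest spectacle-plane correction in air.

    Linear model from the study: F_W = F_Air + 2.356 D.
    """
    return spectacle_power_air_d + HELMET_INDUCED_POWER_OFFSET_D

# e.g. a -3.00 D myope in air would wear about -0.64 D underwater
print(round(underwater_correction(-3.00), 3))  # -0.644
```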

  4. Use of voice recognition software in an outpatient pediatric specialty practice.

    Science.gov (United States)

    Issenman, Robert M; Jaffer, Iqbal H

    2004-09-01

    Voice recognition software (VRS), with specialized medical vocabulary, is being promoted to enhance physician efficiency, decrease costs, and improve patient safety. This study reports the experience of a pediatric subspecialist (pediatric gastroenterology) physician with the use of Dragon Naturally Speaking (version 6; ScanSoft Inc, Peabody, MA), incorporated for use with a proprietary electronic medical record, in a large university medical center ambulatory care service. After 2 hours of group orientation and 2 hours of individual VRS instruction, the physician trained the software for 1 month (30 letters) during a hospital slowdown. Set-up, dictation, and correction times for the physician and medical transcriptionist were recorded for these training sessions, as well as for 42 subsequently dictated letters. Figures were extrapolated to the yearly clinic volume for the physician, to estimate costs (physician: 110 dollars per hour; transcriptionist: 11 dollars per hour, US dollars). The use of VRS required an additional 200% of physician dictation and correction time (9 minutes vs 3 minutes), compared with the use of electronic signatures for letters typed by an experienced transcriptionist and imported into the electronic medical record. When the cost of the license agreement and the costs of physician and transcriptionist time were included, the use of the software cost 100% more, for the amount of dictation performed annually by the physician. VRS is an intriguing technology. It holds the possibility of streamlining medical practice. However, the learning curve and accuracy of the tested version of the software limit broad physician acceptance at this time.

  5. Guide to verification and validation of the SCALE-4 radiation shielding software

    Energy Technology Data Exchange (ETDEWEB)

    Broadhead, B.L.; Emmett, M.B.; Tang, J.S.

    1996-12-01

    Whenever a decision is made to newly install the SCALE radiation shielding software on a computer system, the user should run a set of verification and validation (V&V) test cases to demonstrate that the software is properly installed and functioning correctly. This report is intended to serve as a guide for this V&V in that it specifies test cases to run and gives expected results. The report describes the V&V that has been performed for the radiation shielding software in a version of SCALE-4. This report provides documentation of sample problems which are recommended for use in the V&V of the SCALE-4 system for all releases. The results reported in this document are from the SCALE-4.2P version which was run on an IBM RS/6000 work-station. These results verify that the SCALE-4 radiation shielding software has been correctly installed and is functioning properly. A set of problems for use by other shielding codes (e.g., MCNP, TWOTRAN, MORSE) performing similar V&V are discussed. A validation has been performed for XSDRNPM and MORSE-SGC6 utilizing SASI and SAS4 shielding sequences and the SCALE 27-18 group (27N-18COUPLE) cross-section library for typical nuclear reactor spent fuel sources and a variety of transport package geometries. The experimental models used for the validation were taken from two previous applications of the SASI and SAS4 methods.

  6. Guide to verification and validation of the SCALE-4 radiation shielding software

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Emmett, M.B.; Tang, J.S.

    1996-12-01

    Whenever a decision is made to newly install the SCALE radiation shielding software on a computer system, the user should run a set of verification and validation (V&V) test cases to demonstrate that the software is properly installed and functioning correctly. This report is intended to serve as a guide for this V&V in that it specifies test cases to run and gives expected results. The report describes the V&V that has been performed for the radiation shielding software in a version of SCALE-4. This report provides documentation of sample problems which are recommended for use in the V&V of the SCALE-4 system for all releases. The results reported in this document are from the SCALE-4.2P version which was run on an IBM RS/6000 work-station. These results verify that the SCALE-4 radiation shielding software has been correctly installed and is functioning properly. A set of problems for use by other shielding codes (e.g., MCNP, TWOTRAN, MORSE) performing similar V&V are discussed. A validation has been performed for XSDRNPM and MORSE-SGC6 utilizing SASI and SAS4 shielding sequences and the SCALE 27-18 group (27N-18COUPLE) cross-section library for typical nuclear reactor spent fuel sources and a variety of transport package geometries. The experimental models used for the validation were taken from two previous applications of the SASI and SAS4 methods.

  7. The Application of V&V within Reuse-Based Software Engineering

    Science.gov (United States)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process, since early discovery minimizes the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design, and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  8. Optimising MR perfusion imaging: comparison of different software-based approaches in acute ischaemic stroke

    Energy Technology Data Exchange (ETDEWEB)

    Schaafs, Lars-Arne [Charite-Universitaetsmedizin, Department of Radiology, Berlin (Germany); Charite-Universitaetsmedizin, Academic Neuroradiology, Department of Neurology and Center for Stroke Research, Berlin (Germany); Porter, David [Fraunhofer Institute for Medical Image Computing MEVIS, Bremen (Germany); Audebert, Heinrich J. [Charite-Universitaetsmedizin, Department of Neurology with Experimental Neurology, Berlin (Germany); Fiebach, Jochen B.; Villringer, Kersten [Charite-Universitaetsmedizin, Academic Neuroradiology, Department of Neurology and Center for Stroke Research, Berlin (Germany)

    2016-11-15

    Perfusion imaging (PI) is susceptible to confounding factors such as motion artefacts as well as delay and dispersion (D/D). We evaluate the influence of different post-processing algorithms on hypoperfusion assessment in PI analysis software packages in order to improve the clinical accuracy of stroke PI. Fifty patients with acute ischaemic stroke underwent MRI in the first 24 h after onset. Diverging approaches to motion and D/D correction were applied. The calculated MTT and CBF perfusion maps were assessed by volumetry of lesions and tested for agreement with a standard approach and with the final lesion volume (FLV) on day 6 in patients with persisting vessel occlusion. MTT map lesion volumes were significantly smaller throughout the software packages with correction of motion and D/D when compared to the commonly used approach with no correction (p = 0.001-0.022). Volumes on CBF maps did not differ significantly (p = 0.207-0.925). All packages with advanced post-processing algorithms showed a high level of agreement with FLV (ICC = 0.704-0.879). Correction of D/D had a significant influence on estimated lesion volumes and led to significantly smaller lesion volumes on MTT maps. This may improve patient selection. (orig.)

  9. Optimising MR perfusion imaging: comparison of different software-based approaches in acute ischaemic stroke

    International Nuclear Information System (INIS)

    Schaafs, Lars-Arne; Porter, David; Audebert, Heinrich J.; Fiebach, Jochen B.; Villringer, Kersten

    2016-01-01

    Perfusion imaging (PI) is susceptible to confounding factors such as motion artefacts as well as delay and dispersion (D/D). We evaluate the influence of different post-processing algorithms on hypoperfusion assessment in PI analysis software packages in order to improve the clinical accuracy of stroke PI. Fifty patients with acute ischaemic stroke underwent MRI in the first 24 h after onset. Diverging approaches to motion and D/D correction were applied. The calculated MTT and CBF perfusion maps were assessed by volumetry of lesions and tested for agreement with a standard approach and with the final lesion volume (FLV) on day 6 in patients with persisting vessel occlusion. MTT map lesion volumes were significantly smaller throughout the software packages with correction of motion and D/D when compared to the commonly used approach with no correction (p = 0.001-0.022). Volumes on CBF maps did not differ significantly (p = 0.207-0.925). All packages with advanced post-processing algorithms showed a high level of agreement with FLV (ICC = 0.704-0.879). Correction of D/D had a significant influence on estimated lesion volumes and led to significantly smaller lesion volumes on MTT maps. This may improve patient selection. (orig.)

  10. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited for applying to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target software system is the software code installed in an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software safety analysis by software FMEA, applied to the ATIP software code, which has been integrated and has passed through a very rigorous system test procedure, is proven to be able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.

  11. Co-verification of hardware and software for ARM SoC design

    CERN Document Server

    Andrews, Jason

    2004-01-01

    Hardware/software co-verification is how to make sure that embedded system software works correctly with the hardware, and that the hardware has been properly designed to run the software successfully - before large sums are spent on prototypes or manufacturing. This is the first book to apply this verification technique to the rapidly growing field of embedded systems-on-a-chip (SoC). As traditional embedded system design evolves into single-chip design, embedded engineers must be armed with the necessary information to make educated decisions about which tools and methodology to deploy. SoC verification requires a mix of expertise from the disciplines of microprocessor and computer architecture, logic design and simulation, and C and assembly language embedded software. Until now, the relevant information on how it all fits together has not been available. Andrews, a recognized expert, provides in-depth information about how co-verification really works, how to be successful using it, and pitfalls to avoid. H...

  12. Improvement of Nonlinearity Correction for BESIII ETOF Upgrade

    Science.gov (United States)

    Sun, Weijia; Cao, Ping; Ji, Xiaolu; Fan, Huanhuan; Dai, Hongliang; Zhang, Jie; Liu, Shubin; An, Qi

    2015-08-01

    An improved scheme to implement integral non-linearity (INL) correction of time measurements in the Beijing Spectrometer III Endcap Time-of-Flight (BESIII ETOF) upgrade system is presented in this paper. During the upgrade, multi-gap resistive plate chambers (MRPCs) are introduced as ETOF detectors, which increases the total number of time measurement channels to 1728. The INL correction method adopted in the BESIII TOF proved to be of limited use, because the sharply increased number of electronic channels required for reading out the detector strips severely degrades the system configuration efficiency. Furthermore, once installed into the spectrometer, the BESIII TOF electronics do not support online evaluation of the TDCs' nonlinearity. In the proposed method, the INL data used for the correction algorithm are automatically imported from a non-volatile read-only memory (ROM) instead of from the data acquisition software. This guarantees the real-time performance and system efficiency of the INL correction, especially for ETOF upgrades with a massive number of channels. Besides, a signal that is not synchronized to the 41.65 MHz system clock from BEPCII is sent to the front-end electronics (FEE) to simulate pseudo-random test pulses for the purpose of online nonlinearity evaluation. Test results show that the time-measurement INL errors in one module with 72 channels can be corrected online and in real time.
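
As a hedged sketch (bin width, table values, and names below are invented, not the ETOF firmware), a lookup-table INL correction of the kind loaded from an on-board ROM amounts to subtracting a per-bin deviation before converting to time:

```python
# Hypothetical LUT-based INL correction: the raw TDC bin indexes a table of
# per-bin deviations that would normally be loaded from the on-board ROM.

TDC_BIN_PS = 25.0  # assumed nominal bin width in picoseconds

# inl_table[bin] = deviation (in bins) of that bin's true position, as would
# be measured offline with a code-density (pseudo-random pulse) test
inl_table = [0.00, 0.12, -0.08, 0.05, -0.03]  # illustrative values only

def corrected_time_ps(raw_bin):
    """Apply the per-bin INL correction before converting bins to picoseconds."""
    return (raw_bin - inl_table[raw_bin]) * TDC_BIN_PS

print(corrected_time_ps(2))  # ≈ 52 ps (2.08 bins × 25 ps)
```

Keeping the table in non-volatile memory is what removes the per-channel configuration step from the data acquisition software.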

  13. Journal and Wave Bearing Impedance Calculation Software

    Science.gov (United States)

    Hanford, Amanda; Campbell, Robert

    2012-01-01

    The wave bearing software suite is a MATLAB application that computes bearing properties for user-specified wave bearing conditions, as well as for plain journal bearings. Wave bearings are fluid-film journal bearings with multi-lobed wave patterns around the circumference of the bearing surface. In this software suite, the dynamic coefficients are output in a form allowing easy implementation in a finite element model used in rotor dynamics analysis. The software has a graphical user interface (GUI) for inputting bearing geometry parameters, and uses MATLAB's structure interface for ease of interpreting data. This innovation was developed to provide the stiffness and damping components of wave bearing impedances. The computational method for computing bearing coefficients was originally designed for plain journal bearings and tilting-pad bearings. Modifications to include a wave bearing profile consisted of changing the film thickness profile, given by an equation, and writing an algorithm to locate the integration limits for each fluid region. Careful consideration was needed to implement the correct integration limits while computing the dynamic coefficients, depending on the form of the input/output variables specified in the algorithm.

  14. The D0 software trigger

    International Nuclear Information System (INIS)

    Linnemann, J.T.; Michigan State Univ., East Lansing, MI

    1992-10-01

    In the D0 experiment, the software filter operates in a processor farm, with each node processing a single event. Processing is data-driven: the filter does local processing to verify the candidates from the hardware trigger. The filter code consists of independent pieces called ''tools''; the processing for a given hardware bit is a ''script'' invoking one or more ''tools'' sequentially. An offline simulator drives the same code with the same configuration files, running on real or simulated data. Online tests use farm nodes running parasitically on the data stream. We discuss the performance of the system and how we attempt to verify its correctness.
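
The tools-and-scripts scheme the abstract describes can be sketched as follows (all tool names, trigger bits, and cuts are invented for illustration):

```python
# Toy sketch of a data-driven software filter: each hardware trigger bit
# selects a "script", an ordered list of "tools" run in sequence to confirm
# the hardware candidates.

def muon_candidate(event):          # hypothetical tool
    return event.get("muon_pt", 0.0) > 5.0

def track_match(event):             # hypothetical tool
    return event.get("n_tracks", 0) >= 1

SCRIPTS = {"MU_TRIG": [muon_candidate, track_match]}

def software_filter(hardware_bit, event):
    """An event passes the software trigger only if every tool accepts it."""
    return all(tool(event) for tool in SCRIPTS[hardware_bit])

print(software_filter("MU_TRIG", {"muon_pt": 7.2, "n_tracks": 3}))  # True
```

Because the scripts are plain data, the same configuration can drive an offline simulator and the online farm nodes identically, which is what makes the correctness checks described above possible.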

  15. Module Testing Techniques for Nuclear Safety Critical Software Using LDRA Testing Tool

    International Nuclear Information System (INIS)

    Moon, Kwon-Ki; Kim, Do-Yeon; Chang, Hoon-Seon; Chang, Young-Woo; Yun, Jae-Hee; Park, Jee-Duck; Kim, Jae-Hack

    2006-01-01

    The safety-critical software in the I&C systems of nuclear power plants requires high functional integrity and reliability. To achieve those requirement goals, the safety-critical software should be verified and tested according to related codes and standards through verification and validation (V&V) activities. Safety-critical software testing is performed at various stages during the development of the software, and is generally classified into three major activities: module testing, system integration testing, and system validation testing. Module testing involves the evaluation of module-level functions of hardware and software. System integration testing investigates the characteristics of a collection of modules and aims at establishing their correct interactions. System validation testing demonstrates that the complete system satisfies its functional requirements. In order to generate reliable software and reduce high maintenance costs, it is important that software testing is carried out at the module level. Module testing for nuclear safety-critical software has rarely been performed with formal and proven testing tools because of its various constraints. The LDRA testing tool is a widely used and proven tool set that provides powerful source code testing and analysis facilities for the V&V of general purpose software and safety-critical software. Use of the tool set is indispensable where software is required to be reliable and as error-free as possible, and its use brings in substantial time and cost savings, and efficiency.

  16. Sandia software guidelines: Software quality planning

    Energy Technology Data Exchange (ETDEWEB)

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  17. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analyze core photos and images, waveforms and NMR; and document external files. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam-assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting.

  18. Empirical Derivation of Correction Factors for Human Spiral Ganglion Cell Nucleus and Nucleolus Count Units.

    Science.gov (United States)

    Robert, Mark E; Linthicum, Fred H

    2016-01-01

    The profile count method for estimating cell number in sectioned tissue applies a correction factor for the double counting (resulting from transection during sectioning) of the count units selected to represent each cell. For human spiral ganglion cell counts, we attempted to address an apparent confusion between published correction factors for nucleus and nucleolus count units, which are identical despite the role of count-unit diameter in a commonly used correction factor formula. We examined a portion of a human cochlea to empirically derive correction factors for the two count units, using 3-dimensional reconstruction software to identify double counts. The study was conducted at the Neurotology and House Histological Temporal Bone Laboratory at the University of California, Los Angeles. Using a fully sectioned and stained human temporal bone, we identified and generated digital images of sections of the modiolar region of the lower first turn of the cochlea, identified count units with a light microscope, labeled them on corresponding digital sections, and used 3-dimensional reconstruction software to identify double-counted count units. For 25 consecutive sections, we determined that the double-count correction factors for the nucleus count unit (0.91) and the nucleolus count unit (0.92) matched the published factors. We discovered that nuclei, and therefore spiral ganglion cells, were undercounted by 6.3% when using nucleolus count units. We determined that correction factors for count units must include an element for the undercounting of spiral ganglion cells as well as the double-count element. We recommend a correction factor of 0.91 for the nucleus count unit and 0.98 for the nucleolus count unit when using 20-µm sections. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2015.
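
As a hedged illustration (the function and the raw counts are invented; the factor values are taken from the abstract), applying the recommended factors to a raw profile count is a single multiplication:

```python
# Recommended correction factors for 20-µm sections, per the abstract:
CORRECTION = {
    "nucleus": 0.91,    # double-count element only
    "nucleolus": 0.98,  # double-count (0.92) combined with the ~6.3% undercount
}

def estimated_cells(raw_profile_count, count_unit):
    """Corrected cell estimate from a raw profile count of the given unit."""
    return raw_profile_count * CORRECTION[count_unit]

print(estimated_cells(1000, "nucleus"))    # ≈ 910
print(estimated_cells(1000, "nucleolus"))  # ≈ 980
```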

  19. Software Tools for Emittance Measurement and Matching for 12 GeV CEBAF

    Energy Technology Data Exchange (ETDEWEB)

    Turner, Dennis L. [Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)

    2016-05-01

    This paper discusses model-driven setup of the Continuous Electron Beam Accelerator Facility (CEBAF) for the 12 GeV era, focusing on qsUtility. qsUtility is a set of software tools created to perform emittance measurements, analyze those measurements, and compute optics corrections based upon them. qsUtility was developed as a toolset to reduce machine configuration time and improve reproducibility by way of an accurate accelerator model, and to provide Operations staff with tools to measure and correct machine optics with little or no assistance from optics experts.

  20. Developing Formal Correctness Properties from Natural Language Requirements

    Science.gov (United States)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of the program to transform natural language specifications into formal notation; specifically, to automate the generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify the development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technological transfer potential, and next steps.
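
In the spirit of the automated generation described (the pattern table and proposition names below are invented, not the presentation's actual method), mapping a stock natural-language phrase to an LTL property can be as simple as template instantiation:

```python
# Tiny illustrative table of natural-language temporal phrases and the LTL
# formulas they instantiate (G = globally/always, F = finally/eventually).
PATTERNS = {
    "globally, if {p} then eventually {q}": "G({p} -> F({q}))",
    "{p} never occurs": "G(!{p})",
}

def to_ltl(pattern, **props):
    """Instantiate a known phrase pattern with concrete proposition names."""
    return PATTERNS[pattern].format(**props)

print(to_ltl("globally, if {p} then eventually {q}", p="request", q="grant"))
# G(request -> F(grant))
```

A real system would of course need parsing and disambiguation of free-form text rather than an exact-match table; this only shows the target of the translation.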

  1. Guidelines for evaluating software configuration management plans for digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Cheon, Se Woo; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Kim, Jang Yeon

    2001-08-01

    Software configuration management (SCM) is the process of identifying software configuration items (CIs), controlling the implementation of and changes to software, recording and reporting the status of changes, and verifying the completeness and correctness of the released software. SCM consists of two major aspects: planning and implementation. Effective SCM involves planning how activities are to be performed, and performing these activities in accordance with the plan. This report first reviews the background of SCM, including key standards, SCM disciplines, SCM basic functions, baselines, software entities, the SCM process, the implementation of SCM, and SCM tools. In turn, the report provides guidelines for evaluating the SCM Plan for digital I&C systems of nuclear power plants. Most of the guidelines in the report are based on IEEE Std 828 and ANSI/IEEE Std 1042. According to BTP-14, NUREG-0800, the evaluation topics on the SCM Plan are classified into three categories: management, implementation, and resource characteristics.

  2. Results of application of automatic computation of static corrections on data from the South Banat Terrain

    Science.gov (United States)

    Milojević, Slavka; Stojanovic, Vojislav

    2017-04-01

    Due to the continuous development of seismic acquisition and processing methods, increasing the signal-to-noise ratio is always a current target. The correct application of the latest software solutions improves the processing results and justifies their development. The correct computation and application of static corrections is one of the most important tasks in pre-processing, and this phase is of great importance for the subsequent processing steps. Static corrections are applied to seismic data in order to compensate for the effects of irregular topography, the difference between the elevations of source and receiver points relative to the reduction datum, and the near-surface low-velocity layer (weathering correction), or any other factors that influence the spatial and temporal position of seismic traces. The refraction statics method is the most common method for computing static corrections; it is successful both in resolving long-period statics problems and in determining statics differences caused by abrupt lateral velocity changes in the near-surface layer. XtremeGeo Flatirons™ is a program whose main purpose is the computation of static corrections through the refraction statics method, and it supports the following procedures: picking of first arrivals, checking of geometry, multiple methods for statics analysis and modelling, analysis of refractor anisotropy, and tomography (Eikonal tomography). The exploration area is located on the southern edge of the Pannonian Plain, in flat terrain with altitudes of 50 to 195 meters. The largest part of the exploration area covers Deliblato Sands, where the geological structure of the terrain and the large differences in altitude significantly affect the calculation of static corrections. The software also has powerful visualization and statistical analysis tools, which contribute to a significantly more accurate assessment of the near-surface geometry.
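
The elevation component of a static correction can be illustrated with a minimal sketch (this is the simple elevation-static formula, not the refraction-statics algorithm the software implements; all function names and values are invented):

```python
# Shift each trace so sources/receivers appear to sit on a flat datum,
# removing the travel time through topography and the low-velocity
# weathering layer.

def elevation_static_ms(surface_elev_m, datum_elev_m, weathering_thickness_m,
                        v_weathering_ms, v_replacement_ms):
    """Static shift in milliseconds for one source or receiver position.

    The travel time through the weathering layer is removed, and the
    remaining column down to the datum is evaluated at the sub-weathering
    (replacement) velocity.
    """
    t_weathering = weathering_thickness_m / v_weathering_ms
    bedrock_top_m = surface_elev_m - weathering_thickness_m
    t_replacement = (bedrock_top_m - datum_elev_m) / v_replacement_ms
    return -(t_weathering + t_replacement) * 1000.0

# e.g. a shot at 150 m elevation, 50 m datum, 20 m of 600 m/s weathering
# over 2000 m/s bedrock:
print(round(elevation_static_ms(150.0, 50.0, 20.0, 600.0, 2000.0), 2))  # -73.33
```

Refraction statics additionally estimate the weathering thickness and velocities themselves from first-arrival picks, which is where the picking and tomography procedures listed above come in.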

  3. Examining software complexity and quality for scientific software

    International Nuclear Information System (INIS)

    Kelly, D.; Shepard, T.

    2005-01-01

    Research has not found a simple relationship between software complexity and software quality, and in particular no relationship between commonly used software complexity metrics and the occurrence of software faults. A study with an example of scientific software from the nuclear power industry illustrates the importance of addressing cognitive complexity, the complexity related to understanding the intellectual content of the software. Simple practices such as aptly named variables contribute more to high-quality software than limiting code sizes does. This paper examines the research into complexity and quality and reports on a longitudinal study using the example of nuclear software. (author)

  4. Seeing atoms with aberration-corrected sub-Ångström electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    O'Keefe, Michael A. [Materials Science Division, Lawrence Berkeley National Laboratory, National Center for Electron Microscopy, 2R0200, 1 Cyclotron Road, Berkeley, CA 94720-8197 (United States)], E-mail: sub-Angstrom@comcast.net

    2008-02-15

    High-resolution electron microscopy is able to provide atomic-level characterization of many materials in low-index orientations. To achieve the same level of characterization in more complex orientations requires that instrumental resolution be improved to values corresponding to the sub-Angstroem separations of atom positions projected into these orientations. Sub-Angstroem resolution in the high-resolution transmission electron microscope has been achieved in the last few years by software aberration correction, electron holography, and hardware aberration correction; the so-called 'one-Angstroem barrier' has been left behind. Aberration correction of the objective lens currently allows atomic-resolution imaging at the sub-0.8 A level and is advancing towards resolutions in the deep sub-Angstroem range (near 0.5 A). At current resolution levels, images with sub-Rayleigh resolution require calibration in order to pinpoint atom positions correctly. As resolution levels approach the 'sizes' of atoms, the atoms themselves will produce a limit to resolution, no matter how much the instrumental resolution is improved. By arranging imaging conditions suitably, each atom peak in the image can be narrower, so atoms are imaged smaller and may be resolved at finer separations.

  5. Developing of an automation for therapy dosimetry systems by using labview software

    Science.gov (United States)

    Aydin, Selim; Kam, Erol

    2018-06-01

    Traceability, accuracy and consistency of radiation measurements are essential in radiation dosimetry, particularly in radiotherapy, where the outcome of treatments is highly dependent on the radiation dose delivered to patients. It is therefore very important to provide reliable, accurate and fast calibration services for therapy dosimeters, since the radiation dose delivered to a radiotherapy patient is directly related to the accuracy and reliability of these devices. In this study, we report the performance of in-house developed computer-controlled data acquisition and monitoring software for commercially available radiation therapy electrometers. The LabVIEW® software suite is used to provide reliable, fast and accurate calibration services. The software also collects environmental data such as temperature, pressure and humidity in order to use them in correction factor calculations. By using this software tool, better control over the calibration process is achieved and the need for human intervention is reduced. This is the first software that can control the dosimeter systems frequently used in the radiation therapy field at hospitals, such as Unidos Webline, Unidos E, Dose-1 and PC Electrometers.
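    For a vented ionization chamber, the environmental correction mentioned above is the standard air-density factor k_TP. A minimal sketch (the reference conditions of 20 °C and 101.325 kPa are a common convention assumed here, not a detail given in the record):

```python
def air_density_correction(temp_c, pressure_kpa, ref_temp_c=20.0, ref_pressure_kpa=101.325):
    """Temperature-pressure correction factor k_TP for a vented ionization chamber.

    Scales the reading to reference air density:
    k_TP = ((273.15 + T) / (273.15 + T0)) * (P0 / P).
    """
    return ((273.15 + temp_c) / (273.15 + ref_temp_c)) * (ref_pressure_kpa / pressure_kpa)

# Warm, low-pressure lab conditions give a factor slightly above 1
k_tp = air_density_correction(22.0, 100.0)
```

The corrected reading is then the raw electrometer reading multiplied by k_TP, which is exactly where automatically logged temperature and pressure feed into the calibration chain.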

  6. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  7. Y2K - an oil and gas software vendor's perspective

    International Nuclear Information System (INIS)

    Stewart, L.D.

    1998-01-01

    Oil and gas companies have begun to deal with third-party software vendors to increase their efficiency and to concentrate on their core business. When it comes to dealing with year 2000 (Y2K) problems, information technology (IT) departments will have to deal with complex heterogeneous systems and long lists of independent software vendors and suppliers. This paper highlighted some of the unique issues that should be considered when testing any heterogeneous oil and gas system for Y2K. Most petroleum companies and vendors have established Y2K compliance teams with plans involving three major phases: (1) testing and identification of problems, (2) correction and avoidance of problems, and (3) preparation for recovery from unidentified problems.

  8. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work are those that automatically locate and fit the peaks. This fit can be made in several different ways: the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go further and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work, several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba and 152Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; it was also possible to verify that the automatic peak-fitting programs performed as well as, and sometimes even better than, a manual peak-fitting program. (author)
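    As a sketch of the Gaussian-plus-background fit described above (the channel numbers and count rates are invented for illustration, and scipy's generic least-squares fitter stands in for the dedicated spectroscopy codes):

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_plus_linear(x, area, centroid, sigma, slope, intercept):
    """Gaussian peak on a linear background, the simplest model discussed above."""
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
    return gauss + slope * x + intercept

# Synthetic spectrum region: a single peak near channel 661.7 (137Cs-like, hypothetical)
x = np.arange(600, 725, dtype=float)
true_counts = gauss_plus_linear(x, area=5000.0, centroid=661.7, sigma=3.0,
                                slope=-0.1, intercept=120.0)
rng = np.random.default_rng(0)
counts = rng.poisson(true_counts).astype(float)

# Fit starting from rough initial guesses, as an automatic peak finder would supply
popt, _ = curve_fit(gauss_plus_linear, x, counts, p0=[4000.0, 660.0, 2.5, 0.0, 100.0])
area, centroid, sigma = popt[0], popt[1], popt[2]
```

The fitted area, centroid and width are exactly the per-peak quantities the compared programs report; real codes add the small corrections (tails, steps) mentioned in the abstract.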

  9. Development of an automated asbestos counting software based on fluorescence microscopy.

    Science.gov (United States)

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations the automated counts were less accurate, leading us to implement a correction mode for automated counts. While full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.

  10. A concept of software testing for SMART MMIS software

    International Nuclear Information System (INIS)

    Seo, Yong Seok; Seong, Seung Hwan; Park, Keun Ok; Hur, Sub; Kim, Dong Hoon

    2001-01-01

    In order to achieve high quality in the SMART MMIS software, a well-constructed software testing concept is required. This paper establishes the software testing concept to be applied to the SMART MMIS software in terms of testing organization, documentation, procedures, and methods. The testing methods are classified into static source code analysis and dynamic testing, and the dynamic testing methods are discussed from two aspects: white-box and black-box testing. Applying the software testing concept introduced in this paper to the SMART MMIS software will yield high-quality software. In the future, software failure data will be collected through the construction of a SMART MMIS prototyping facility to which the software testing concept of this paper is applied.

  11. [Astigmatic keratotomy with the femtosecond laser: correction of high astigmatisms after keratoplasty].

    Science.gov (United States)

    Kook, D; Bühren, J; Klaproth, O K; Bauch, A S; Derhartunian, V; Kohnen, T

    2011-02-01

    The purpose of this study was to evaluate a novel technique for the correction of postoperative astigmatism after penetrating keratoplasty, using the femtosecond laser to create astigmatic keratotomies (femto-AK), in a retrospective case series. Clinical data of ten eyes of nine patients with high residual astigmatism after penetrating keratoplasty undergoing paired femto-AK using a 60-kHz femtosecond laser (IntraLase™, AMO) were analyzed. A new software algorithm was used to create paired arcuate cuts deep into the donor corneal button with different cut angles. Outcome measures were refraction, uncorrected visual acuity, best corrected visual acuity, topographic data (Orbscan®, Bausch & Lomb, Rochester, NY, USA), and corneal wavefront analysis using Visual Optics Lab (VOL)-Pro 7.14 software (Sarver and Associates). Vector analysis was performed using the Holladay, Cravy and Koch formula. Statistical analysis was performed to detect significant differences between visits using Student's t test. All procedures were performed without any major complications. The mean follow-up was 13 months. The mean patient age was 48.7 years. The preoperative mean uncorrected visual acuity (logMAR) was 1.27, best corrected visual acuity 0.55, mean subjective cylinder -7.4 D, and mean topometric astigmatism 9.3 D. The postoperative mean uncorrected visual acuity (logMAR) was 1.12, best corrected visual acuity 0.47, mean subjective cylinder -4.1 D, and mean topometric astigmatism 6.5 D. Differences in corneal higher-order aberrations showed a high standard deviation and were therefore not statistically significant. Astigmatic keratotomy using the femtosecond laser seems to be a safe and effective tool for the correction of high corneal astigmatism. Due to the biomechanical properties of the cornea and missing empirical data for the novel femto-AK technology, larger numbers of patients are necessary to develop optimal treatment nomograms.

  12. MR-guided PET motion correction in LOR space using generic projection data for image reconstruction with PRESTO

    International Nuclear Information System (INIS)

    Scheins, J.; Ullisch, M.; Tellmann, L.; Weirich, C.; Rota Kops, E.; Herzog, H.; Shah, N.J.

    2013-01-01

    The BrainPET scanner from Siemens, designed as a hybrid MR/PET system for simultaneous acquisition of both modalities, provides high-resolution PET images with an optimum resolution of 3 mm. However, significant head motion often compromises the achievable image quality, e.g. in neuroreceptor studies of the human brain. This limitation can be overcome by tracking the head motion and accurately correcting the measured Lines-of-Response (LORs). For this purpose, we present a novel method which advantageously combines MR-guided motion tracking with the capabilities of the reconstruction software PRESTO (PET Reconstruction Software Toolkit) to convert motion-corrected LORs into highly accurate generic projection data. In this way, the high-resolution PET images achievable with PRESTO can also be obtained in the presence of severe head motion.

  13. Feasibility and performance of novel software to quantify metabolically active volumes and 3D partial volume corrected SUV and metabolic volumetric products of spinal bone marrow metastases on 18F-FDG-PET/CT.

    Science.gov (United States)

    Torigian, Drew A; Lopez, Rosa Fernandez; Alapati, Sridevi; Bodapati, Geetha; Hofheinz, Frank; van den Hoff, Joerg; Saboury, Babak; Alavi, Abass

    2011-01-01

    Our aim was to assess the feasibility and performance of novel semi-automated image analysis software called ROVER to quantify metabolically active volume (MAV), maximum standardized uptake value (SUV(max)), 3D partial volume corrected mean SUV (cSUV(mean)), and 3D partial volume corrected mean metabolic volumetric product (cMVP(mean)) of spinal bone marrow metastases on fluorine-18 fluorodeoxyglucose-positron emission tomography/computerized tomography ((18)F-FDG-PET/CT). We retrospectively studied 16 subjects with 31 spinal metastases on FDG-PET/CT and MRI. Manual and ROVER determinations of lesional MAV and SUV(max), and repeated ROVER measurements of MAV, SUV(max), cSUV(mean) and cMVP(mean) were made. Bland-Altman and correlation analyses were performed to assess reproducibility and agreement. Analyses of repeated ROVER measurements revealed MAV mean difference (D)=-0.03±0.53cc (95% CI (-0.22, 0.16)), lower limit of agreement (LLOA)=-1.07cc, and upper limit of agreement (ULOA)=1.01cc; SUV(max) D=0.00±0.00 with LOAs=0.00; cSUV(mean) D=-0.01±0.39 (95% CI (-0.15, 0.13)), LLOA=-0.76, and ULOA=0.75; cMVP(mean) D=-0.52±4.78cc (95% CI (-2.23, 1.23)), LLOA=-9.89cc, and ULOA=8.86cc. Comparisons between ROVER and manual measurements revealed volume D=-0.39±1.37cc (95% CI (-0.89, 0.11)), LLOA=-3.08cc, and ULOA=2.30cc; SUV(max) D=0.00±0.00 with LOAs=0.00. The mean percent increase in lesional SUV(mean) and MVP(mean) following partial volume correction using ROVER was 84.25±36.00% and 84.45±35.94%, respectively. In conclusion, it is feasible to estimate MAV, SUV(max), cSUV(mean), and cMVP(mean) of spinal bone marrow metastases from (18)F-FDG-PET/CT quickly and easily with good reproducibility via the ROVER software. Partial volume correction is imperative, as uncorrected SUV(mean) and MVP(mean) are significantly underestimated, even for large lesions. This novel approach has great potential for practical, accurate, and precise combined structural-functional PET
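    The partial volume correction itself reduces to dividing the measured mean by a recovery coefficient. The sketch below is a hypothetical illustration, not the ROVER algorithm: the recovery-coefficient value is back-derived from the roughly 84% increase reported above, and the SUV and volume numbers are invented:

```python
def partial_volume_correct(suv_mean, recovery_coefficient):
    """Recover the true mean SUV by dividing out the recovery coefficient (0 < RC <= 1)."""
    if not 0.0 < recovery_coefficient <= 1.0:
        raise ValueError("recovery coefficient must be in (0, 1]")
    return suv_mean / recovery_coefficient

def metabolic_volumetric_product(mav_cc, suv_mean):
    """MVP = metabolically active volume (cc) x mean SUV."""
    return mav_cc * suv_mean

# A mean increase of ~84% corresponds to RC ~= 1/1.84
suv_corrected = partial_volume_correct(3.0, 1.0 / 1.84)
mvp_corrected = metabolic_volumetric_product(5.0, suv_corrected)
```

Because MVP is the product of volume and mean SUV, any underestimation of SUV(mean) propagates directly into MVP(mean), which is why both quantities show the same percent increase after correction.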

  14. Measuring Software Test Verification for Complex Workpieces based on Virtual Gear Measuring Instrument

    Directory of Open Access Journals (Sweden)

    Yin Peili

    2017-08-01

    Full Text Available Validity and correctness test verification of the measuring software has been a thorny issue hindering the development of Gear Measuring Instrument (GMI. The main reason is that the software itself is difficult to separate from the rest of the measurement system for independent evaluation. This paper presents a Virtual Gear Measuring Instrument (VGMI to independently validate the measuring software. The triangular patch model with accurately controlled precision was taken as the virtual workpiece and a universal collision detection model was established. The whole process simulation of workpiece measurement is implemented by VGMI replacing GMI and the measuring software is tested in the proposed virtual environment. Taking involute profile measurement procedure as an example, the validity of the software is evaluated based on the simulation results; meanwhile, experiments using the same measuring software are carried out on the involute master in a GMI. The experiment results indicate a consistency of tooth profile deviation and calibration results, thus verifying the accuracy of gear measuring system which includes the measurement procedures. It is shown that the VGMI presented can be applied in the validation of measuring software, providing a new ideal platform for testing of complex workpiece-measuring software without calibrated artifacts.

  15. A New High-Precision Correction Method of Temperature Distribution in Model Stellar Atmospheres

    Directory of Open Access Journals (Sweden)

    Sapar A.

    2013-06-01

    Full Text Available The main features of the temperature correction methods suggested and used in the modelling of plane-parallel stellar atmospheres are discussed, and the main features of the new method are described. Derivation of the formulae for a version of the Unsöld-Lucy method, used by us in the SMART (Stellar Model Atmospheres and Radiative Transport) software for modelling stellar atmospheres, is presented. The method is based on correcting the model temperature distribution by minimizing the difference of the flux from its accepted constant value and by requiring the absence of a flux gradient, meaning that local source and sink terms of radiation must be equal. The final relative flux constancy obtainable by the method with the SMART code turned out to have a precision of the order of 0.5%. Some rapidly converging iteration steps can be useful before starting the high-precision model correction. Corrections of both the flux value and its gradient, as in the Unsöld-Lucy method, are unavoidably needed to obtain high-precision flux constancy. A new temperature correction method to obtain high-precision flux constancy for plane-parallel LTE model stellar atmospheres is proposed and studied. The non-linear optimization is carried out by least squares, in which the Levenberg-Marquardt correction method and thereafter an additional correction by the Broyden iteration loop were applied. Small finite differences of temperature (δT/T = 10−3) are used in the computations. A single Jacobian step appears to be mostly sufficient to get flux constancy of the order 10−2 %. The dual numbers and their generalization – the dual complex numbers (the duplex numbers) – enable obtaining the derivatives automatically in the nilpotent part of the dual numbers. A version of the SMART software is in the stage of refactoring to dual and duplex numbers, which enables getting rid of the finite differences as an additional source of lowered precision.
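    The dual-number idea mentioned above can be sketched in a few lines: carrying a nilpotent eps part through arithmetic yields exact derivatives with no finite differences. This is a generic illustration, not SMART code:

```python
class Dual:
    """Minimal dual number a + b*eps with eps^2 = 0; the eps part carries the derivative."""
    def __init__(self, real, eps=0.0):
        self.real, self.eps = real, eps
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps^2 = 0
        return Dual(self.real * other.real, self.real * other.eps + self.eps * other.real)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f at x + eps; the eps coefficient of the result is exactly f'(x)."""
    return f(Dual(x, 1.0)).eps

# f(T) = T*T + 3*T  ->  f'(T) = 2*T + 3, so f'(2) = 7
dfdx = derivative(lambda t: t * t + 3 * t, 2.0)
```

A full implementation would also overload subtraction, division and the elementary functions, but the mechanism is the same: derivatives fall out of the nilpotent part with machine precision, exactly the benefit the abstract cites over δT/T finite differences.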

  16. Potku – New analysis software for heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments

  17. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arstila, K., E-mail: kai.arstila@jyu.fi [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Julin, J.; Laitinen, M.I. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T. [Department of Mathematical Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Sajavaara, T. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland)

    2014-07-15

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.
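    The core conversion behind such a ToF–E coincidence setup is the non-relativistic relation E = ½m(L/t)². The flight path and timing values below are hypothetical, not Potku's actual calibration:

```python
AMU_KG = 1.66053906660e-27    # kg per atomic mass unit
J_PER_MEV = 1.602176634e-13   # joules per MeV

def tof_to_energy_mev(mass_amu, flight_path_m, tof_s):
    """Non-relativistic recoil energy from a timing-gate pair: E = m*(L/t)^2 / 2."""
    v = flight_path_m / tof_s
    return 0.5 * mass_amu * AMU_KG * v * v / J_PER_MEV

# Hypothetical geometry: 0.623 m ToF telescope, 35Cl recoil arriving in 100 ns
e_mev = tof_to_energy_mev(35.0, 0.623, 100e-9)
```

Selecting an element in the ToF–E histogram fixes the mass, after which each event's time of flight maps to an energy like this, and the energies are in turn converted to depth profiles.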

  18. Correction

    DEFF Research Database (Denmark)

    Pinkevych, Mykola; Cromer, Deborah; Tolstrup, Martin

    2016-01-01

    [This corrects the article DOI: 10.1371/journal.ppat.1005000.][This corrects the article DOI: 10.1371/journal.ppat.1005740.][This corrects the article DOI: 10.1371/journal.ppat.1005679.]

  19. Software-centric View on OVMS for LBT

    Science.gov (United States)

    Trowitzsch, J.; Borelli, J.; Pott, J.; Kürster, M.

    2012-09-01

    The performance of infrared interferometry (IF) and adaptive optics (AO) strongly depends on the mitigation and correction of telescope vibrations. Therefore, at the Large Binocular Telescope (LBT) the OVMS, the Optical Path Difference and Vibration Monitoring System, is being installed. It is meant to ensure suitable conditions for adaptive optics and interferometry. The vibration information is collected from accelerometers that are distributed over the optical elements of the LBT. The collected vibration measurements are converted into tip-tilt and optical path difference data, which are utilized in the control strategies of the LBT adaptive secondary mirrors and the beam-combining interferometers, LINC-NIRVANA and LBTI. Within the OVMS, the software part is the responsibility of the LINC-NIRVANA team at MPIA Heidelberg. It comprises the software for real-time data acquisition from the accelerometers as well as the related telemetry interface and the vibration monitoring quick-look tools. The basic design ideas, implementation details and special features are explained here.

  20. General guidelines for biomedical software development [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luis Bastiao Silva

    2017-03-01

    Full Text Available Most bioinformatics tools available today were not written by professional software developers, but by people that wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic.

  1. General guidelines for biomedical software development [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Luis Bastiao Silva

    2017-07-01

    Full Text Available Most bioinformatics tools available today were not written by professional software developers, but by people that wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic.

  2. Attenuation correction for the HRRT PET-scanner using transmission scatter correction and total variation regularization.

    Science.gov (United States)

    Keller, Sune H; Svarer, Claus; Sibomana, Merence

    2013-09-01

    In the standard software for the Siemens high-resolution research tomograph (HRRT) positron emission tomography (PET) scanner, the segmentation most commonly used in the μ-map reconstruction for human brain scans is maximum a posteriori for transmission (MAP-TR). Biases in the lower cerebellum and pons in HRRT brain images have been reported. The two main sources of the problem with MAP-TR are poor bone/soft tissue segmentation below the brain and overestimation of bone mass in the skull. We developed the new transmission processing with total variation (TXTV) method, which introduces scatter correction in the μ-map reconstruction and total variation filtering in the transmission processing. Comparing MAP-TR and the new TXTV with gold-standard CT-based attenuation correction, we found that TXTV has less bias than MAP-TR. We also compared images acquired on the HRRT scanner using TXTV to GE Advance scanner images and found high quantitative correspondence. TXTV has been used to reconstruct more than 4000 HRRT scans at seven different sites with no reports of biases. TXTV-based reconstruction is recommended for human brain scans on the HRRT.
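    As a generic 1D illustration of the total-variation filtering idea (a smoothed-TV gradient descent on a toy piecewise-constant signal, not the TXTV implementation):

```python
import numpy as np

def tv_denoise_1d(noisy, lam=0.5, step=0.1, iters=200, eps=1e-6):
    """Gradient descent on 0.5*||x - y||^2 + lam * sum(sqrt(dx^2 + eps)).

    A smoothed 1D stand-in for the total-variation regularization used in TXTV:
    the penalty suppresses noise while tolerating the few genuine jumps.
    """
    x = noisy.copy()
    for _ in range(iters):
        dx = np.diff(x)
        w = dx / np.sqrt(dx * dx + eps)  # derivative of the smoothed |dx|
        # Each difference term couples two samples; assemble the full gradient
        grad_tv = np.concatenate(([-w[0]], w[:-1] - w[1:], [w[-1]]))
        x -= step * ((x - noisy) + lam * grad_tv)
    return x

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, 0.0], 30)   # piecewise-constant "attenuation profile"
noisy = clean + 0.2 * rng.standard_normal(clean.size)
denoised = tv_denoise_1d(noisy)
```

Production TV solvers use dedicated algorithms rather than plain gradient descent, but the regularization behaviour, flat regions flattened while edges survive, is the same.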

  3. Automated Search-Based Robustness Testing for Autonomous Vehicle Software

    Directory of Open Access Journals (Sweden)

    Kevin M. Betts

    2016-01-01

    Full Text Available Autonomous systems must successfully operate in complex time-varying spatial environments even when dealing with system faults that may occur during a mission. Consequently, evaluating the robustness, or ability to operate correctly under unexpected conditions, of autonomous vehicle control software is an increasingly important issue in software testing. New methods to automatically generate test cases for robustness testing of autonomous vehicle control software in closed-loop simulation are needed. Search-based testing techniques were used to automatically generate test cases, consisting of initial conditions and fault sequences, intended to challenge the control software more than test cases generated using current methods. Two different search-based testing methods, genetic algorithms and surrogate-based optimization, were used to generate test cases for a simulated unmanned aerial vehicle attempting to fly through an entryway. The effectiveness of the search-based methods in generating challenging test cases was compared to both a truth reference (full combinatorial testing) and the method most commonly used today (Monte Carlo testing). The search-based testing techniques demonstrated better performance than Monte Carlo testing for both of the test case generation performance metrics: (1) finding the single most challenging test case and (2) finding the set of fifty test cases with the highest mean degree of challenge.
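    A minimal sketch of the genetic-algorithm approach: evolve (initial condition, fault time) pairs toward a maximally challenging case. The challenge function here is an invented stand-in for the closed-loop simulation score, not the paper's actual metric:

```python
import random

def challenge(candidate):
    """Hypothetical 'degree of challenge': how close the simulated vehicle comes
    to failure for a (wind speed, fault time) test case; worst case at (7, 3)."""
    wind, fault_time = candidate
    return -(wind - 7.0) ** 2 - (fault_time - 3.0) ** 2

def genetic_search(pop_size=30, generations=60, seed=42):
    """Elitist GA: keep the best half, breed children by averaging + mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=challenge, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = tuple((x + y) / 2 + rng.gauss(0, 0.3) for x, y in zip(a, b))
            children.append(child)
        pop = parents + children
    return max(pop, key=challenge)

best = genetic_search()
```

In the real setting each fitness evaluation is one closed-loop simulation run, which is precisely why surrogate-based optimization is attractive as the second search method.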

  4. The various correction methods to the high precision aeromagnetic data

    International Nuclear Information System (INIS)

    Xu Guocang; Zhu Lin; Ning Yuanli; Meng Xiangbao; Zhang Hongjian

    2014-01-01

    In airborne geophysical surveying, an outstanding result depends first on the measurement precision of the instrument, the choice of measurement conditions and the reliability of data collection, and then on correct processing of the measured data and rational interpretation of the results. Clearly, geophysical data processing is an important task for the comprehensive interpretation of the measurement results; whether the processing method is correct directly determines the quality of the final results. In recent years, in the course of actual production and scientific research, we have developed a set of personal computer software for processing aeromagnetic and radiometric survey data and have successfully applied it in production. The processing methods and flowcharts for high-precision aeromagnetic data are briefly introduced in this paper. The mathematical techniques of the various correction programs, for the IGRF, flying height and magnetic diurnal variation, are discussed in detail, and their effectiveness is illustrated with an example. (authors)
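    The routine corrections named above chain simply: subtract the IGRF reference field, subtract the base-station diurnal record, and reduce to nominal flight height. All numbers below, including the vertical-gradient value, are hypothetical illustrations:

```python
def correct_total_field(observed_nt, igrf_nt, diurnal_nt, flight_height_m,
                        nominal_height_m, vertical_gradient_nt_per_m=-0.03):
    """Apply the three routine aeromagnetic corrections in sequence:
    remove the IGRF reference field, subtract the diurnal variation recorded
    at a base station, and reduce to nominal flight height assuming a linear
    vertical gradient (the gradient value here is an assumed placeholder)."""
    anomaly = observed_nt - igrf_nt                  # IGRF correction
    anomaly -= diurnal_nt                            # diurnal variation correction
    anomaly -= (flight_height_m - nominal_height_m) * vertical_gradient_nt_per_m
    return anomaly

# Hypothetical reading: 48523.4 nT observed, 48400 nT IGRF, +12.5 nT diurnal,
# flown at 120 m instead of the nominal 100 m
delta_t = correct_total_field(48523.4, 48400.0, 12.5, 120.0, 100.0)
```

Each correction removes one non-geological contribution, so the residual delta_t approximates the anomaly of the underlying sources alone.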

  5. Software Engineering Program: Software Process Improvement Guidebook

    Science.gov (United States)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  6. Fault Detection and Correction for the Solar Dynamics Observatory Attitude Control System

    Science.gov (United States)

    Starin, Scott R.; Vess, Melissa F.; Kenney, Thomas M.; Maldonado, Manuel D.; Morgenstern, Wendy M.

    2007-01-01

    The Solar Dynamics Observatory is an Explorer-class mission that will launch in early 2009. The spacecraft will operate in a geosynchronous orbit, sending data 24 hours a day to a dedicated ground station in White Sands, New Mexico. It will carry a suite of instruments designed to observe the Sun in multiple wavelengths at unprecedented resolution. The Atmospheric Imaging Assembly includes four telescopes with focal plane CCDs that can image the full solar disk in four different visible wavelengths. The Extreme-ultraviolet Variability Experiment will collect time-correlated data on the activity of the Sun's corona. The Helioseismic and Magnetic Imager will enable study of pressure waves moving through the body of the Sun. The attitude control system on Solar Dynamics Observatory is responsible for four main phases of activity. The physical safety of the spacecraft after separation must be guaranteed. Fine attitude determination and control must be sufficient for instrument calibration maneuvers. The mission science mode requires 2-arcsecond control according to error signals provided by guide telescopes on the Atmospheric Imaging Assembly, one of the three instruments to be carried. Lastly, accurate execution of linear and angular momentum changes to the spacecraft must be provided for momentum management and orbit maintenance. In this paper, single-fault tolerant fault detection and correction of the Solar Dynamics Observatory attitude control system is described. The attitude control hardware suite for the mission is catalogued, with special attention to redundancy at the hardware level. Four reaction wheels are used where any three are satisfactory. Four pairs of redundant thrusters are employed for orbit change maneuvers and momentum management. Three two-axis gyroscopes provide full redundancy for rate sensing. A digital Sun sensor and two autonomous star trackers provide two-out-of-three redundancy for fine attitude determination. 
The use of software to maximize

  7. Intellectual Property Protection of Software – At the Crossroads of Software Patents and Open Source Software

    OpenAIRE

    Tantarimäki, Maria

    2018-01-01

    The thesis considers the intellectual property protection of software in Europe and in the US, which is an increasingly important subject as the world globalizes and digitalizes. The special nature of software has challenged intellectual property rights. The current protection of software is based on copyright protection, but in this thesis two other options are considered: software patents and open source software. Software patents provide strong protection for software whereas the pur...

  8. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard software should have. In a software project, quality is the key factor in the success or decline of a software-related organization. Much research has been done regarding software quality. Software-related organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  9. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are that: The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful, through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of systems constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)
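    The report's suggestion that tools could flag typographical errors in system constants "by reporting outliers to patterns" might look like the following hypothetical check (it is not part of STEM or SPADE; the pattern chosen, digit count, is one of many possible):

```python
import re
from collections import Counter

def flag_constant_outliers(source_lines):
    """Flag numeric constants whose textual pattern is rare among a set
    of similar definitions, a typical sign of a transcription error."""
    # Collect constants defined as NAME = <number>.
    consts = []
    for n, line in enumerate(source_lines, start=1):
        m = re.match(r"\s*(\w+)\s*=\s*([0-9]+\.?[0-9]*)\s*$", line)
        if m:
            consts.append((n, m.group(1), m.group(2)))

    # Pattern = digit count of the integer part; a pattern occurring
    # exactly once among several constants is reported as an outlier.
    def pattern(value):
        return len(value.split(".")[0])

    counts = Counter(pattern(v) for _, _, v in consts)
    return [(n, name, v) for n, name, v in consts
            if counts[pattern(v)] == 1 and len(consts) > 2]
```

    On a block of timeout constants such as `T1 = 1000`, `T2 = 2000`, `T3 = 300`, the third would be reported, since its digit count breaks the pattern of its neighbours.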

  10. Distributed caching mechanism for various MPE software services

    CERN Document Server

    Svec, Andrej

    2017-01-01

    The MPE Software Section provides multiple software services to facilitate the testing and the operation of the CERN Accelerator complex. Continuous growth in the number of users and the amount of processed data result in the requirement of high scalability. Our current priority is to move towards a distributed and properly load balanced set of services based on containers. The aim of this project is to implement the generic caching mechanism applicable to our services and chosen architecture. The project will at first require research about the different aspects of distributed caching (persistence, no gc-caching, cache consistency etc.) and the available technologies followed by the implementation of the chosen solution. In order to validate the correctness and performance of the implementation in the last phase of the project it will be required to implement a monitoring layer and integrate it with the current ELK stack.

  11. Operability test procedure for TRUSAF assayer software upgrade

    International Nuclear Information System (INIS)

    Cejka, C.C.

    1995-01-01

    This OTP is to be used to ensure the operability of the Transuranic Waste Assay System (TRUWAS). The system was upgraded and requires a retest to assure satisfactory operation. The upgrade consists of an AST 486 computer to replace the IBM-PC/XT, and a software upgrade (CNEUT). The software calculations are performed in the same manner as in the previous system (NEUT), however, the new software is written in C Assembly Language. CNEUT is easier to use and far more powerful than the previous program. The TRUWAS is used to verify the TRU content of waste packages sent for storage in the Transuranic Storage and Assay Facility (TRUSAF). The TRUSAF is part of Westinghouse Hanford's certification program for waste to be shipped to the Waste Isolation Pilot Plant (WIPP) in New Mexico. The Transuranic Waste Assayer uses a combination passive-active neutron interrogation system to determine the TRU content of 55-gallon waste drums. The system consists of a shielded assay chamber; Deuterium-Tritium neutron generator; Helium-3 proportional counters; drum handling system; electronics including preamplifier, amplifier, and discriminator for each of the counter packages; and an AST 486 computer/printer system for data acquisition and analysis. The system can detect down to TRU levels of 10 nCi/g in the waste matrix. The equipment to be tested is: Assay Chamber Door Drum Turntable and Automatic Loading Platform Interlocks Assayer Software; and IBM computer/printer software. The objective of the test is to verify that the system is operational with the AST 486 computer, the software used in the new computer system correctly calculates TRU levels, and the new computer system is capable of storing and retrieving data

  12. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  13. Customer Interaction in Software Development: A Comparison of Software Methodologies Deployed in Namibian Software Firms

    CSIR Research Space (South Africa)

    Iyawa, GE

    2016-01-01

    Full Text Available within the Namibian context. An implication for software project managers and software developers is that customer interaction should be properly managed to ensure that the software methodologies for improving software development processes...

  14. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM, where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. The Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach of obtaining software reliability value is proposed in this paper.
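    The NHPP assumption behind such SRGMs is often illustrated with the Goel-Okumoto model, whose mean value function m(t) = a(1 − e^(−bt)) gives the expected number of defects found by test time t; a − m(t) is then the expected number remaining. A minimal sketch (parameter values are illustrative; the paper's Bayesian estimation with test cases as a covariate is not reproduced here):

```python
import math

def go_mean_value(a, b, t):
    """Goel-Okumoto mean value function: expected cumulative number of
    defects detected by test time t, where a is the total expected
    defect content and b is the per-defect detection rate."""
    return a * (1.0 - math.exp(-b * t))

def remaining_defects(a, b, t):
    """Expected number of defects still latent after testing up to t."""
    return a - go_mean_value(a, b, t)
```

    With a = 100 and b = 0.1, no defects are expected at t = 0 and the remaining-defect estimate decays toward zero as testing time grows.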

  15. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Test Plan

    Energy Technology Data Exchange (ETDEWEB)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product used in conjunction with the supercritical fluid extraction Fourier transform infrared spectrometer (SFE-FTIR) system. This product converts the raw spectral data to useful area numbers. SigmaPlot will be used in conjunction with procedure ZA-565-301, ''Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection.'' This test plan will be performed in conjunction with or prior to HNF-6936, ''HA-53 Supercritical Fluid Extraction System Acceptance Test Plan'', to perform analyses for water. The test will ensure that the software can be installed properly and will manipulate the analytical data correctly.

  16. SU-E-J-80: A Comparative Analysis of MIM and Pinnacle Software for Adaptive Planning

    Energy Technology Data Exchange (ETDEWEB)

    Stanford, J; Duggar, W; Morris, B; Yang, C [University of Mississippi Med. Center, Jackson, MS (United States)

    2015-06-15

    Purpose: IMRT treatment is often administered with image guidance and small PTV margins. Changes in body habitus, such as weight loss and tumor response, during the course of treatment can be significant, warranting re-simulation and re-planning. Adaptive planning is challenging and places a significant burden on staff, so some commercial vendors now offer adaptive planning software to streamline the process of re-planning and dose accumulation between different CT data sets. The purpose of this abstract is to compare the adaptive planning tools of Pinnacle version 9.8 and MIM 6.4. Methods: Head and neck cases of previously treated patients who experienced anatomical changes during the course of their treatment were chosen for evaluation. The new CT data set from the re-simulation was imported into Pinnacle and MIM. The dynamic planning tool in Pinnacle was used to calculate the old plan with fixed MU settings on the new CT data. In MIM, the old CT was registered to the new data set, followed by a dose transformation to the new CT. The dose distributions to the PTV and critical structures from each software package were analyzed and compared. Results: A 9% difference was observed between the global maximum doses reported by the two software packages. Mean doses to organs at risk and PTVs were within 6%; however, Pinnacle showed a greater change in PTV coverage. Conclusion: MIM adaptive planning corrects for geometrical changes without considering the effect of radiological path length on dose distribution, whereas Pinnacle corrects for both geometric and radiological effects on the dose distribution. Pinnacle therefore gives a better estimate of the dosimetric impact of anatomical changes.

  17. Space Flight Software Development Software for Intelligent System Health Management

    Science.gov (United States)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  18. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  19. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  20. Quantification frameworks and their application for evaluating the software quality factor using quality characteristic value

    International Nuclear Information System (INIS)

    Kim, C.; Chung, C.H.; Won-Ahn, K.

    2004-01-01

    Many safety-related problems frequently occur because digital instrumentation and control systems are widely used and are expanding into many applications in nuclear power plants. There is, however, no generally accepted way to estimate software quality appropriately. Thus, the Quality Characteristic Value, a software quality factor evaluated through each phase of the software life cycle, is suggested in this paper. The Quality Characteristic Value is obtained by the following procedure: 1) scoring the quality characteristic factors (especially correctness, traceability, completeness, and understandability) from Software Verification and Validation results, 2) deriving the diamond-shaped graph by setting the value of each factor on its own axis and connecting the points, and lastly 3) measuring the area of the graph to obtain the Quality Characteristic Value. In this paper, this methodology is applied to a plant control system. In addition, the series of quantification frameworks exhibits some good characteristics from the viewpoint of the software quality factor. More than anything else, it is believed that the introduced framework may be applicable to regulatory guides and software approval procedures, due to its soundness and simple characteristics. (authors)
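    With the four factor scores placed on orthogonal axes, the area of the resulting diamond follows from the shoelace formula and reduces to ½(s1·s2 + s2·s3 + s3·s4 + s4·s1). A sketch of that computation (the score scale, e.g. 0 to 1, is an assumption not stated in the abstract):

```python
def quality_characteristic_value(correctness, traceability,
                                 completeness, understandability):
    """Area of the quadrilateral whose vertices lie at the four factor
    scores placed on orthogonal axes (the 'diamond' graph)."""
    s = [correctness, traceability, completeness, understandability]
    # Adjacent axes are 90 degrees apart, so the triangle between two
    # neighbouring vertices has area s_i * s_{i+1} / 2; sum all four.
    return 0.5 * sum(s[i] * s[(i + 1) % 4] for i in range(4))
```

    For perfect scores of 1 on all four axes, the diamond has vertices (1,0), (0,1), (-1,0), (0,-1) and the value is 2.0.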

  1. Making of attenuation-correcting computation table for RIs and emitted gamma ray table using MS-Excel

    International Nuclear Information System (INIS)

    Miura, Shigeyuki; Takahashi, Mitsuyuki; Sato, Isamu

    1995-01-01

    At the technical workshop of the National Institute for Fusion Science last year, we reported on making an attenuation-correction computation table for RIs using the software Lotus 1-2-3 on MS-DOS. It was decided to port this table to Windows and, further, to add some functions to it. Excel 5.0 was chosen as the software, since Excel has become the mainstream spreadsheet under Windows. It was also decided to newly create a gamma-ray data table linked to the radioactivity data in the RI attenuation-correction computation table. The first task was converting the RI attenuation-correction computation table, made as a Lotus 1-2-3 file, to an Excel 5.0 file under Windows; this is very simple. As a result of the file conversion, it was found that the data file became compact. The next task was adding functions to the table. The function added this time judges whether RIs are those stipulated in the laws or not, based on the radioactivity values calculated by the attenuation correction. The concrete method of adding this function is explained. The gamma-ray data table for the respective nuclides was made, and the present state of radiation databases was investigated. (K.I.)
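    The attenuation (decay) correction underlying such a table is the ordinary decay law A = A0·e^(−λt) with λ = ln 2 / T½, and the legal judgement then compares the decayed activity with a threshold. A spreadsheet-independent sketch (the exemption-limit parameter is illustrative, since the actual limit depends on the nuclide and the applicable regulations):

```python
import math

def decayed_activity(a0_bq, half_life_days, elapsed_days):
    """Radioactivity remaining after the elapsed time (decay law)."""
    lam = math.log(2) / half_life_days  # decay constant, 1/days
    return a0_bq * math.exp(-lam * elapsed_days)

def is_regulated(a0_bq, half_life_days, elapsed_days, exemption_limit_bq):
    """Judge whether the decayed activity still exceeds a regulatory
    threshold. The limit value here is a placeholder; real limits are
    set per nuclide by the applicable laws."""
    return decayed_activity(a0_bq, half_life_days, elapsed_days) > exemption_limit_bq
```

    In a spreadsheet this is a single formula per cell, e.g. `=A0*EXP(-LN(2)/HalfLife*Elapsed)`, with a lookup against the nuclide's legal limit.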

  2. On the Update Problems for Software Defined Networks

    Directory of Open Access Journals (Sweden)

    V. A. Zakharov

    2014-01-01

    Full Text Available The designing of network update algorithms is urgent for the development of SDN control software. A particular case of Network Update Problem is that of restoring seamlessly a given network configuration after some packet forwarding rules have been disabled (say, at the expiry of their time-outs. We study this problem in the framework of a formal model of SDN, develop correct and safe network recovering algorithms, and show that in general case there is no way to restore network configuration seamlessly without referring to priorities of packet forwarding rules.

  3. Software cost/resource modeling: Software quality tradeoff measurement

    Science.gov (United States)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  4. Calculation Software versus Illustration Software for Teaching Statistics

    DEFF Research Database (Denmark)

    Mortensen, Peter Stendahl; Boyle, Robin G.

    1999-01-01

    As personal computers have become more and more powerful, so have the software packages available to us for teaching statistics. This paper investigates what software packages are currently being used by progressive statistics instructors at university level, examines some of the deficiencies...... of such software, and indicates features that statistics instructors wish to have incorporated in software in the future. The basis of the paper is a survey of participants at ICOTS-5 (the Fifth International Conference on Teaching Statistics). These survey results, combined with the software based papers...

  5. Software system safety

    Science.gov (United States)

    Uber, James G.

    1988-01-01

    Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.

  6. End-to-end Information Flow Security Model for Software-Defined Networks

    Directory of Open Access Journals (Sweden)

    D. Ju. Chaly

    2015-01-01

    Full Text Available Software-defined networks (SDN) are a novel paradigm of networking which has become an enabler technology for many modern applications such as network virtualization, policy-based access control, and many others. Software can provide flexibility and fast-paced innovation in networking; however, it has a complex nature. In this connection there is an increasing need for means of assuring its correctness and security. Abstract models for SDN can tackle these challenges. This paper addresses confidentiality and some integrity properties of SDNs. These are critical properties for multi-tenant SDN environments, since the network management software must ensure that no confidential data of one tenant is leaked to other tenants in spite of using the same physical infrastructure. We define a notion of end-to-end security in the context of software-defined networks and propose a semantic model where reasoning is possible about confidentiality, and we can check that confidential information flows do not interfere with non-confidential ones. We show that the model can be extended in order to reason about networks with secure and insecure links, which can arise, for example, in wireless environments. The article is published in the authors' wording.

  7. [Confirming the Utility of RAISUS Antifungal Susceptibility Testing by New-Software].

    Science.gov (United States)

    Ono, Tomoko; Suematsu, Hiroyuki; Sawamura, Haruki; Yamagishi, Yuka; Mikamo, Hiroshige

    2017-08-15

    Clinical and Laboratory Standards Institute (CLSI) methods for susceptibility testing of yeasts are used in Japan. However, the methods have some disadvantages: 1) readings at 24 and 48 h, 2) an unclear scale, approximately 50% inhibition, for determining MICs, and 3) handling of trailing growth and paradoxical effects. These make it difficult to test the susceptibility of yeasts. The old software of RAISUS, Ver. 6.0 series, resolved problems 1) and 2) but did not resolve problem 3). Recently, the new software of RAISUS, Ver. 7.0 series, resolved problem 3). We confirmed whether the new software settled all of these issues. Eighty-four Candida isolates from Aichi Medical University were used in this study. We compared the MICs obtained using the RAISUS antifungal susceptibility testing of yeasts, RSMY1, with those obtained using ASTY. The concordance rates (± four-fold of MICs) between the MICs obtained using ASTY and RSMY1 with the new software were more than 90%, except for miconazole (MCZ). The rate for MCZ was low, but MICs obtained using CLSI methods and Yeast-like Fungus DP 'EIKEN' methods (E-DP) were equivalent to the MICs of RSMY1 using the new software. The frequency of skip effects on RSMY1 using the new software markedly decreased relative to RSMY1 using the old software. In cases showing trailing growth, the new software of RAISUS made it possible to choose the correct MICs and to display a trailing-growth sign on the result screen. The new software of RAISUS enhances its usability and the accuracy of MICs. Using an automatic instrument to determine MICs is useful for obtaining objective results easily.

  8. Applying the metro map to software development management

    Science.gov (United States)

    Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción

    2010-01-01

    This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. Metromap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metaphor with a metro map along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data of a real project and the results of testing the tool with the aforementioned data and users attempting several information retrieval tasks. The conclusion shows the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.

  9. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Full Text Available Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
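    Motion-artifact correction of the kind SIMA performs can be illustrated, in its simplest rigid-translation form, by locating the peak of an FFT-based cross-correlation between each frame and a reference image. This is a generic sketch of the idea, not SIMA's actual algorithm or API:

```python
import numpy as np

def estimate_shift(reference, frame):
    """Return the (dy, dx) circular shift that re-aligns `frame` to
    `reference`, found at the peak of their FFT cross-correlation."""
    f_ref = np.fft.fft2(reference)
    f_frm = np.fft.fft2(frame)
    xcorr = np.fft.ifft2(f_ref * np.conj(f_frm)).real
    dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Wrap shifts into the signed range [-N/2, N/2).
    if dy > reference.shape[0] // 2:
        dy -= reference.shape[0]
    if dx > reference.shape[1] // 2:
        dx -= reference.shape[1]
    return dy, dx
```

    Applying `np.roll(frame, estimate_shift(reference, frame), axis=(0, 1))` then undoes the estimated motion; sub-pixel and line-by-line (scanning) corrections require more elaborate models.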

  10. Potential Errors and Test Assessment in Software Product Line Engineering

    Directory of Open Access Journals (Sweden)

    Hartmut Lackner

    2015-04-01

    Full Text Available Software product lines (SPL) are a method for the development of variant-rich software systems. Compared to non-variable systems, testing SPLs is extensive due to the increasing number of possible products. Different approaches exist for testing SPLs, but there is little research on assessing the quality of these tests by means of error detection capability. Such test assessment is based on error injection into correct versions of the system under test. However, to our knowledge, potential errors in SPL engineering have never been systematically identified before. This article presents an overview of existing paradigms for specifying software product lines and the errors that can occur during the respective specification processes. For the assessment of test quality, we leverage mutation testing techniques in SPL engineering and implement the identified errors as mutation operators. This allows us to run existing tests against defective products for the purpose of test assessment. From the results, we draw conclusions about the error-proneness of the surveyed SPL design paradigms and how the quality of SPL tests can be improved.
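    The core loop of mutation-based test assessment, as leveraged above, can be sketched simply: inject a defect, run the tests, and count the mutant as killed if any test distinguishes it from the original. The feature-constraint example and all names below are illustrative, not taken from the article:

```python
def mutation_score(original_fn, mutants, test_cases):
    """Fraction of mutants killed by the test suite.

    original_fn: the correct implementation.
    mutants:     list of defective variants of original_fn.
    test_cases:  list of inputs; a mutant is killed if its output
                 differs from the original's on any input.
    """
    killed = 0
    for mutant in mutants:
        if any(mutant(x) != original_fn(x) for x in test_cases):
            killed += 1
    return killed / len(mutants) if mutants else 1.0

# Illustrative product-line constraint: feature B requires feature A.
def valid_config(features):
    return ("B" not in features) or ("A" in features)

# Mutants emulating specification errors such as a dropped or altered
# constraint (two of the error kinds a mutation operator might inject).
mutants = [
    lambda features: True,                  # constraint dropped
    lambda features: "A" not in features,   # constraint corrupted
]
```

    A test suite covering all four feature combinations kills both mutants (score 1.0), while a suite containing only the empty configuration kills neither (score 0.0), which is exactly the kind of weakness such assessment exposes.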

  11. The influence of software filtering in digital mammography image quality

    Science.gov (United States)

    Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.

    2009-05-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving an image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
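    A sharpening filter of the kind examined here is, at its core, a small convolution kernel; the classic Laplacian-based 3×3 "sharpen" kernel below illustrates the resolution/noise trade-off, since it preserves flat regions while amplifying both edges and noise. (This is a generic kernel, not necessarily Photoshop's exact implementation.)

```python
import numpy as np

# Classic Laplacian-based sharpening kernel: identity plus edge emphasis.
# Its entries sum to 1, so flat regions pass through unchanged.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def apply_kernel(image, kernel):
    """Same-size 2-D filtering with edge-replicated padding.
    (The kernel is symmetric, so correlation equals convolution.)"""
    kh, kw = kernel.shape
    pad = np.pad(image, ((kh // 2,), (kw // 2,)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(pad[i:i + kh, j:j + kw] * kernel)
    return out
```

    For uncorrelated pixel noise, the output noise variance scales with the sum of squared kernel entries (here 29), which is why sharpening raises the noise floor even as it steepens edges and improves the measured MTF.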

  12. The influence of software filtering in digital mammography image quality

    International Nuclear Information System (INIS)

    Michail, C; Spyropoulou, V; Valais, I; Panayiotakis, G; Kalyvas, N; Fountos, G; Kandarakis, I; Dimitropoulos, N

    2009-01-01

    Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving an image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.

  13. The results of bone deformity correction using a spider frame with web-based software for lower extremity long bone deformities.

    Science.gov (United States)

    Tekin, Ali Çağrı; Çabuk, Haluk; Dedeoğlu, Süleyman Semih; Saygılı, Mehmet Selçuk; Adaş, Müjdat; Esenyel, Cem Zeki; Büyükkurt, Cem Dinçay; Tonbul, Murat

    2016-03-22

    To present the functional and radiological results and evaluate the effectiveness of a computer-assisted external fixator (spider frame) in patients with lower extremity shortening and deformity. The study comprised 17 patients (14 male, 3 female) who were treated for lower extremity long bone deformity and shortening between 2012 and 2015 using a spider frame. The procedure's level of difficulty was determined preoperatively using the Paley scale. Postoperatively, the results for the patients who underwent tibial operations were evaluated using the Paley criteria modified by ASAMI, and the results for the patients who underwent femoral operations were evaluated according to the Paley scoring system. The evaluations were made by calculating the External Fixator and Distraction Indexes. The mean age of the patients was 24.58 years (range, 5-51 years). The spider frame was applied to the femur in 10 patients and to the tibia in seven. The mean follow-up period was 15 months (range, 6-31 months) from the operation day, and the mean amount of lengthening was 3.0 cm (range, 1-6 cm). The mean duration of fixator application was 202.7 days (range, 104-300 days). The mean External Fixator Index was 98 days/cm (range, 42-265 days/cm). The mean Distraction Index was 10.49 days/cm (range, 10-14 days/cm). The computer-assisted external fixator system (spider frame) achieves single-stage correction in cases of combined deformity and shortening. The system can be applied easily, and because of its high-tech software, it offers the possibility of postoperative treatment of the deformity.
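    The two indexes used for evaluation are simple ratios of treatment duration to length gained; a sketch of the arithmetic (function names are illustrative):

```python
def external_fixator_index(fixator_days, lengthening_cm):
    """Days the frame was worn per centimetre of length gained."""
    return fixator_days / lengthening_cm

def distraction_index(distraction_days, lengthening_cm):
    """Days of active distraction per centimetre of length gained."""
    return distraction_days / lengthening_cm
```

    For example, 300 days in the frame for a 3 cm lengthening gives an External Fixator Index of 100 days/cm. Note that the study's mean index (98 days/cm) is the mean of per-patient ratios, not the ratio of the mean duration to the mean lengthening.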

  14. Avoidable Software Procurements

    Science.gov (United States)

    2012-09-01

    software license, software usage, ELA, Software as a Service, SaaS, Software Asset... PaaS Platform as a Service; SaaS Software as a Service; SAM Software Asset Management; SMS System Management Server; SEWP Solutions for Enterprise Wide... With the delivery of full Cloud Services, we will see the transition of the Cloud Computing service model from IaaS to SaaS, or Software as a Service.

  15. Software-assisted small bowel motility analysis using free-breathing MRI: feasibility study.

    Science.gov (United States)

    Bickelhaupt, Sebastian; Froehlich, Johannes M; Cattin, Roger; Raible, Stephan; Bouquet, Hanspeter; Bill, Urs; Patak, Michael A

    2014-01-01

    To validate a software prototype allowing for small bowel motility analysis in free breathing by comparing it to manual measurements. In all, 25 patients (15 male, 10 female; mean age 39 years) were included in this Institutional Review Board-approved, retrospective study. Magnetic resonance imaging (MRI) was performed on a 1.5T system after standardized preparation, acquiring motility sequences in free breathing over 69-84 seconds. Small bowel motility was analyzed manually and with the software. Functional parameters, measurement time, and reproducibility were compared using the coefficient of variance and paired Student's t-test. Correlation was analyzed using Pearson's correlation coefficient and linear regression. The 25 segments were analyzed twice, both by hand and using the software with automatic breathing correction. All assessed parameters correlated significantly between the methods. Coefficients of variance were lower for the software (3.90%, standard deviation [SD] ± 5.69) than for the manual examinations (9.77%, SD ± 11.08). The time needed was significantly less using the software (4.52 minutes, SD ± 1.58) compared to manual measurement (17.48 minutes, SD ± 1.75). The software provides reliable and faster small bowel motility measurements in free-breathing MRI compared to manual analyses. The new technique allows for analysis of prolonged sequences acquired in free breathing, improving the informative value of the examinations by amplifying the evaluable data. Copyright © 2013 Wiley Periodicals, Inc.

  16. Evaluation of intensity drift correction strategies using MetaboDrift, a normalization tool for multi-batch metabolomics data.

    Science.gov (United States)

    Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R

    2017-11-10

    In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift, which often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line-driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation, such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography-mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or QC-sample-independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards. QC sample-based intensity drift correction significantly improved correlation with IS-normalized data and resulted in the detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve-fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies.
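The QC-based strategy described here can be illustrated in miniature: fit a smooth curve to QC-sample intensities as a function of injection order, then rescale every sample by the fitted trend so the QC signal becomes flat. MetaboDrift itself runs in Excel; the sketch below, including the function name, the polynomial fit, and the simulated drift, is purely illustrative:

```python
import numpy as np

def qc_drift_correct(order, intensity, qc_mask, degree=2):
    """Normalize intensities by a polynomial drift curve fitted to QC samples.

    order     -- injection order of every sample
    intensity -- raw peak intensity of one metabolite
    qc_mask   -- boolean array marking the QC injections
    """
    coeffs = np.polyfit(order[qc_mask], intensity[qc_mask], deg=degree)
    trend = np.polyval(coeffs, order)        # predicted drift at each injection
    return intensity * np.median(intensity[qc_mask]) / trend

rng = np.random.default_rng(0)
order = np.arange(60.0)
drift = 1.0 + 0.01 * order                   # simulated 60% intensity rise
intensity = 1000.0 * drift * rng.normal(1.0, 0.02, 60)
qc_mask = (np.arange(60) % 10 == 0)          # a QC injection every 10 runs
corrected = qc_drift_correct(order, intensity, qc_mask)
```

After correction, the intensity should no longer trend with injection order; the choice between polynomial, LOESS, or spline fits is exactly the curve-fitting trade-off the abstract discusses.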

  17. Software engineering

    CERN Document Server

    Sommerville, Ian

    2010-01-01

    The ninth edition of Software Engineering presents a broad perspective of software engineering, focusing on the processes and techniques fundamental to the creation of reliable software systems. Increased coverage of agile methods and software reuse, along with coverage of 'traditional' plan-driven software engineering, gives readers the most up-to-date view of the field currently available. Practical case studies, a full set of easy-to-access supplements, and extensive web resources make teaching the course easier than ever.

  18. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz , Teresa

    2014-01-01

    4 pages. A Spanish version is available: "Software libre, software de código abierto, licencias", which proposes a procedure for the distribution of research software and data. The main goal of this document is to help the research community to understand the basic concepts of software distribution: Free software, Open source software, licenses. This document also includes a procedure for research software and data dissemination.

  19. Self-corrective T-loop design for differential space closure.

    Science.gov (United States)

    Viecilli, Rodrigo F

    2006-01-01

    The current approach to measuring T-loop force systems in patients requiring differential anchorage does not consider active unit angulations and steps during space closure. The angulations and steps during movement introduced by rotation can considerably modify the force system acting on the teeth. In this study, geometric modifications were determined during controlled tipping of the 6 anterior teeth, where there was no movement of the posterior teeth, thus configuring a type A anchorage situation. An optimal beta-titanium alloy 0.017 x 0.025-in T-loop spring was designed by using a simulation performed with LOOP software (dHAL Orthodontic Software, Athens, Greece) to allow compensation for anterior unit-position effect on the final force system. The force systems produced by this T-loop spring with and without geometric correction of the brackets have significant differences that should be considered in the segmented arch approach to space closure. The effects of steps, angles, and vertical forces were combined to produce an ideal T-loop design that would provide a more determinate force system. The effects and force systems are estimates based on simplified locations of the centers of resistance, assuming relatively constant behavior of the centers of rotation. These simplifications might differ slightly from what happens in vivo. The finite element method or an accurate spring tester capable of reproducing the geometric corrections should be used to ensure a precise force system.

  20. The D2G2 project: a new software tool for nuclear engineering design in Canada

    International Nuclear Information System (INIS)

    Rheaume, P.; Lefebvre, J.F.; Roy, R.; Koclas, J.

    2004-01-01

    Nowadays, high quality neutronic simulation codes are readily available. The open source software suite DRAGON/DONJON is a good example: it is free, it has proven its quality and correctness over the years, and it is still developed and maintained at Ecole Polytechnique de Montreal. However, most simulation codes have the following weaknesses: limited usability, poor maintainability, no internal data standardization, and poor portability. The D2G2 project is a software development initiative which aims to create an upper-layer software tool that addresses these weaknesses of classic simulation codes. This paper presents D2G2Client's and D2G2Server's principal capabilities, how they interact and the libraries they use. (author)

  1. Perbandingan Metode Depth of Field pada Lensa Kamera Fotografi dengan Efek Lensa pada Software Animasi

    Directory of Open Access Journals (Sweden)

    Ahmad Faisal Choiril Anam Fathoni

    2013-04-01

    Full Text Available Knowledge of photography is of fundamental importance for the understanding of digital cinematography. Good photographic work is a blend of knowledge and correctly applied photographic skill. By learning photography properly, an animator or digital art worker can also apply standard cinematography. By knowing the correspondence between photography and its "photographic" counterpart in 3D animation, animators will find it easier to establish a digital aesthetic standard with the help of the software. This paper discusses a comparison of the use of photographic camera lenses with the camera parameters found in the animation software 3D Studio Max. The final form comprises the camera pictures and the parameters of the lenses used, together with the results of rendering images with the related software parameters.
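The lens comparison in this record rests on standard depth-of-field optics, which can be checked numerically: the near and far limits of acceptable sharpness follow from focal length, f-number, focus distance, and circle of confusion. A sketch of the textbook thin-lens formulas (all values are illustrative; this is not code from the paper):

```python
def depth_of_field(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Near/far limits of acceptable sharpness (thin-lens approximation)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = (subject_mm * (hyperfocal - focal_mm)
            / (hyperfocal + subject_mm - 2 * focal_mm))
    if subject_mm >= hyperfocal:
        far = float("inf")        # everything beyond the near limit is sharp
    else:
        far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return near, far

# 50 mm lens at f/2.8 focused at 2 m, full-frame circle of confusion 0.03 mm
near, far = depth_of_field(50, 2.8, 2000)
```

A virtual camera in 3D Studio Max that matches these parameters should reproduce the same in-focus band, which is the basis of the comparison the paper makes.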

  2. SpcAudace: Spectroscopic processing and analysis package of Audela software

    Science.gov (United States)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines perform all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: PDF and PNG plots or annotated time-series plots. Astrophysical quantities can be derived from individual spectra or from large numbers of spectra with advanced functions: from line profile characteristics to equivalent width and periodograms. More than 300 documented functions are available and can be used in Tcl scripts for automation. SpcAudace is based on the Audela open source software.

  3. Unix Philosophy and the Real World: Control Software for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Neil Thomas Dantam

    2016-03-01

    Full Text Available Robot software combines the challenges of general purpose and real-time software, requiring complex logic and bounded resource use. Physical safety, particularly for dynamic systems such as humanoid robots, depends on correct software. General purpose computation has converged on unix-like operating systems -- standardized as POSIX, the Portable Operating System Interface -- for devices from cellular phones to supercomputers. The modular, multi-process design typical of POSIX applications is effective for building complex and reliable software. Absent from POSIX, however, is an interprocess communication mechanism that prioritizes newer data as typically desired for control of physical systems. We address this need in the Ach communication library, which provides suitable semantics and performance for real-time robot control. Although initially designed for humanoid robots, Ach has broader applicability to complex mechatronic devices -- humanoid and otherwise -- that require real-time coupling of sensors, control, planning, and actuation. The initial user space implementation of Ach was limited in its ability to receive data from multiple sources. We remove this limitation by implementing Ach as a Linux kernel module, enabling Ach's high-performance and latest-message-favored semantics within conventional POSIX communication pipelines. We discuss how these POSIX interfaces and design principles apply to robot software, and we present a case study using the Ach kernel module for communication on the Baxter robot.
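The "latest-message-favored" semantics that distinguishes Ach from POSIX message queues can be illustrated in miniature: a bounded channel where a writer never blocks and a slow reader always sees the newest sample, discarding stale history. This is a simplified, single-process sketch and not the Ach C API:

```python
from collections import deque
from threading import Lock

class LatestFavoredChannel:
    """Bounded channel: writers never block, readers get the newest data."""

    def __init__(self, depth=4):
        self._buf = deque(maxlen=depth)   # oldest entries fall off automatically
        self._lock = Lock()

    def put(self, msg):
        with self._lock:
            self._buf.append(msg)         # overwrites the oldest when full

    def get_latest(self):
        with self._lock:
            if not self._buf:
                return None
            msg = self._buf[-1]           # newest sample wins
            self._buf.clear()             # stale history is discarded
            return msg

chan = LatestFavoredChannel(depth=4)
for t in range(100):                      # a fast sensor outpacing the reader
    chan.put({"t": t, "angle": 0.01 * t})
print(chan.get_latest()["t"])  # 99 -- the reader sees only the newest sample
```

A POSIX FIFO or message queue would instead deliver sample 0 first, forcing the controller to drain 99 stale messages before acting on current state; that is the gap Ach fills.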

  4. Impact of Agile Software Development Model on Software Maintainability

    Science.gov (United States)

    Gawali, Ajay R.

    2012-01-01

    Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…

  5. SPECT quantification: a review of the different correction methods with compton scatter, attenuation and spatial deterioration effects

    International Nuclear Information System (INIS)

    Groiselle, C.; Rocchisani, J.M.; Moretti, J.L.; Dreuille, O. de; Gaillard, J.F.; Bendriem, B.

    1997-01-01

    The improvement of gamma cameras and of acquisition and reconstruction software opens new perspectives in terms of image quantification in nuclear medicine. To meet this challenge, numerous works have been undertaken in recent years to correct for the different physical phenomena that prevent an exact estimation of the radioactivity distribution. The main phenomena that have to be taken into account are scatter, attenuation and resolution. In this work, the authors present the physical basis of each issue, its consequences on quantification and the main methods proposed to correct for them. (authors)
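Of the three corrections reviewed, first-order attenuation correction is the simplest to sketch: in the Chang method, each reconstructed pixel is divided by its attenuation factor averaged over all projection angles. A schematic illustration for one pixel in a uniform attenuator (the attenuation coefficient, geometry, and function name are illustrative, not from the review):

```python
import numpy as np

def chang_correction_factor(mu, path_lengths_cm):
    """First-order Chang factor: inverse of the mean attenuation over angles.

    mu              -- linear attenuation coefficient (cm^-1)
    path_lengths_cm -- tissue path length toward the detector for each angle
    """
    atten = np.exp(-mu * np.asarray(path_lengths_cm))  # per-angle attenuation
    return 1.0 / atten.mean()          # multiply the pixel value by this factor

# Pixel at the center of a 20-cm water cylinder: 10 cm of tissue in every
# direction, mu ~ 0.15 cm^-1 for water at 140 keV (narrow-beam, no scatter)
factor = chang_correction_factor(0.15, [10.0] * 64)
```

In practice the path lengths come from a body contour or a transmission/CT map, and scatter and resolution corrections are applied separately, as the review describes.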

  6. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    Science.gov (United States)

    2016-02-01

    proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at http://www.cut-the-knot.org/pythagoras/ where 112...methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled..."assumption criticality" or "theorem root set size" SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that

  7. Composing, Analyzing and Validating Software Models

    Science.gov (United States)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system, but they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  8. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    configuration and control procedure, an error notification and corrective action process, and evidence of available training on use of the software. The process is best performed with an independent SQA evaluator, i.e., a technically knowledgeable individual in the application area who is not part of the development team. The process provides a consistent, systematic approach based on the experience gained with SQA evaluations of the toolbox codes. Experience has shown that rarely will existing software be fully compliant with SQA criteria. Instead, the typical case is one where SQA elements are deficient. For this case, it is recommended that supplemental remedial documentation be generated. Situations may also arise where the SQA evaluator must weigh whether the entire SQA suite should be reconstituted. Regardless, the process is described sufficiently to guide a comprehensive evaluation. If the candidate software is successful in meeting process requirements, the software is "toolbox-equivalent". The benefit of the methodology outlined is that it provides a standard evaluation technique for choosing the most applicable software for a given application. One potential outcome is that the software of choice will be found to be applicable with ample SQA justification. Alternatively, the software in question may be found not to meet SQA process requirements. In this case, the analyst may then make an informed decision and possibly select one of the multiple-use, toolbox codes. With either outcome, the DSA is improved.

  9. Software design practice using two SCADA software packages

    DEFF Research Database (Denmark)

    Basse, K.P.; Christensen, Georg Kronborg; Frederiksen, P. K.

    1996-01-01

    Typical software development for manufacturing control is done either by specialists with considerable real-time programming experience or by adapting standard software packages for manufacturing control. After investigation and testing of two commercial software packages, "InTouch" and "Fix", it is argued that a more efficient software solution can be achieved by utilising an integrated specification for SCADA and PLC programming. Experience gained from process control is planned to be investigated for discrete parts manufacturing.

  10. Effect of metal artifact reduction software on image quality of C-arm cone-beam computed tomography during intracranial aneurysm treatment.

    Science.gov (United States)

    Enomoto, Yukiko; Yamauchi, Keita; Asano, Takahiko; Otani, Katharina; Iwama, Toru

    2018-01-01

    Background and purpose C-arm cone-beam computed tomography (CBCT) has the drawback that image quality is degraded by artifacts caused by implanted metal objects. We evaluated whether metal artifact reduction (MAR) prototype software can improve the subjective image quality of CBCT images of patients with intracranial aneurysms treated with coils or clips. Materials and methods Forty-four patients with intracranial aneurysms implanted with coils (40 patients) or clips (four patients) underwent one CBCT scan, from which uncorrected and MAR-corrected CBCT image datasets were reconstructed. Three blinded readers evaluated the image quality of the image sets using a four-point scale (1: Excellent, 2: Good, 3: Poor, 4: Bad). The median scores of the three readers for uncorrected and MAR-corrected images were compared with the paired Wilcoxon signed-rank test, and inter-reader agreement of change scores was assessed by weighted kappa statistics. The readers also recorded new clinical findings, such as intracranial hemorrhage, air, or surrounding anatomical structures, on MAR-corrected images. Results The image quality of MAR-corrected CBCT images was significantly improved compared with the uncorrected CBCT images. Conclusion The MAR software improved the image quality of CBCT images degraded by metal artifacts.

  11. Software development for simplified performance tests and weekly performance check in Younggwang NPP Unit 3 and 4

    International Nuclear Information System (INIS)

    Hur, K. Y.; Jang, S. H.; Lee, J. W.; Kim, J. T.; Park, J. C.

    2002-01-01

    This paper covers the current status of turbine cycle performance tests in nuclear power plants and the development of software that can address some shortcomings related to these tests. The software developed supports simplified performance tests and weekly performance checks at Yonggwang nuclear power plant units 3 and 4. It incorporates the requirements of the efficiency division for consistency with actual performance analysis work and for the usability of the collected performance test data. From the working survey, we identified the differences between the embedded performance analysis modules and the actual performance analysis work. This software helps operation and maintenance personnel to reduce their workload, supports trend analysis of essential parameters in a turbine cycle, and makes correction curves available for decision-making in their work.

  12. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  13. Software Epistemology

    Science.gov (United States)

    2016-03-01

    in-vitro decision to incubate a startup, Lexumo [7], which is developing a commercial Software as a Service (SaaS) vulnerability assessment...LTS Label Transition System; MUSE Mining and Understanding Software Enclaves; RTEMS Real-Time Executive for Multi-processor Systems; SaaS Software as a Service; SSA Static Single Assignment; SWE Software Epistemology; UD/DU Def-Use/Use-Def Chains (Dataflow Graph)

  14. Evaluation of Machine Learning Methods for LHC Optics Measurements and Corrections Software

    CERN Document Server

    AUTHOR|(CDS)2206853; Henning, Peter

    The field of artificial intelligence is driven by the goal of providing machines with human-like intelligence. However, modern science currently faces problems of such complexity that they cannot be solved by humans on the same timescale as by machines, so there is a demand for the automation of complex tasks. Identifying the category of tasks which can be performed by machines in the domain of optics measurements and corrections on the Large Hadron Collider (LHC) is one of the central research subjects of this thesis. Applications of machine learning methods and concepts from artificial intelligence can be found in various industrial and scientific branches. In High Energy Physics these concepts are mostly used in offline analysis of experiment data and to perform regression tasks. In Accelerator Physics the machine learning approach has not yet found wide application, so potential tasks for machine learning solutions can be specified in this domain. The appropriate methods and their suitability for...

  15. Software configuration management plan, 241-AY and 241-AZ tank farm MICON automation system

    International Nuclear Information System (INIS)

    Hill, L.F.

    1997-01-01

    This document establishes a Computer Software Configuration Management Plan (CSCM) for controlling software for the MICON Distributed Control System (DCS) located at the 241-AY and 241-AZ Aging Waste Tank Farm facilities in the 200 East Area. The MICON DCS software controls and monitors the instrumentation and equipment associated with plant systems and processes. A CSCM identifies and defines the configuration items in a system (section 3.1), controls the release and change of these items throughout the system life cycle (section 3.2), records and reports the status of configuration items and change requests (section 3.3), and verifies the completeness and correctness of the items (section 3.4). All software development before initial release, or before software is baselined, is considered developmental; this plan does not apply to developmental software. This plan applies to software that has been baselined and released. The MICON software will monitor and control the related instrumentation and equipment of the 241-AY and 241-AZ Tank Farm ventilation systems. Eventually, this software may also assume the monitoring and control of the tank sludge washing equipment and other systems as they are brought on line. This plan applies to the System Cognizant Manager and the MICON Cognizant Engineer (who is also referred to herein as the system administrator) responsible for the software/hardware and administration of the MICON system. This document also applies to any other organizations within Tank Farms which are currently active on the system, including system cognizant engineers, nuclear operators, technicians, and control room supervisors.

  16. RAVONSICS-challenging for assuring software reliability of nuclear I and C system

    International Nuclear Information System (INIS)

    Hai Zeng; Ming Yang; Yoshikawa, Hidekazu

    2015-01-01

    As the “central nervous system” of a plant, highly reliable Instrumentation and Control (I and C) systems, which provide the right functions and perform them correctly, are always desirable both for the end users of NPPs and for the suppliers of I and C systems. The digitalization of nuclear I and C systems in recent years has brought many new features. On one side, digital technology provides more functionality and should be more reliable and robust; on the other side, it brings new challenges for nuclear I and C systems, especially for the software running on the hardware components. Software provides flexible functionality for nuclear I and C systems, but the complexity of software makes its reliability and safety difficult to evaluate. The reliability of the software, an indispensable part of the I and C system, has an essential impact on the reliability of the whole system, and people naturally want to know what the reliability of this intangible part is. The methods used for evaluating the reliability of systems and hardware hardly work for software, because of the inherent difference in failure mechanisms: failures in software are systematically induced by design errors, whereas failures in hardware are randomly induced by materials and production. To continue the effort on this hot topic and to try to achieve consensus on a potential methodology for software reliability evaluation, a cooperative research project called RAVONSICS (Reliability and Verification and Validation of Nuclear Safety I and C Software) is being carried out by 7 Chinese partners, including universities, research institutes, a utility, a vendor, and the safety regulatory body. The objective of RAVONSICS is to bring forward methodology for software reliability evaluation and software verification techniques. RAVONSICS works cooperatively with its European sister project.

  17. Correction of a Depth-Dependent Lateral Distortion in 3D Super-Resolution Imaging.

    Directory of Open Access Journals (Sweden)

    Lina Carlini

    Full Text Available Three-dimensional (3D) localization-based super-resolution microscopy (SR) requires correction of aberrations to accurately represent 3D structure. Here we show how a depth-dependent lateral shift in the apparent position of a fluorescent point source, which we term 'wobble', results in warped 3D SR images, and we provide a software tool to correct this distortion. This system-specific lateral shift is typically > 80 nm across an axial range of ~ 1 μm. A theoretical analysis based on phase retrieval data from our microscope suggests that the wobble is caused by non-rotationally symmetric phase and amplitude aberrations in the microscope's pupil function. We then apply our correction to the bacterial cytoskeletal protein FtsZ in live bacteria and demonstrate that the corrected data more accurately represent the true shape of this vertically-oriented ring-like structure. We also include this correction method in a registration procedure for dual-color, 3D SR data and show that it improves target registration error (TRE) at the axial limits over an imaging depth of 1 μm, yielding TRE values of < 20 nm. This work highlights the importance of correcting aberrations in 3D SR to achieve high fidelity between the measurements and the sample.
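The correction described amounts to subtracting a calibrated, depth-dependent lateral shift from each localization. A schematic version, assuming the wobble curves wx(z) and wy(z) have already been measured with fiducial beads; the interpolation scheme, names, and numbers below are illustrative and not the authors' tool:

```python
import numpy as np

def correct_wobble(x_nm, y_nm, z_nm, z_cal, wx_cal, wy_cal):
    """Remove the depth-dependent lateral shift ('wobble') from localizations.

    z_cal, wx_cal, wy_cal -- calibration: apparent x/y shift of a bead vs depth
    """
    x_corr = x_nm - np.interp(z_nm, z_cal, wx_cal)
    y_corr = y_nm - np.interp(z_nm, z_cal, wy_cal)
    return x_corr, y_corr

# Toy calibration: shift grows linearly to ~80 nm across a 1 um axial range
z_cal = np.linspace(-500, 500, 11)
wx_cal = 0.08 * z_cal                 # 80 nm of x-wobble over 1 um
wy_cal = -0.04 * z_cal
z = np.array([-500.0, 0.0, 500.0])
x = np.array([100.0, 100.0, 100.0])
y = np.array([50.0, 50.0, 50.0])
xc, yc = correct_wobble(x, y, z, z_cal, wx_cal, wy_cal)
```

Localizations of the same vertical structure at different depths, which would otherwise appear sheared, collapse back onto a common lateral position after the subtraction.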

  18. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  19. Source brightness fluctuation correction of solar absorption fourier transform mid infrared spectra

    Directory of Open Access Journals (Sweden)

    T. Ridder

    2011-06-01

    Full Text Available The precision and accuracy of trace gas observations using solar absorption Fourier Transform infrared spectrometry depend on the stability of the light source. Fluctuations in the source brightness, however, cannot always be avoided. Current correction schemes, which calculate a corrected interferogram as the ratio of the raw DC interferogram and a smoothed DC interferogram, are applicable only to near infrared measurements. Spectra in the mid infrared spectral region below 2000 cm−1 are generally considered uncorrectable if they are measured with an MCT detector. Such measurements introduce an unknown offset to MCT interferograms, which prevents the established source brightness fluctuation correction. This problem can be overcome by determining the offset using the modulation efficiency of the instrument. With known modulation efficiency the offset can be calculated, and the source brightness correction can be performed on the basis of offset-corrected interferograms. We present a source brightness fluctuation correction method which performs the smoothing of the raw DC interferogram in the interferogram domain by applying a running mean, instead of high-pass filtering the corresponding spectrum after Fourier transformation of the raw DC interferogram. This smoothing can be performed with the onboard software of commercial instruments. The improvement of MCT spectra and subsequent ozone profile and total column retrievals is demonstrated. Application to InSb interferograms in the near infrared spectral region proves the equivalence with the established correction scheme.
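The scheme described, ratioing the offset-corrected DC interferogram against a running-mean-smoothed copy of itself, can be sketched directly. The offset value, window size, and simulated signal below are illustrative, not from the paper:

```python
import numpy as np

def brightness_correct(dc_interferogram, offset, window=201):
    """Divide the offset-corrected DC interferogram by its running mean.

    The running mean tracks slow source-brightness drift; dividing by it
    leaves the fast interferometric modulation intact.
    """
    ifg = np.asarray(dc_interferogram, dtype=float) - offset
    kernel = np.ones(window) / window
    smooth = np.convolve(ifg, kernel, mode="same")   # slow brightness envelope
    return ifg / smooth

# Simulated DC interferogram: fast modulation on a drifting source, plus an
# unknown MCT detector offset of 40 (the quantity the paper derives from the
# instrument's modulation efficiency)
n = np.arange(8192)
drift = 1.0 + 0.2 * np.sin(2 * np.pi * n / 8192)     # slow brightness change
signal = drift * (1.0 + 0.05 * np.cos(0.3 * n))      # DC level + modulation
raw = 100.0 * signal + 40.0
corrected = brightness_correct(raw, offset=40.0)
```

If the offset were left in, the ratio would no longer cancel the drift, which is exactly why the offset determination step is the paper's key contribution. Edge samples within half a window of the ends are unreliable with this simple convolution and would be trimmed in practice.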

  20. Detection and correction of false segmental duplications caused by genome mis-assembly

    Science.gov (United States)

    2010-01-01

    Diploid genomes with divergent chromosomes present special problems for assembly software, as two copies of especially polymorphic regions may be mistakenly constructed, creating the appearance of a recent segmental duplication. We developed a method for identifying such false duplications and applied it to four vertebrate genomes. For each genome, we corrected mis-assemblies, improved estimates of the amount of duplicated sequence, and recovered polymorphisms between the sequenced chromosomes. PMID:20219098

  1. Incorporation of the KERN ECDS-PC software into a project oriented software environment

    International Nuclear Information System (INIS)

    Oren, W.; Pushor, R.; Ruland, R.

    1986-11-01

The Stanford Linear Accelerator Center (SLAC) is in the process of building a new particle collider, the Stanford Linear Collider (SLC). The tunnel which houses the SLC is about 3 km long and contains approximately 1000 magnets. Besides very precise absolute positioning of these magnets, the alignment of adjacent magnet ends is of particular importance to the success of the whole project. Because of this and the limited time frame, a survey method had to be developed that was not only reliable and self-checking but also fast. Therefore, the concept of MAS (Magnet Alignment System) was developed. This system utilizes the on-line data collection and rigorous least-squares bundle adjustment of the KERN ECDS-PC system to fulfill these requirements. The ECDS software is embedded in a project-tailored software system with modules which take care of: fixture and magnet calibration corrections; the calculation of ideal coordinates and their comparison to measured coordinates; the translation of detected misalignments into the coordinate system of the mechanical adjustments; and the control of the adjustments with on-line electronic dial gauges. This paper gives a brief introduction to the SLC project and some of the survey problems which are unique to this machine. The basic ideas of the KERN ECDS-PC system are explained, and a discussion of practical aspects, such as targeting and set-ups, is given. MAS and its modules are explained in detail

  2. ATMOSPHERIC PHASE DELAY CORRECTION OF D-INSAR BASED ON SENTINEL-1A

    Directory of Open Access Journals (Sweden)

    X. Li

    2018-04-01

Full Text Available In this paper, we used the Generic Atmospheric Correction Online Service for InSAR (GACOS) tropospheric delay maps to correct the atmospheric phase delay of differential interferometric synthetic aperture radar (D-InSAR) monitoring, and thereby improved the accuracy of subsidence monitoring using D-InSAR technology. Atmospheric phase delay, one of the most important error sources limiting the monitoring accuracy of InSAR, can mask the true phase in subsidence monitoring. To address this problem, this paper used Sentinel-1A images and the tropospheric delay maps obtained from GACOS to monitor the subsidence of the Yellow River Delta in Shandong Province. The conventional D-InSAR processing was performed using the GAMMA software, and MATLAB codes were used to correct the atmospheric delay of the D-InSAR results. The results before and after the atmospheric phase delay correction were verified and analyzed in the main subsidence area. The experimental results show that the atmospheric phase influences the deformation results to a certain extent. After the correction, the measurement error of vertical deformation is reduced by about 18 mm, which proves that removing atmospheric effects can improve the accuracy of D-InSAR monitoring.
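The subtraction step that the paper performs in MATLAB can be sketched as follows; the wavelength, incidence angle, and array names are illustrative assumptions, not the authors' code:

```python
import numpy as np

WAVELENGTH = 0.0556  # Sentinel-1 C-band radar wavelength, metres

def atmospheric_phase_correction(ifg_phase, ztd_master, ztd_slave,
                                 incidence_deg=39.0):
    """Remove tropospheric delay from an interferogram phase map.

    ztd_master/ztd_slave: GACOS zenith total delay maps (metres) for the
    two acquisition dates; the incidence angle projects the zenith delay
    onto the radar line of sight. All numeric values are illustrative.
    """
    los_factor = 1.0 / np.cos(np.deg2rad(incidence_deg))
    # differential slant delay between the two dates, in metres
    d_slant = (ztd_master - ztd_slave) * los_factor
    # two-way propagation: 4*pi/lambda radians of phase per metre of delay
    atmo_phase = (4.0 * np.pi / WAVELENGTH) * d_slant
    return ifg_phase - atmo_phase
```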

  3. Corrective action program at Krsko NPP

    International Nuclear Information System (INIS)

    Skaler, F.; Divjak, G.; Kavsek, D.

    2004-01-01

The Krsko NPP develops software that enables electronic reporting of all kinds of deviations and suggestions for improvement at the plant. All employees and permanent subcontractors have access to the system and can report deviations. The NPP has a centralized decision process for the distribution of reported deviations, and at this point all direct actions are electronically tracked. The immediate benefits of this new tool were: the reporting threshold has been lowered; the number of reporting people has increased; one computerized form serves all processes; the decision as to which process will resolve the deviation is centralized; all types of deviation are in the same environment; our experience of the processes is incorporated in the program; work that has been done is controlled; archiving is electronic only. Software basic data: the Corrective Action Program application is a Web application. Data is stored in an Oracle 8.1.7i database. Users access the application through a PL/SQL gateway on Oracle 9i Application Server 1.0.2 using Microsoft Internet Explorer browsers (Version 5 or later). Reports are implemented with Oracle Reports 6i. Menus are designed with Apycom Java Menus and Buttons v4.23. Our presentation will include: the basic idea; implementation change management; a demonstration of the program. (author)

  4. Corrective action program at Krsko NPP

    Energy Technology Data Exchange (ETDEWEB)

    Skaler, F; Divjak, G; Kavsek, D [NPP Krsko, Krsko (Slovenia)

    2004-07-01

The Krsko NPP develops software that enables electronic reporting of all kinds of deviations and suggestions for improvement at the plant. All employees and permanent subcontractors have access to the system and can report deviations. The NPP has a centralized decision process for the distribution of reported deviations, and at this point all direct actions are electronically tracked. The immediate benefits of this new tool were: the reporting threshold has been lowered; the number of reporting people has increased; one computerized form serves all processes; the decision as to which process will resolve the deviation is centralized; all types of deviation are in the same environment; our experience of the processes is incorporated in the program; work that has been done is controlled; archiving is electronic only. Software basic data: the Corrective Action Program application is a Web application. Data is stored in an Oracle 8.1.7i database. Users access the application through a PL/SQL gateway on Oracle 9i Application Server 1.0.2 using Microsoft Internet Explorer browsers (Version 5 or later). Reports are implemented with Oracle Reports 6i. Menus are designed with Apycom Java Menus and Buttons v4.23. Our presentation will include: the basic idea; implementation change management; a demonstration of the program. (author)

  5. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    Science.gov (United States)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  6. Software Engineering Guidebook

    Science.gov (United States)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  7. From Software Development to Software Assembly

    NARCIS (Netherlands)

    Sneed, Harry M.; Verhoef, Chris

    2016-01-01

    The lack of skilled programming personnel and the growing burden of maintaining customized software are forcing organizations to quit producing their own software. It's high time they turned to ready-made, standard components to fulfill their business requirements. Cloud services might be one way to

  8. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    Directory of Open Access Journals (Sweden)

    Dominic Waithe

Full Text Available We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern-recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature-recognition capabilities of the software proved to be robust to food-surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch-analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for

  9. The results of bone deformity correction using a spider frame with web-based software for lower extremity long bone deformities

    Directory of Open Access Journals (Sweden)

    Tekin Ali Çağrı

    2016-01-01

Full Text Available Aim: To present the functional and radiological results and evaluate the effectiveness of a computer-assisted external fixator (spider frame) in patients with lower extremity shortness and deformity. Materials and methods: The study comprised 17 patients (14 male, 3 female) who were treated for lower extremity long bone deformity and shortness between 2012 and 2015 using a spider frame. The procedure’s level of difficulty was determined preoperatively using the Paley Scale. Postoperatively, the results for the patients who underwent tibial operations were evaluated using the Paley criteria modified by ASAMI, and the results for the patients who underwent femoral operations were evaluated according to the Paley scoring system. The evaluations were made by calculating the External Fixator and Distraction indexes. Results: The mean age of the patients was 24.58 years (range, 5–51 years). The spider frame was applied to the femur in 10 patients and to the tibia in seven. The mean follow-up period was 15 months (range, 6–31 months) from the operation day, and the mean amount of lengthening was 3.0 cm (range, 1–6 cm). The mean duration of fixator application was 202.7 days (range, 104–300 days). The mean External Fixator Index was 98 days/cm (range, 42–265 days/cm). The mean Distraction Index was 10.49 days/cm (range, 10–14 days/cm). Conclusion: The computer-assisted external fixator system (spider frame) achieves single-stage correction in cases of both deformity and shortness. The system can be applied easily, and because of its high-tech software, it offers the possibility of postoperative treatment of the deformity.

  10. A Software Reuse Approach and Its Effect On Software Quality, An Empirical Study for The Software Industry

    OpenAIRE

    Mateen, Ahmed; Kausar, Samina; Sattar, Ahsan Raza

    2017-01-01

Software reusability has attracted much interest because of increased quality and reduced cost. A good software reuse process enhances reliability, productivity, and quality while reducing time and cost. Current reuse techniques focus on the reuse of software artifacts grounded on anticipated functionality, whereas the non-functional (quality) aspects are also important. Software reusability is used here to expand the quality and productivity of software. It improves overal...

  11. A field study on root cause analysis of defects in space software

    International Nuclear Information System (INIS)

    Silva, Nuno; Cunha, João Carlos; Vieira, Marco

    2017-01-01

Critical systems, such as space systems, are developed under strict requirements envisaging high integrity in accordance with specific standards. For such software systems, an independent assessment (Independent Software Verification and Validation – ISVV) is put into effect after the regular development lifecycle and V&V activities, aiming at finding residual faults and raising confidence in the software. However, it has been observed that a significant number of defects still remain at this stage, questioning the effectiveness of the earlier engineering processes. This paper presents a root cause analysis of 1070 defects found in four space software projects during ISVV, applying an improved Orthogonal Defect Classification (ODC) taxonomy and examining the defect types, triggers and impacts, in order to identify why they reached such a late stage in the development. The paper also puts forward proposals for modifications to both the software development (to prevent defects) and the V&V activities (to better detect defects), and an assessment methodology for future work on root cause analysis. - Highlights: • Root cause analysis of space software defects using an enhanced ODC taxonomy. • Prioritization of the root causes according to the most important defect impacts. • Identification of improvements to systems engineering and development processes. • Improvements to V&V activities as a means to reduce the occurrence of defects. • A generic process for deriving defect root causes and suggesting corrections.

  12. Software quality assurance: in large scale and complex software-intensive systems

    NARCIS (Netherlands)

    Mistrik, I.; Soley, R.; Ali, N.; Grundy, J.; Tekinerdogan, B.

    2015-01-01

    Software Quality Assurance in Large Scale and Complex Software-intensive Systems presents novel and high-quality research related approaches that relate the quality of software architecture to system requirements, system architecture and enterprise-architecture, or software testing. Modern software

  13. Non-Uniformity Correction Using Nonlinear Characteristic Performance Curves for Calibration

    Science.gov (United States)

    Lovejoy, McKenna Roberts

polynomial with 16-bit precision, a significant improvement over the one- and two-point correction algorithms. All algorithms have been implemented in software with satisfactory results, and the third-order gain-equalization non-uniformity correction algorithm has been implemented in hardware.
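For reference, the two-point correction algorithm that this entry benchmarks against solves a per-pixel gain and offset from two uniform calibration sources; this is a generic sketch of the classic scheme, not the author's implementation:

```python
import numpy as np

def two_point_nuc(raw, low_ref, high_ref, low_target, high_target):
    """Classic two-point non-uniformity correction for a detector array.

    low_ref/high_ref: per-pixel responses to uniform cold and hot
    calibration sources; low_target/high_target: the desired uniform
    output at those two levels. A per-pixel gain and offset are solved
    so that both reference levels map exactly to their targets.
    """
    gain = (high_target - low_target) / (high_ref - low_ref)
    offset = low_target - gain * low_ref
    return gain * raw + offset
```

Higher-order (e.g. third-order polynomial) corrections generalize this by fitting more calibration points per pixel, which is what the characteristic performance curves in the thesis provide.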

  14. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  15. Hitchhiker'S Guide to Voxel Segmentation for Partial Volume Correction of in Vivo Magnetic Resonance Spectroscopy

    Directory of Open Access Journals (Sweden)

    Scott Quadrelli

    2016-01-01

Full Text Available Partial volume effects have the potential to cause inaccuracies when quantifying metabolites using proton magnetic resonance spectroscopy (MRS). In order to correct for cerebrospinal fluid content, a spectroscopic voxel needs to be segmented according to its different tissue contents. This article aims to detail how automated partial volume segmentation can be undertaken and provides a software framework for researchers to develop their own tools. While many studies have detailed the impact of partial volume correction on proton magnetic resonance spectroscopy quantification, there is a paucity of literature explaining how voxel segmentation can be achieved using freely available neuroimaging packages.
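At its core, the CSF correction that motivates the segmentation is a scaling of the metabolite estimate by the voxel's tissue fraction; the function below is a minimal sketch with illustrative names, not the article's software framework:

```python
def csf_partial_volume_correction(concentration, f_gm, f_wm, f_csf):
    """Scale an MRS metabolite estimate for the CSF content of the voxel.

    f_gm, f_wm, f_csf: grey matter, white matter and CSF fractions from
    voxel segmentation (they must sum to 1). CSF contributes water
    signal but essentially no metabolite, so the observed concentration
    is divided by the tissue fraction of the voxel.
    """
    if abs(f_gm + f_wm + f_csf - 1.0) > 1e-6:
        raise ValueError("tissue fractions must sum to 1")
    return concentration / (f_gm + f_wm)
```

Fuller corrections also adjust for tissue-specific water content and relaxation, which is where the per-tissue fractions from automated segmentation feed in.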

  16. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  17. Frequency Correction for MIRO Chirp Transformation Spectroscopy Spectrum

    Science.gov (United States)

    Lee, Seungwon

    2012-01-01

This software processes the flyby spectra of the Chirp Transform Spectrometer (CTS) of the Microwave Instrument for Rosetta Orbiter (MIRO). The tool corrects for the Doppler shift and the local-oscillator (LO) frequency shift during the flyby mode of MIRO operations. The frequency correction for CTS flyby spectra is performed, and multiple spectra are integrated into a high signal-to-noise averaged spectrum at the rest-frame RF frequency. This innovation also generates the 8 molecular line spectra by dividing the continuous 4,096-channel CTS spectra. The 8 line spectra can then be readily used for scientific investigations. A spectral line that is at its rest frequency in the frame of the Earth or an asteroid will be observed with a time-varying Doppler shift as seen by MIRO. The frequency shift is toward higher RF frequencies on approach, and toward lower RF frequencies on departure. The magnitude of the shift depends on the flyby velocity. The result of the time-varying Doppler shift is that an observed spectral line will be seen to move from channel to channel in the CTS spectrometer. The direction (higher or lower frequency) in the spectrometer depends on the spectral line frequency under consideration. In order to analyze the flyby spectra, two steps are required. First, individual spectra must be corrected for the Doppler shift so that individual spectra can be superimposed at the same rest frequency for integration purposes. Second, a correction needs to be applied to the CTS spectra to account for the LO frequency shifts that are applied in asteroid mode.
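The first step, superimposing spectra at the same rest frequency, amounts to resampling each spectrum onto a Doppler-shifted grid. The first-order shift and the function below are a sketch under stated assumptions, not the MIRO pipeline:

```python
import numpy as np

C = 299792.458  # speed of light, km/s

def to_rest_frame(freqs_hz, spectrum, v_kms):
    """Shift an observed spectrum to the rest-frame frequency grid.

    v_kms: line-of-sight velocity, positive when the source recedes
    (departure). Each observed channel frequency is mapped to its
    rest-frame value, and the spectrum is resampled back onto the fixed
    channel grid so successive flyby spectra can be co-added.
    """
    rest_freqs = freqs_hz * (1.0 + v_kms / C)   # first-order Doppler
    # resample the shifted spectrum back onto the original channel grid
    return np.interp(freqs_hz, rest_freqs, spectrum)
```

With each spectrum corrected this way, a simple channel-wise average yields the high signal-to-noise rest-frame spectrum the abstract describes.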

  18. High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software

    Science.gov (United States)

    Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.

    2013-08-01

    GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.

  19. Model-Based Software Testing for Object-Oriented Software

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Model-based testing is one of the best solutions for testing object-oriented software. It has a better test coverage than other testing styles. Model-based testing takes into consideration behavioural aspects of a class, which are usually unchecked in other testing methods. An increase in the complexity of software has forced the software industry…

  20. FDSTools: A software package for analysis of massively parallel sequencing data with the ability to recognise and correct STR stutter and other PCR or sequencing noise.

    Science.gov (United States)

    Hoogenboom, Jerry; van der Gaag, Kristiaan J; de Leeuw, Rick H; Sijen, Titia; de Knijff, Peter; Laros, Jeroen F J

    2017-03-01

Massively parallel sequencing (MPS) is on the verge of broad-scale application in forensic research and casework. The improved capability to analyse evidentiary traces representing unbalanced mixtures is often mentioned as one of the major advantages of this technique. However, most of the available software packages that analyse forensic short tandem repeat (STR) sequencing data are not well suited for high-throughput analysis of such mixed traces. The largest challenge is the presence of stutter artefacts in STR amplifications, which are not readily discerned from minor contributions. FDSTools is an open-source software solution developed for this purpose. The level of stutter formation is influenced by various aspects of the sequence, such as the length of the longest uninterrupted stretch occurring in an STR. When MPS is used, STRs are evaluated as sequence variants that each have particular stutter characteristics which can be precisely determined. FDSTools uses a database of reference samples to determine stutter and other systemic PCR or sequencing artefacts for each individual allele. In addition, stutter models are created for each repeating element in order to predict stutter artefacts for alleles that are not included in the reference set. This information is subsequently used to recognise and compensate for the noise in a sequence profile. The result is a better representation of the true composition of a sample. Using Promega PowerSeq™ Auto System data from 450 reference samples and 31 two-person mixtures, we show that the FDSTools correction module decreases stutter ratios above 20% to below 3%. Consequently, much lower levels of contributions in the mixed traces are detected. FDSTools contains modules to visualise the data in an interactive format, allowing users to filter data with their own preferred thresholds. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
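The per-allele correction idea can be illustrated with a minimal back-stutter subtraction. The dict structure and the single n−1 stutter position are simplifying assumptions; FDSTools itself works on sequence-level alleles and models several artefact types:

```python
def correct_stutter(reads, stutter_ratios):
    """Subtract the expected back-stutter from each allele's read count.

    reads: {allele_length: read_count} for one marker;
    stutter_ratios: expected n-1 stutter ratio per parent allele, as
    would come from a reference-sample database. Illustrative only.
    """
    corrected = dict(reads)
    for allele, count in reads.items():
        ratio = stutter_ratios.get(allele, 0.0)
        stutter_pos = allele - 1              # one repeat unit shorter
        expected = count * ratio              # reads attributable to stutter
        if stutter_pos in corrected:
            corrected[stutter_pos] = max(0.0, corrected[stutter_pos] - expected)
    return corrected
```

After subtraction, reads remaining at a stutter position are more plausibly attributable to a minor contributor, which is the mixture-analysis benefit the abstract reports.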

  1. Software Atom: An approach towards software components structuring to improve reusability

    Directory of Open Access Journals (Sweden)

    Muhammad Hussain Mughal

    2017-12-01

Full Text Available The diversity of application domains compels the design of a sustainable classification scheme for rapidly growing software repositories. Atomic, reusable software components are articulated to improve component reusability in a volatile industry. Numerous approaches to software classification have been proposed over past decades, each with limitations related to coupling and cohesion. In this paper, we propose a novel approach that constitutes software from radical functionalities to improve software reusability. We analyse the semantics of the elements in the Periodic Table used in chemistry to design our classification approach, present it as a tree-based classification to curtail the search-space complexity of the software repository, and refine it further with semantic search techniques. We developed a Globally Unique Identifier (GUID) for indexing functions and related components, exploiting the correlation between chemical elements and software elements to simulate a one-to-one mapping between them. Inspired by the chemical periodic table, we propose a Software Periodic Table (SPT) representing atomic software components extracted from real application software. Parsing and extraction over the SPT-classified repository tree enable users to compose their software by customising the ingredients of their requirements. The classified repository of software ingredients helps users convey their requirements to software engineers and enables requirements engineers to develop rapid large-scale prototypes. Furthermore, we predict the usability of the categorised repository based on user feedback. The proposed repository will be fine-tuned continuously based on utilisation, and the SPT will be gradually optimised with ant colony optimisation techniques, ultimately helping to automate the software development process.
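The GUID-indexed classification tree might be sketched like this; the path scheme and hash choice are assumptions for illustration, not the paper's actual design:

```python
import hashlib

def component_guid(domain, category, function_name):
    """Derive a stable identifier for a software 'atom' from its place
    in a classification tree (domain -> category -> function).

    A sketch of the GUID-indexed repository idea; hashing the tree path
    makes the identifier deterministic, so repeated indexing of the same
    component always yields the same key.
    """
    path = "/".join((domain, category, function_name))
    return hashlib.sha1(path.encode("utf-8")).hexdigest()[:16]
```

A repository keyed on such identifiers can resolve a requirement to a component with a single lookup instead of a repository-wide search, which is the search-space reduction the abstract aims at.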

  2. Service-oriented architecture for the ARGOS instrument control software

    Science.gov (United States)

    Borelli, J.; Barl, L.; Gässler, W.; Kulas, M.; Rabien, Sebastian

    2012-09-01

The Advanced Rayleigh Guided ground layer Adaptive optic System, ARGOS, equips the Large Binocular Telescope (LBT) with a constellation of six Rayleigh laser guide stars. By correcting atmospheric turbulence near the ground, the system is designed to increase the image quality of the multi-object spectrograph LUCIFER by approximately a factor of 3 over a field of 4 arc minutes in diameter. The control software has the critical task of orchestrating several devices, instruments, and high-level services, including the already existing adaptive optics system and the telescope control software. All these components are widely distributed over the telescope, adding complexity to the system design. The approach used by the ARGOS engineers is to write loosely coupled and distributed services under the control of different ownership systems, providing a uniform mechanism to offer, discover, interact with and use these distributed capabilities. The control system includes several finite state machines, vibration and flexure compensation loops, and safety mechanisms such as interlocks and aircraft and satellite avoidance systems.

  3. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    Energy Technology Data Exchange (ETDEWEB)

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  4. NASA software documentation standard software engineering program

    Science.gov (United States)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  5. Science and Software

    Science.gov (United States)

    Zelt, C. A.

    2017-12-01

    Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name ID when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, it's not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals, the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releasing, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, an open-source release that is well documented should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software, and minimize the need for others to duplicate the effort. All funded software should be available through a single web site

  6. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    Science.gov (United States)

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the data used. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness, the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are best supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit parameters and their standard
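    The pipeline described above (fit candidate functions, select among them with the corrected Akaike information criterion, integrate the winner analytically) can be sketched in a few lines. This is a minimal illustration with synthetic data and hypothetical values, not the NUKFIT implementation, which also handles error models, Bayesian priors and multi-exponential fits:

    ```python
    import numpy as np

    # Synthetic time-activity curve (hypothetical values): fraction of injected
    # activity sampled at a few time points (hours), with ~1% measurement noise.
    t = np.array([1.0, 4.0, 24.0, 48.0, 96.0])
    rng = np.random.default_rng(0)
    a = 0.8 * np.exp(-0.03 * t) * rng.normal(1.0, 0.01, t.size)

    n = t.size

    def aicc(rss, k):
        """Corrected Akaike information criterion for a least-squares fit, k parameters."""
        return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

    # Candidate 1: mono-exponential A*exp(-lam*t), fitted log-linearly (k = 2).
    slope, intercept = np.polyfit(t, np.log(a), 1)
    A, lam = np.exp(intercept), -slope
    rss1 = np.sum((a - A * np.exp(-lam * t)) ** 2)

    # Candidate 2: a constant level c (k = 1), clearly too simple for this data.
    c = a.mean()
    rss2 = np.sum((a - c) ** 2)

    best = "mono-exponential" if aicc(rss1, 2) < aicc(rss2, 1) else "constant"

    # Time-integrated activity coefficient of the selected function, obtained
    # analytically: the integral of A*exp(-lam*t) over [0, inf) is A/lam.
    tiac = A / lam
    print(best, round(tiac, 1))
    ```

    The analytic integration step is what makes the sum-of-exponentials family attractive: no numerical quadrature or extrapolation beyond the last sample is needed.
    
    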

  7. WormGender - Open-Source Software for Automatic Caenorhabditis elegans Sex Ratio Measurement.

    Directory of Open Access Journals (Sweden)

    Marta K Labocha

    Full Text Available Fast and quantitative analysis of animal phenotypes is one of the major challenges of current biology. Here we report the WormGender open-source software, which is designed for accurate quantification of sex ratio in Caenorhabditis elegans. The software's functions include (i) automatic recognition and counting of adult hermaphrodites and males, (ii) a manual inspection feature that enables manual correction of errors, and (iii) the flexibility to use new training images to optimize the software for different imaging conditions. We evaluated the performance of our software by comparing manual and automated assessment of sex ratio. Our data showed that the WormGender software provided overall accurate sex ratio measurements. We further demonstrated the usage of WormGender by quantifying the high incidence of male (him) phenotype in 27 mutant strains. Mutants of nine genes (brc-1, C30G12.6, cep-1, coh-3, him-3, him-5, him-8, skr-1, unc-86) showed a significant him phenotype. WormGender is written in Java and can be installed and run on both Windows and Mac platforms. The source code is freely available together with a user manual and sample data at http://www.QuantWorm.org/. The source code and sample data are also available at http://dx.doi.org/10.6084/m9.figshare.1541248.

  8. Physics Model-Based Scatter Correction in Multi-Source Interior Computed Tomography.

    Science.gov (United States)

    Gong, Hao; Li, Bin; Jia, Xun; Cao, Guohua

    2018-02-01

    Multi-source interior computed tomography (CT) has great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises imaging quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and used to calculate X-ray scattering signals in both the forward direction and cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments that were designed to emulate image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged quickly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions of interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, respectively, and the contrast-to-noise ratio at those ROIs increased by up to 44.3% and up to 19.7%, respectively. The proposed physics model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
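    The iterative framework can be illustrated with a toy one-dimensional example: the scatter predicted from the current primary estimate is subtracted from the measurement, and the update is repeated until the estimate stabilizes. The scatter model below (a scaled broad blur) is a hypothetical stand-in for the paper's analytic physics model:

    ```python
    import numpy as np

    # Hypothetical scatter model: a broad, low-amplitude smoothing of the primary.
    def scatter_model(primary):
        kernel = np.ones(15) / 15.0
        return 0.3 * np.convolve(primary, kernel, mode="same")

    # Toy 1-D "projection": a box-shaped primary signal plus its scatter.
    true_primary = np.zeros(100)
    true_primary[40:60] = 1.0
    measured = true_primary + scatter_model(true_primary)

    # Iterative correction: subtract the scatter predicted from the current
    # primary estimate, starting from the raw measurement itself.
    estimate = measured.copy()
    for _ in range(10):
        estimate = measured - scatter_model(estimate)

    err0 = np.abs(measured - true_primary).max()   # error before correction
    err = np.abs(estimate - true_primary).max()    # error after 10 iterations
    print(err0, err)
    ```

    The fixed point of the update satisfies estimate + scatter(estimate) = measured, i.e. the true primary; the iteration converges because the scatter operator here has gain well below one, which matches the paper's observation that even one iteration removes most of the artifact.
    
    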

  9. Salvo: Seismic imaging software for complex geologies

    Energy Technology Data Exchange (ETDEWEB)

    OBER,CURTIS C.; GJERTSEN,ROB; WOMBLE,DAVID E.

    2000-03-01

    This report describes Salvo, a three-dimensional seismic-imaging software package for complex geologies. Regions of complex geology, such as overthrusts and salt structures, can cause difficulties for many seismic-imaging algorithms used in production today. The paraxial wave equation and finite-difference methods used within Salvo can produce high-quality seismic images in these difficult regions. However, this approach comes with higher computational costs, which have been too expensive for standard production. Salvo uses improved numerical algorithms and methods, along with parallel computing, to produce high-quality images and to reduce the computational and data input/output (I/O) costs. This report documents the numerical algorithms implemented for the paraxial wave equation, including absorbing boundary conditions, phase corrections, imaging conditions, phase encoding, and reduced-source migration. It also describes I/O algorithms for large seismic data sets and images, and the parallelization methods used to obtain high efficiencies for both the computations and the I/O of seismic data sets. Finally, this report describes the steps required to compile, port and optimize the Salvo software, and describes the validation data sets used to help verify a working copy of Salvo.

  10. Pragmatic Software Innovation

    DEFF Research Database (Denmark)

    Aaen, Ivan; Jensen, Rikke Hagensby

    2014-01-01

    We understand software innovation as concerned with introducing innovation into the development of software intensive systems, i.e. systems in which software development and/or integration are dominant considerations. Innovation is key in almost any strategy for competitiveness in existing markets, for creating new markets, or for curbing rising public expenses, and software intensive systems are core elements in most such strategies. Software innovation therefore is vital for about every sector of the economy. Changes in software technologies over the last decades have opened up for experimentation, learning, and flexibility in ongoing software projects, but how can this change be used to facilitate software innovation? How can a team systematically identify and pursue opportunities to create added value in ongoing projects? In this paper, we describe Deweyan pragmatism as the philosophical foundation...

  11. Pile-up correction by Genetic Algorithm and Artificial Neural Network

    Science.gov (United States)

    Kafaee, M.; Saramad, S.

    2009-08-01

    Pile-up distortion is a common problem in high-count-rate radiation spectroscopy in many fields, such as industrial, nuclear and medical applications. It is possible to reduce pulse pile-up using hardware-based pile-up rejection. However, this phenomenon may not be eliminated completely by this approach, and the spectrum distortion caused by pile-up rejection can increase as well. In addition, inaccurate correction or rejection of pile-up artifacts in applications such as energy dispersive X-ray (EDX) spectrometers can lead to loss of counts, poor quantitative results and even false element identification. Therefore, it is highly desirable to use software-based models to predict and correct any recognized pile-up signals in data acquisition systems. The present paper describes two new intelligent approaches to pile-up correction: the Genetic Algorithm (GA) and Artificial Neural Networks (ANNs). The validation and testing results of these new methods have been compared and show excellent agreement with data measured with a 60Co source and NaI detector. Monte Carlo simulation of these new intelligent algorithms also shows their advantages over hardware-based pulse pile-up rejection methods.
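    To make the GA idea concrete, here is a deliberately small sketch: two detector pulses overlap in one digitized trace, and a genetic algorithm searches for the pair of amplitudes that best reproduces it. The pulse shape, arrival times and GA settings are hypothetical simplifications; the paper's method (and its ANN counterpart) operates on real spectroscopy signals:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Detector pulse shape (hypothetical single-exponential tail), sampled trace.
    t = np.arange(0, 50, 1.0)
    def pulse(t0, amp):
        out = np.zeros_like(t)
        m = t >= t0
        out[m] = amp * np.exp(-(t[m] - t0) / 8.0)
        return out

    # A piled-up trace: two pulses too close for simple peak-height readout.
    trace = pulse(5, 1.0) + pulse(12, 0.6) + rng.normal(0, 0.005, t.size)

    # Minimal genetic algorithm recovering the two amplitudes; arrival times
    # are assumed known from a trigger, to keep the sketch short.
    def fitness(ind):
        model = pulse(5, ind[0]) + pulse(12, ind[1])
        return -np.sum((trace - model) ** 2)

    pop = rng.uniform(0.0, 2.0, size=(60, 2))
    for _ in range(80):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-20:]]              # truncation selection
        pop = parents[rng.integers(0, 20, 60)] \
            + rng.normal(0, 0.02, (60, 2))                   # clone + mutate
    best = max(pop, key=fitness)
    print(best)  # close to the true amplitudes (1.0, 0.6)
    ```

    Because the fitness surface here is a single quadratic bowl, the GA is overkill; its value in the paper's setting is robustness when pulse times are unknown and the search space is multimodal.
    
    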

  12. Educational software and improvement of first grade school students' knowledge about prevention of overweight and obesity

    Directory of Open Access Journals (Sweden)

    Luana Santos Vital Alves Coelho

    Full Text Available Objective. To evaluate the effects of educational software on improving first grade school students' knowledge about prevention of overweight and obesity. Methods. This non-controlled trial with a before-and-after evaluation was carried out in a school located in the municipality of Divinópolis (Brazil) among 71 students aged 6 to 10 years. The educational software about prevention of overweight and obesity was designed and then validated. The educational intervention comprised the use of the software. Before and after the intervention we applied a questionnaire based on the Ten Steps to Healthy Eating for Children, proposed by the Brazilian Ministry of Health. Results. Comparing the times before and after application of the educational software, we observed statistically significant differences in the proportion of questions answered correctly by first grade school students, mainly concerning daily eating of healthy and unhealthy food, adequate preparation of food and the importance of exercise. Conclusion. This study highlights the importance of educational actions using software to build first grade school students' knowledge about prevention of overweight and obesity.

  13. Control software of a variably polarizing undulator (APPLE type) for SX beamline in the SPring-8

    Energy Technology Data Exchange (ETDEWEB)

    Hiramatsu, Yoichi [Kansai Research Establishment, Japan Atomic Energy Research Institute, Mikazuki, Hyogo (Japan); Shimada, Taihei; Miyahara, Yoshikazu

    1999-12-01

    This paper describes the control software of a variably polarizing undulator (APPLE type) that was installed at the SX beamline (cell number 23) in the SPring-8 storage ring in February 1998. This undulator produces polarized radiation in the soft X-ray energy range by changing the gap distance between two pairs of permanent magnet arrays (gap movement). The main characteristic of the undulator is its capability to generate right and left circular polarization alternately with a period of 2 s (0.5 Hz) by high-speed phase shifting (periodic phase movement). The developed software makes a fast correction of the closed orbit distortion (COD) of the electron beam by exciting steering magnets at 24 ms intervals (42 Hz) during the movement of the magnet arrays. The software is also capable of putting these magnet arrays into a constant periodic phase movement with an error of less than 0.1% for the 2 s period. The software was developed in accordance with the SPring-8 standard for software development. (author)

  14. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program, and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.
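    The notion of statement testability can be approximated empirically: seed a fault at a statement and measure how often random black-box inputs actually reveal it. The sketch below is a toy illustration of that idea, not the sensitivity-analysis algorithm used in the paper; its rarely executed branch shows how a statement can have low testability even when the program looks well tested:

    ```python
    import random

    # Program under analysis: the guarded statement runs for only ~1% of inputs,
    # so a fault seeded there is hard to reveal by random black-box testing.
    def program(x, y, fault=False):
        z = x + y
        if x % 100 == 0:                        # statement of interest
            z = z * 3 if fault else z * 2       # seeded fault variant
        return z

    random.seed(0)
    trials = 10_000
    executed = revealed = 0
    for _ in range(trials):
        x, y = random.randint(0, 999), random.randint(0, 999)
        if x % 100 == 0:
            executed += 1
        if program(x, y) != program(x, y, fault=True):
            revealed += 1

    execution_rate = executed / trials    # how often the statement runs
    testability = revealed / trials       # how often the seeded fault shows up
    print(execution_rate, testability)
    ```

    A statement with testability around 0.01 needs on the order of hundreds of random tests before a fault there is likely to surface, which is exactly the kind of quantitative statement the abstract's analysis aims to make per statement, function and module.
    
    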

  15. The software life cycle

    CERN Document Server

    Ince, Darrel

    1990-01-01

    The Software Life Cycle deals with the software lifecycle, that is, what exactly happens when software is developed. Topics covered include aspects of software engineering, structured techniques of software development, and software project management. The use of mathematics to design and develop computer systems is also discussed. This book comprises 20 chapters divided into four sections and begins with an overview of software engineering and software development, paying particular attention to the birth of software engineering and the introduction of formal methods of software develop

  16. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  17. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  18. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  19. Software Unit Testing during the Development of Digital Reactor Protection System of HTR-PM

    International Nuclear Information System (INIS)

    Guo Chao; Xiong Huasheng; Li Duo; Zhou Shuqiao; Li Jianghai

    2014-01-01

    The Reactor Protection System (RPS) of the High Temperature Gas-Cooled Reactor - Pebble bed Module (HTR-PM) is the first digital RPS designed and to be operated in a Nuclear Power Plant (NPP) in China, and its development process has received a great deal of attention around the world. As a 1E-level safety system, the RPS has to be designed and developed following a series of nuclear laws and technical disciplines, including software verification and validation (software V&V). The software V&V process demonstrates whether all stages of software development are performed correctly, completely, accurately, and consistently, and whether the results of each stage are testable. Software testing is one of the most significant and time-consuming efforts in software V&V. In this paper, we give a comprehensive introduction to the software unit testing performed during the development of the RPS for HTR-PM. We first introduce the objectives of the testing for our project in the aspects of static testing, black-box testing, and white-box testing. Then the testing techniques, including static testing and dynamic testing, are explained, and the testing strategy we employed is introduced. We then introduce the principles of the three kinds of coverage criteria we used: statement coverage, branch coverage, and modified condition/decision coverage (MC/DC). For 1E-level safety software, test coverage is mandated to reach 100%. We then discuss the details of safety software testing during software development in HTR-PM, including the organization, methods and tools, testing stages, and testing reports. The test results and experiences are shared, and finally we draw a conclusion about the unit testing process. The introduction in this paper can contribute to improving the process of unit testing and software development for other digital instrumentation and control systems in NPPs. (author)
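    Of the three criteria, modified condition/decision coverage is the strictest: each condition in a decision must be shown to independently affect the decision's outcome. A small sketch (a hypothetical decision, not HTR-PM code) that enumerates such "independence pairs" for each condition:

    ```python
    from itertools import product

    # Example decision with three conditions.
    def decision(a, b, c):
        return (a and b) or c

    # For MC/DC, every condition needs a pair of test cases that differ only in
    # that condition and produce different decision outcomes.
    def independence_pairs(cond_index):
        pairs = []
        for case in product([False, True], repeat=3):
            flipped = list(case)
            flipped[cond_index] = not flipped[cond_index]
            if decision(*case) != decision(*flipped):
                pairs.append((case, tuple(flipped)))
        return pairs

    for i, name in enumerate("abc"):
        print(name, independence_pairs(i)[0])  # one pair per condition suffices
    ```

    Statement coverage would be satisfied by a single case with c true; branch coverage needs both outcomes of the decision; MC/DC additionally forces cases isolating each of a, b and c, which is why it is the criterion mandated for the highest-criticality software.
    
    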

  20. Development of software and modification of Q-FISH protocol for estimation of individual telomere length in immunopathology.

    Science.gov (United States)

    Barkovskaya, M Sh; Bogomolov, A G; Knauer, N Yu; Rubtsov, N B; Kozlov, V A

    2017-04-01

    Telomere length is an important indicator of proliferative cell history and potential. Decreasing telomere length in the cells of the immune system can indicate immune aging in immune-mediated and chronic inflammatory diseases. Quantitative fluorescent in situ hybridization (Q-FISH) of a labeled (C3TA2)3 peptide nucleic acid probe onto fixed metaphase cells, followed by digital image microscopy, allows the evaluation of telomere length in the arms of individual chromosomes. Computer-assisted analysis of microscopic images can provide quantitative information on the number of telomeric repeats in individual telomeres. We developed new software to estimate telomere length. The MeTeLen software contains new options that can be used to solve some Q-FISH and microscopy problems, including correction of irregular light effects and elimination of background fluorescence. The identification and description of chromosomes and chromosome regions are essential to the Q-FISH technique. To improve the quality of cytogenetic analysis after Q-FISH, we optimized the temperature and time of DNA denaturation to get better DAPI-banding of metaphase chromosomes. MeTeLen was tested by comparing telomere length estimations for sister chromatids, background fluorescence estimations, and correction of non-uniform light effects. The application of the developed software to the analysis of telomere length in patients with rheumatoid arthritis was demonstrated.
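    Background-fluorescence elimination, one of the corrections mentioned above, can be sketched very simply: estimate the background level robustly (here with the image median) and subtract it before integrating a spot's intensity. This toy example uses a synthetic image and illustrates only the general idea, not MeTeLen's actual algorithm:

    ```python
    import numpy as np

    # Toy fluorescence image: uniform background plus one bright telomere spot.
    rng = np.random.default_rng(2)
    image = rng.normal(100.0, 2.0, (64, 64))      # background level ~100 counts
    image[30:34, 30:34] += 50.0                   # 16-pixel spot, +50 counts each

    # Median over the whole image is robust against the few bright spot pixels,
    # so it estimates the background without masking the spot first.
    background = np.median(image)
    spot = image[30:34, 30:34] - background
    intensity = spot.sum()
    print(round(intensity))  # ~16 pixels * 50 counts = ~800, plus noise
    ```

    Without the subtraction, the integrated value would be inflated by 16 times the background level, swamping the telomere signal; irregular illumination would additionally require a spatially varying background model rather than a single scalar.
    
    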

  1. Calidad del software: camino hacia una verdadera industria del software

    Directory of Open Access Journals (Sweden)

    Saulo Ernesto Rojas Salamanca

    1999-07-01

    Full Text Available Software is perhaps one of the engineering products that has evolved the most in a very short time, going from empirical or artisanal software to software developed under the principles and tools of software engineering. Throughout these changes, however, the people in charge of building software have faced very common problems: some due to the ever-greater demands on the software's capabilities, driven by permanently changing conditions that increase its complexity and obsolescence; and others due to the lack of adequate tools and organizational standards aimed at improving the processes of software development. This article is oriented toward the search for mechanisms to solve the latter problems...

  2. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    Science.gov (United States)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.

  3. Inter-comparison and Quality Assurance of acquisition and processing software for MUGA studies in Cuba

    International Nuclear Information System (INIS)

    Lopez, A.; Ponce, F.; Peix, A.; Gonzalez, J.; Perez, M.; Diaz, M.

    2002-01-01

    With the purpose of creating the basis for quality control and quality assurance of the acquisition and processing programs for gated cardiac blood-pool (MUGA) studies, we used the VENSTRA cardiac function phantom on 7 cameras (4 SOPHA DSX-1000, 2 GE IMAGAMMA-2001 and 1 SIEMENS HERMES) and made 3 acquisitions for each global left ventricular ejection fraction (LVEF 30%, 60% and 80%) and for each heart rate (HR 40, 80 and 160 beats/min). The planar resolution and planar uniformity were adequate in all the equipment. Differences of less than 5% were found between the acquisition and processing programs. To evaluate the processing program without the influence of the acquisition parameters, we used one group of these images as a software phantom and tested the semi-automatic software on all cameras. The semi-automatic protocol showed differences of less than 3% between software packages. The automatic processing software for gated cardiac studies was checked with the COST-B2 software phantom; the difference between the left ventricular ejection fractions calculated by these software packages was less than 5%, and the regional wall motion analysis was completely coincident in 93% of the cases. The use of the VENSTRA and COST-B2 phantoms confirms the correct functioning of the acquisition and LVEF calculation software for MUGA studies in 83% of Cuban nuclear medicine centers

  4. V-1 nuclear power plant standby RPP-16S computer software

    International Nuclear Information System (INIS)

    Suchy, R.

    1988-01-01

    The software structure and the functions of the program modules of the RPP-16S standby computer, which is part of the information system of the V-1 Bohunice nuclear power plant, are described. The multitasking AMOS operating system is used for the organization of programs in the computer. The program modules are classified by function into five groups: modules for the periodic collection of values and for the measurement of process quantities for both nuclear power plant units; for the primary processing of the values; for monitoring exceedance of preset limits; and for unit operators' communication with the computer. The fifth group consists of user program modules. The standby computer software was tested in the actual operating conditions of the V-1 power plant. The results showed it operated correctly; minor shortcomings were removed. (Z.M.). 1 fig

  5. Software Radar Technology

    Directory of Open Access Journals (Sweden)

    Tang Jun

    2015-08-01

    Full Text Available In this paper, the definition and the key features of Software Radar, a new concept, are proposed and discussed. We consider the development of modern radar system technology to be divided into three stages: Digital Radar, Software Radar and Intelligent Radar, and the second stage is just commencing now. A Software Radar system should be a combination of various modern digital modular components conforming to certain software and hardware standards. Moreover, a Software Radar system with an open system architecture that supports decoupling application software from low-level hardware can easily adopt a "user-requirements-oriented" development methodology instead of the traditional "specific-function-oriented" one. Compared with traditional Digital Radar, a Software Radar system can be easily reconfigured and scaled up or down to adapt to changes in requirements and technologies. A demonstration Software Radar signal processing system, RadarLab 2.0, which has been developed by Tsinghua University, is introduced in this paper, and suggestions for the future development of Software Radar in China are given in the conclusion.

  6. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods are complementary, and further research on combining the two methods, so that software reliability analysis benefits from this complementarity, is recommended.

  7. Regression dilution bias: tools for correction methods and sample size calculation.

    Science.gov (United States)

    Berglund, Lars

    2012-08-01

    Random errors in measurement of a risk factor will introduce downward bias of an estimated association to a disease or a disease marker. This phenomenon is called regression dilution bias. A bias correction may be made with data from a validity study or a reliability study. In this article we give a non-technical description of designs of reliability studies with emphasis on selection of individuals for a repeated measurement, assumptions of measurement error models, and correction methods for the slope in a simple linear regression model where the dependent variable is a continuous variable. Also, we describe situations where correction for regression dilution bias is not appropriate. The methods are illustrated with the association between insulin sensitivity measured with the euglycaemic insulin clamp technique and fasting insulin, where measurement of the latter variable carries noticeable random error. We provide software tools for estimation of a corrected slope in a simple linear regression model assuming data for a continuous dependent variable and a continuous risk factor from a main study and an additional measurement of the risk factor in a reliability study. Also, we supply programs for estimation of the number of individuals needed in the reliability study and for choice of its design. Our conclusion is that correction for regression dilution bias is seldom applied in epidemiological studies. This may cause important effects of risk factors with large measurement errors to be neglected.
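    The core correction is simple: the naive slope from the error-prone measurement is divided by the regression dilution ratio (the reliability ratio) estimated from a reliability study with a repeated measurement. A minimal simulation of this design (not the author's software; values are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 2000

    # True risk factor and continuous outcome with true slope 2.0.
    x_true = rng.normal(0, 1, n)
    y = 2.0 * x_true + rng.normal(0, 1, n)

    # Main study: the risk factor is measured with random error.
    x_obs = x_true + rng.normal(0, 0.7, n)

    # Naive slope is biased towards zero: regression dilution.
    naive = np.polyfit(x_obs, y, 1)[0]

    # Reliability study: a second, independent measurement of the risk factor
    # gives lambda = cov(measurement 1, measurement 2) / var(measurement 1),
    # which estimates var(true) / var(observed).
    x_rep = x_true + rng.normal(0, 0.7, n)
    lam = np.cov(x_obs, x_rep)[0, 1] / x_obs.var(ddof=1)
    corrected = naive / lam
    print(round(naive, 2), round(corrected, 2))
    ```

    With an error variance of 0.49 against a true variance of 1, the reliability ratio is about 0.67, so the naive slope of roughly 1.34 is restored to about 2.0 after division. The same logic is why the correction is inappropriate when the measurement error is shared with the outcome or the repeat is not independent.
    
    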

  8. Possibilities and limitations of applying software reliability growth models to safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2007-01-01

    It is generally known that software reliability growth models such as the Jelinski-Moranda model and the Goel-Okumoto's Non-Homogeneous Poisson Process (NHPP) model cannot be applied to safety-critical software due to a lack of software failure data. In this paper, by applying two of the most widely known software reliability growth models to sample software failure data, we demonstrate the possibility of using the software reliability growth models to prove the high reliability of safety-critical software. The high sensitivity of a piece of software's reliability to software failure data, as well as a lack of sufficient software failure data, is also identified as a possible limitation when applying the software reliability growth models to safety-critical software
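    As a concrete illustration of such a model, the Goel-Okumoto NHPP assumes the expected cumulative number of failures follows m(t) = a(1 - e^(-bt)), where a is the total number of faults and b a per-fault detection rate. Below is a hedged sketch with hypothetical failure data, fitted by least squares over a grid of b values (real applications typically use maximum likelihood, and, as the abstract stresses, safety-critical software rarely yields enough failure data for this at all):

    ```python
    import numpy as np

    # Hypothetical cumulative failure counts observed at test times 1..10.
    t = np.arange(1, 11, dtype=float)
    m_obs = np.array([5, 9, 13, 15, 18, 19, 21, 22, 22, 23], dtype=float)

    # Goel-Okumoto mean value function m(t) = a * (1 - exp(-b t)):
    # grid-search b, with the optimal a in closed form for each b.
    best = None
    for b_try in np.linspace(0.01, 1.0, 200):
        g = 1.0 - np.exp(-b_try * t)
        a_try = (m_obs @ g) / (g @ g)          # least-squares a given b
        rss_try = np.sum((m_obs - a_try * g) ** 2)
        if best is None or rss_try < best[0]:
            best = (rss_try, a_try, b_try)

    rss, a, b = best
    residual_faults = a - m_obs[-1]            # expected faults still latent
    print(round(a, 1), round(b, 3), round(residual_faults, 1))
    ```

    The fitted asymptote a exceeds the 23 failures seen so far, and the gap is the model's estimate of remaining faults, which is exactly the quantity whose sensitivity to sparse failure data the paper identifies as a limitation.
    
    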

  9. Interface-based software testing

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-10-01

    Full Text Available Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of software architecture is to design quality of software through modeling and visualization. There are many methods and standards that define how to control and manage quality. However, many IT software development projects still fail due to the difficulties involved in measuring, controlling, and managing software quality. Software quality failure factors are numerous. Examples include beginning to test software too late in the development process, or failing properly to understand, or design, the software architecture and the software component structure. The goal of this article is to provide an interface-based software testing technique that better measures software quality, automates software quality testing, encourages early testing, and increases the software’s overall testability
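    One way to make the interface the unit of testing, as argued above, is to write the test once against an abstract interface and run it against every implementation. A minimal sketch (the names are illustrative, not taken from the article):

    ```python
    from abc import ABC, abstractmethod

    # The contract is defined as an interface; tests target the interface,
    # not any particular implementation behind it.
    class Store(ABC):
        @abstractmethod
        def put(self, key, value): ...
        @abstractmethod
        def get(self, key): ...

    class MemoryStore(Store):
        """One concrete implementation; a database-backed one would also qualify."""
        def __init__(self):
            self._data = {}
        def put(self, key, value):
            self._data[key] = value
        def get(self, key):
            return self._data.get(key)

    def check_store_contract(store: Store):
        """Interface-level test: must hold for every conforming implementation."""
        store.put("k", 1)
        assert store.get("k") == 1
        assert store.get("missing") is None

    check_store_contract(MemoryStore())
    print("contract holds")
    ```

    Because the test suite depends only on the interface, swapping in a new implementation automatically reuses the whole suite, which is the testability and automation gain the article is after.
    
    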

  10. A brain MRI bias field correction method created in the Gaussian multi-scale space

    Science.gov (United States)

    Chen, Mingsheng; Qin, Mingxin

    2017-07-01

    A pre-processing step is needed to correct for the bias field signal before submitting corrupted MR images to image-processing algorithms. This study presents a new bias field correction method. The method creates a Gaussian multi-scale space by convolution of the inhomogeneous MR image with a two-dimensional Gaussian function. In the multi-scale Gaussian space, the method retrieves the image details from the difference between the original image and the convolved image. It then obtains an image whose inhomogeneity is eliminated by a weighted sum of the image details in each layer of the space. Next, the bias field-corrected MR image is retrieved after a γ (gamma) correction, which enhances the contrast and brightness of the inhomogeneity-eliminated MR image. We have tested the approach on T1 MRI and T2 MRI with varying bias field levels and have achieved satisfactory results. Comparison experiments with popular software have demonstrated the superior performance of the proposed method in terms of quantitative indices, especially an improvement in subsequent image segmentation.
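    The pipeline described above (blur at several Gaussian scales, keep the original-minus-blur details, recombine them with weights, then apply a gamma correction) can be sketched as follows. The scales, weights and gamma value here are made-up illustration parameters, and the naive zero-padded convolution introduces edge artifacts that a real implementation would handle:

    ```python
    import numpy as np

    def gaussian_blur(img, sigma):
        """Separable Gaussian convolution ('same' mode, zero-padded at edges)."""
        r = int(3 * sigma)
        x = np.arange(-r, r + 1)
        k = np.exp(-x**2 / (2 * sigma**2))
        k /= k.sum()
        tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 0, img)
        return np.apply_along_axis(lambda m: np.convolve(m, k, mode="same"), 1, tmp)

    # Toy "MR slice": striped tissue intensities times a slowly varying bias field.
    yy, xx = np.mgrid[0:64, 0:64]
    tissue = np.where((xx // 16) % 2 == 0, 1.0, 2.0)
    bias = 1.0 + 0.4 * (xx / 63.0)
    corrupted = tissue * bias

    # Details (original minus blur) at each scale survive removal of the slowly
    # varying component; a weighted sum across layers rebuilds the image.
    scales, weights = [2, 4, 8], [0.5, 0.3, 0.2]
    details = sum(w * (corrupted - gaussian_blur(corrupted, s))
                  for s, w in zip(scales, weights))
    restored = details - details.min()          # shift to non-negative values

    # Gamma correction restores contrast/brightness (gamma < 1 brightens).
    gamma = 0.8
    restored = (restored / restored.max()) ** gamma
    print(restored.shape)
    ```

    The key observation is that the bias field lives at much coarser spatial scales than tissue boundaries, so subtracting Gaussian-blurred copies suppresses it while the detail layers retain the anatomy.
    
    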

  11. Agile software assessment

    OpenAIRE

    Nierstrasz Oscar; Lungu Mircea

    2012-01-01

    Informed decision making is a critical activity in software development, but it is poorly supported by common development environments, which focus mainly on low-level programming tasks. We posit the need for agile software assessment, which aims to support decision making by enabling rapid and effective construction of software models and custom analyses. Agile software assessment entails gathering and exploiting the broader context of software information related to the system at hand as well ...

  12. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement, which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures and has significantly improved the efficiency and standardization of the hazard analysis process.

  13. CHEMOSTAT, UM SOFTWARE GRATUITO PARA ANÁLISE EXPLORATÓRIA DE DADOS MULTIVARIADOS

    Directory of Open Access Journals (Sweden)

    Gilson A. Helfer

    2015-05-01

    Full Text Available The objective of this work was to develop a free, easy-to-install exploratory data analysis software application for academic use that can be operated without user-level programming, given the extensive use of chemometrics and its association with applications that require purchased licenses or routines. The developed software, called Chemostat, employs Hierarchical Cluster Analysis (HCA), Principal Component Analysis (PCA), and interval Principal Component Analysis (iPCA), as well as correction methods, data transformation and outlier detection. Data can be imported from the clipboard, text files, ASCII files, or FT-IR Perkin-Elmer “.sp” files. It generates a variety of charts and tables that allow the analysis of results, which can be exported in several formats. The main features of the software were tested using mid-infrared and near-infrared spectra of vegetable oils and digital images obtained from different types of commercial diesel. In order to validate the software results, the same sets of data were analyzed using Matlab© and the results in both applications matched in various combinations. In addition to the desktop version, the reuse of algorithms allowed an online version to be provided that offers a unique experience on the web. Both applications are available in English.

  14. Corrective Jaw Surgery

    Medline Plus

    Full Text Available ... out more. Corrective Jaw Surgery Orthognathic surgery is performed to correct the misalignment of jaws ...

  15. Fast shading correction for cone beam CT in radiation therapy via sparse sampling on planning CT.

    Science.gov (United States)

    Shi, Linxi; Tsui, Tiffany; Wei, Jikun; Zhu, Lei

    2017-05-01

    The image quality of cone beam computed tomography (CBCT) is limited by severe shading artifacts, hindering its quantitative applications in radiation therapy. In this work, we propose an image-domain shading correction method using planning CT (pCT) as prior information, which is highly adaptive to the clinical environment. We propose to perform shading correction via sparse sampling on pCT. The method starts with a coarse mapping between the first-pass CBCT images obtained from the Varian TrueBeam system and the pCT. The scatter correction method embedded in the Varian commercial software removes some image errors, but the CBCT images still contain severe shading artifacts. The difference images between the mapped pCT and the CBCT are considered as shading errors, but only sparse shading samples are selected for correction, using empirical constraints to avoid carrying over false information from pCT. A Fourier-transform-based technique, referred to as local filtration, is proposed to efficiently process the sparse data for effective shading correction. The performance of the proposed method is evaluated on one anthropomorphic pelvis phantom and 17 patients who were scheduled for radiation therapy. (The code of the proposed method and sample data can be downloaded from https://sites.google.com/view/linxicbct.) The proposed shading correction substantially improves the CBCT image quality on both the phantom and the patients to a level close to that of the pCT images. On the phantom, the spatial nonuniformity (SNU) difference between CBCT and pCT is reduced from 74 to 1 HU. The root-mean-square difference of SNU between CBCT and pCT is reduced from 83 to 10 HU on the pelvis patients, and from 101 to 12 HU on the thorax patients. The robustness of the proposed shading correction is fully investigated with simulated registration errors between CBCT and pCT on the phantom and mis-registration on patients. The sparse sampling scheme of our method successfully
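    The paper's local-filtration details are not reproduced in the abstract, but the core idea — treat the pCT-minus-CBCT difference as a shading estimate and keep only its smooth, low-frequency part — can be sketched with a Fourier low-pass. This is an illustrative simplification; the cutoff fraction `keep_frac` is a hypothetical parameter, and the real method operates on sparse samples in a moving-window manner.

```python
import numpy as np

def estimate_shading(cbct, pct, keep_frac=0.05):
    # Difference between mapped pCT and CBCT approximates the shading error
    diff = pct - cbct
    F = np.fft.fft2(diff)
    h, w = diff.shape
    kh, kw = max(1, int(h * keep_frac)), max(1, int(w * keep_frac))
    # Keep only the lowest spatial frequencies (corners of the FFT array)
    mask = np.zeros((h, w), dtype=bool)
    mask[:kh, :kw] = mask[:kh, -kw:] = True
    mask[-kh:, :kw] = mask[-kh:, -kw:] = True
    return np.real(np.fft.ifft2(np.where(mask, F, 0)))

# corrected = cbct + estimate_shading(cbct, pct)
```

    Because only low frequencies of the difference survive, anatomical detail that differs between the two scans is not carried over into the corrected image.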

  16. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    Science.gov (United States)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  17. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    Eckhardt, D.E. and L.D. Lee. 1985. A theoretical basis for the analysis of multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Knight, J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software Eng.

  18. Analysis of the Failures and Corrective Actions for the LHC Cryogenics Radiation Tolerant Electronics and its Field Instruments

    CERN Document Server

    Balle, Ch; Vauthier, N

    2014-01-01

    The LHC cryogenic system radiation tolerant electronics and their associated field instruments have been in nominal conditions since before the commissioning of the first LHC beams in September 2008. This system is made of about 15’000 field instruments (thermometers, pressure sensors, liquid helium level gauges, electrical heaters and position switches), 7’500 electronic cards and 853 electronic crates. Since mid-2008 a software tool has been deployed; it allows an operator to report a problem and then lists the corrective actions. The tool is a great help in detecting recurrent problems that may be tackled by a hardware or software consolidation. The corrective actions range from simple resets, exchange of defective equipment, repair of electrical connectors, etc. However, a recurrent problem that heals by itself is present on some channels. This type of fault is extremely difficult to diagnose and it appears as a temporary opening of an electrical circuit; its duration can range from a few minutes to ...

  19. Addressing Software Engineering Issues in Real-Time Software ...

    African Journals Online (AJOL)

    Addressing Software Engineering Issues in Real-Time Software ... systems, manufacturing process, process control, military, space exploration, and ... but also physical properties such as timeliness, Quality of Service and reliability.

  20. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software application developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software can be used as a valid and useful tool for underwater motion analysis.
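    The study's 4-pixel intervention criterion lends itself to a simple metric. A minimal sketch (the function name and array layout are our own, not the paper's):

```python
import numpy as np

def intervention_rate(tracked, reference, tol_px=4.0):
    """Fraction of frames where the tracker drifts more than tol_px pixels
    from the manually digitized reference position, i.e. where an operator
    would have had to intervene. tracked/reference: (N, 2) pixel coordinates."""
    d = np.linalg.norm(np.asarray(tracked) - np.asarray(reference), axis=1)
    return float(np.mean(d > tol_px))
```

    Computed over all markers and frames, this is exactly the "proportion of manual interventions" the authors report (e.g. 7.4% for DVP vs 17.8% for COM).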

  1. Analysis of the failures and corrective actions for the LHC cryogenics radiation tolerant electronics and its field instruments

    Energy Technology Data Exchange (ETDEWEB)

    Balle, Christoph; Casas, Juan; Vauthier, Nicolas [CERN, TE Department, 1211 Geneva (Switzerland)

    2014-01-29

    The LHC cryogenic system radiation tolerant electronics and their associated field instruments have been in nominal conditions since before the commissioning of the first LHC beams in September 2008. This system is made of about 15’000 field instruments (thermometers, pressure sensors, liquid helium level gauges, electrical heaters and position switches), 7’500 electronic cards and 853 electronic crates. Since mid-2008 a software tool has been deployed; it allows an operator to report a problem and then lists the corrective actions. The tool is a great help in detecting recurrent problems that may be tackled by a hardware or software consolidation. The corrective actions range from simple resets, exchange of defective equipment, repair of electrical connectors, etc. However, a recurrent problem that heals by itself is present on some channels. This type of fault is extremely difficult to diagnose and it appears as a temporary opening of an electrical circuit; its duration can range from a few minutes to several months. This paper presents the main types of problems encountered during the last four years, their evolution over time, the various hardware or software consolidations that have resulted, and whether they have had an impact on the availability of the LHC beam.

  2. Commercial Literacy Software.

    Science.gov (United States)

    Balajthy, Ernest

    1997-01-01

    Presents the first year's results of a continuing project to monitor the availability of software of relevance for literacy education purposes. Concludes there is an enormous amount of software available for use by teachers of reading and literacy--whereas drill-and-practice software is the largest category of software available, large numbers of…

  3. Software Quality Assurance Metrics

    Science.gov (United States)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects which have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  4. Hybrid wavefront sensing and image correction algorithm for imaging through turbulent media

    Science.gov (United States)

    Wu, Chensheng; Robertson Rzasa, John; Ko, Jonathan; Davis, Christopher C.

    2017-09-01

    It is well known that passive image correction of turbulence distortions often involves using geometry-dependent deconvolution algorithms. On the other hand, active imaging techniques using adaptive optic correction should use the distorted wavefront information for guidance. Our work shows that a hybrid hardware-software approach makes it possible to obtain accurate and highly detailed images through turbulent media. The processing algorithm also requires far fewer iterations than conventional image-processing algorithms. In our proposed approach, a plenoptic sensor is used as a wavefront sensor to guide post-stage image correction on a high-definition zoomable camera. Conversely, we show that given the ground truth of the highly detailed image and the plenoptic imaging result, we can generate an accurate prediction of the blurred image on a traditional zoomable camera. Similarly, the ground truth combined with the blurred image from the zoomable camera would provide the wavefront conditions. In application, our hybrid approach can be used as an effective way to conduct object recognition in a turbulent environment where the target has been significantly distorted or is even unrecognizable.

  5. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    Science.gov (United States)

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  6. Quantitative 177Lu-SPECT/CT imaging and validation of a commercial dosimetry software

    International Nuclear Information System (INIS)

    D'Ambrosio, L.; Aloj, L.; Morisco, A.; Aurilio, M.; Prisco, A.; Di Gennaro, F.; Lastoria, S.; Madesani, D.

    2015-01-01

    Full text of publication follows. Aim: 3D dosimetry is an appealing yet complex application of SPECT/CT in patients undergoing radionuclide therapy. In this study we have developed a quantitative imaging protocol and we have validated commercially available dosimetry software (Dosimetry Tool-kit Package, GE Healthcare) in patients undergoing 177 Lu-DOTATATE therapy. Materials and methods: the Dosimetry Tool-kit uses multiple SPECT/CT and/or whole-body planar datasets for quantifying changes in radiopharmaceutical uptake over time to determine residence times. This software includes tools for performing reconstruction of SPECT/CT data, registration of all scans to a common reference, segmentation of the different organs, creating time-activity curves, curve fitting and calculation of residence times. All acquisitions were performed using a hybrid dual-head SPECT/CT camera (Discovery 670, GE Healthcare) equipped with a medium-energy collimator using a triple-energy window. SPECT images were reconstructed using an iterative reconstruction algorithm with attenuation, scatter and collimator depth-dependent three-dimensional resolution recovery correction. Camera sensitivity and dead time were evaluated. Accuracy of activity quantification was assessed on a large homogeneous source with the addition of attenuating/scattering medium. A NEMA/IEC body phantom was used to measure the recovery coefficient, which the software does not take into account. The residence times for organs at risk were calculated in five patients. OLINDA/EXM software was used to calculate absorbed doses. Results: the 177 Lu sensitivity factor was 13 counts/MBq/s. Dead time was <3% with 1.11 GBq in the field of view. The measured activity was consistent with the decay-corrected calibrated activity for large volumes (>100 cc). The recovery coefficient varied from 0.71 (26.5 ml) to 0.16 (2.5 ml) in the absence of background activity and from 0.58 to 0.13 with a source-to-background activity concentration ratio of 20:1. The
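    As a sketch of how a recovery-coefficient (RC) correction of this kind is typically applied, one can interpolate RC against object volume and divide the measured activity by it. Only the two endpoint RC values below (0.16 at 2.5 ml, 0.71 at 26.5 ml, no-background case) come from the abstract; the intermediate calibration points are invented for illustration.

```python
import numpy as np

# Hypothetical RC calibration table (sphere volume in ml -> recovery
# coefficient); endpoints taken from the reported no-background values.
volumes_ml = np.array([2.5, 5.5, 11.5, 26.5])
rc         = np.array([0.16, 0.35, 0.55, 0.71])

def pvc_activity(measured_mbq, volume_ml):
    """Partial-volume-corrected activity: divide by the interpolated RC."""
    r = np.interp(volume_ml, volumes_ml, rc)
    return measured_mbq / r
```

    For small structures the correction is large (a factor of 1/0.16 ≈ 6 at 2.5 ml), which is why ignoring RC in the dosimetry software matters most for small organs and lesions.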

  7. Accuracy of Automatic Cephalometric Software on Landmark Identification

    Science.gov (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study aimed to assess the accuracy of an automatic cephalometric analysis software in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used in this study. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft-tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences of distances of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p < 0.05) were found in both horizontal and vertical directions, with a notable mean difference for A-point (3.04 mm) in the vertical direction. Only 5 of 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the position of landmarks in order to increase the accuracy of the cephalometric analysis.

  8. Software engineering

    CERN Document Server

    Sommerville, Ian

    2016-01-01

    For courses in computer science and software engineering The Fundamental Practice of Software Engineering Software Engineering introduces readers to the overwhelmingly important subject of software programming and development. In the past few years, computer systems have come to dominate not just our technological growth, but the foundations of our world's major industries. This text seeks to lay out the fundamental concepts of this huge and continually growing subject area in a clear and comprehensive manner. The Tenth Edition contains new information that highlights various technological updates of recent years, providing readers with highly relevant and current information. Sommerville's experience in system dependability and systems engineering guides the text through a traditional plan-based approach that incorporates some novel agile methods. The text strives to teach the innovators of tomorrow how to create software that will make our world a better, safer, and more advanced place to live.

  9. Agile Software Development

    Science.gov (United States)

    Biju, Soly Mathew

    2008-01-01

    Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…

  10. Application of GESPECOR software for the calculation of coincidence summing effects in special cases

    International Nuclear Information System (INIS)

    Arnold, Dirk; Sima, Octavian

    2004-01-01

    In this work, coincidence summing correction factors have been measured for 133 Ba, 152 Eu and 88 Y point sources with a 50% relative efficiency p-type detector and a 25% relative efficiency n-type detector in two close-to-detector measurement geometries. The experimental data for 133 Ba and 152 Eu and the results obtained with the GESPECOR software reveal a complex structure of the conventional dead layer of the p-type detector. The high value of the coincidence summing correction factor for the 511 keV peak of 88 Y, in agreement with the values computed by GESPECOR, cautions in this case against the application of the semiempirical method for evaluating coincidence summing effects.

  11. Establishing software quality assurance

    International Nuclear Information System (INIS)

    Malsbury, J.

    1983-01-01

    This paper is concerned with four questions about establishing software QA: What is software QA? Why have software QA? What is the role of software QA? What is necessary to ensure the success of software QA?

  12. Applied software risk management a guide for software project managers

    CERN Document Server

    Pandian, C Ravindranath

    2006-01-01

    Few software projects are completed on time, on budget, and to their original specifications. Focusing on what practitioners need to know about risk in the pursuit of delivering software projects, Applied Software Risk Management: A Guide for Software Project Managers covers key components of the risk management process and the software development process, as well as best practices for software risk identification, risk planning, and risk analysis. Written in a clear and concise manner, this resource presents concepts and practical insight into managing risk. It first covers risk-driven project management, risk management processes, risk attributes, risk identification, and risk analysis. The book continues by examining responses to risk, the tracking and modeling of risks, intelligence gathering, and integrated risk management. It concludes with details on drafting and implementing procedures. A diary of a risk manager provides insight in implementing risk management processes.Bringing together concepts ...

  13. Software maintenance and evolution and automated software engineering

    NARCIS (Netherlands)

    Carver, Jeffrey C.; Serebrenik, Alexander

    2018-01-01

    This issue's column reports on the 33rd International Conference on Software Maintenance and Evolution and 32nd International Conference on Automated Software Engineering. Topics include flaky tests, technical debt, QA bots, and regular expressions.

  14. Software development for the RF measurement and analysis of RFQ accelerator

    International Nuclear Information System (INIS)

    Fu Shinian

    2002-01-01

    In a high current RFQ accelerator, it is required to tightly control the beam losses and beam emittance growth. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components and, eventually, to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The author will present the code development on this platform for the RFQ measurement and analysis, including some applications of the codes.

  15. Software development for the RF measurement and analysis of RFQ accelerator

    CERN Document Server

    Fu Shinian

    2002-01-01

    In a high current RFQ accelerator, it is required to tightly control the beam losses and beam emittance growth. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components and, eventually, to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The author will present the code development on this platform for the RFQ measurement and analysis, including some applications of the codes.

  16. Software development for the RF measurement and analysis of RFQ accelerator

    International Nuclear Information System (INIS)

    Fu Shinian

    2002-01-01

    In a high current RFQ accelerator, it is required to tightly control the beam losses and beam emittance growth. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components and, eventually, to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The authors present their code development on this platform for the RFQ measurement and analysis, including some applications of the codes.

  17. An off-the-shelf guider for the Palomar 200-inch telescope: interfacing amateur astronomy software with professional telescopes for an easy life

    Science.gov (United States)

    Clarke, Fraser; Lynn, James; Thatte, Niranjan; Tecza, Matthias

    2014-08-01

    We have developed a simple but effective guider for use with the Oxford-SWIFT integral field spectrograph on the Palomar 200-inch telescope. The guider uses mainly off-the-shelf components, including commercial amateur astronomy software to interface with the CCD camera, calculate guiding corrections, and send guide commands to the telescope. The only custom piece of software is a driver that provides an interface between the Palomar telescope control system and the industry-standard 'ASCOM' system. Using existing commercial software provided a very cheap guider, and the system could easily be adapted to any other professional telescope.

  18. Harmonized Constraints in Software Engineering and Acquisition Process Management Requirements are the Clue to Meet Future Performance Goals Successfully in an Environment of Scarce Resources

    National Research Council Canada - National Science Library

    Reich, Holger

    2008-01-01

    This MBA project investigates the importance of correctly deriving requirements from the capability gap and operational environment, and translating them into the processes of contracting, software...

  19. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of software redundancy methods; maintenance of software for long-term operating behavior. (HP) [de]

  20. Operator quantum error-correcting subsystems for self-correcting quantum memories

    International Nuclear Information System (INIS)

    Bacon, Dave

    2006-01-01

    The most general method for encoding quantum information is not to encode the information into a subspace of a Hilbert space, but to encode information into a subsystem of a Hilbert space. Recently this notion has led to a more general notion of quantum error correction known as operator quantum error correction. In standard quantum error-correcting codes, one requires the ability to apply a procedure which exactly reverses, on the error-correcting subspace, any correctable error. In contrast, for operator error-correcting subsystems, the correction procedure need not undo the error which has occurred; instead one must perform corrections only modulo the subsystem structure. This does not lead to codes which differ from subspace codes, but does lead to recovery routines which explicitly make use of the subsystem structure. Here we present two examples of such operator error-correcting subsystems. These examples are motivated by simple spatially local Hamiltonians on square and cubic lattices. In three dimensions we provide evidence, in the form of a simple mean-field theory, that our Hamiltonian gives rise to a system which is self-correcting. Such a system would be a natural high-temperature quantum memory, robust to noise without external intervening quantum error-correction procedures.

  1. Correction method and software for image distortion and nonuniform response in charge-coupled device-based x-ray detectors utilizing x-ray image intensifier

    International Nuclear Information System (INIS)

    Ito, Kazuki; Kamikubo, Hironari; Yagi, Naoto; Amemiya, Yoshiyuki

    2005-01-01

    An on-site method of correcting the image distortion and nonuniform response of a charge-coupled device (CCD)-based X-ray detector was developed using the response of the imaging plate as a reference. The CCD-based X-ray detector consists of a beryllium-windowed X-ray image intensifier (Be-XRII) and a CCD as the image sensor. An image distortion of 29% was improved to less than 1% after the correction. In the correction of nonuniform response due to image distortion, subpixel approximation was performed for the redistribution of pixel values. The optimal number of subpixels was also discussed. In an experiment with polystyrene (PS) latex, it was verified that the correction of both image distortion and nonuniform response worked properly. The correction for the 'contrast reduction' problem was also demonstrated for an isotropic X-ray scattering pattern from the PS latex. (author)

  2. Impact of Internet of Things on Software Business Model and Software Industry

    OpenAIRE

    Murari, Bhanu Teja

    2016-01-01

    Context: Internet of things (IoT) technology is spreading rapidly and is changing the business environment for software organizations. There is a need to understand which business-model factors a software company should focus on to obtain benefits from the potential that IoT offers. This thesis also focuses on finding the impact of IoT on the software business model and the software industry, especially on software development. Objectives: In this thesis, we do research on IoT software b...

  3. AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data

    Directory of Open Access Journals (Sweden)

    Daniel Scheffler

    2017-07-01

    Geospatial co-registration is a mandatory prerequisite when dealing with remote sensing data. Inter- or intra-sensoral misregistration will negatively affect any subsequent image analysis, specifically when processing multi-sensoral or multi-temporal data. In recent decades, many algorithms have been developed to enable manual, semi- or fully automatic displacement correction. Especially in the context of big data processing and the development of automated processing chains that aim to be applicable to different remote sensing systems, there is a strong need for efficient, accurate and generally usable co-registration. Here, we present AROSICS (Automated and Robust Open-Source Image Co-Registration Software), a Python-based open-source software package including an easy-to-use user interface for automatic detection and correction of sub-pixel misalignments between various remote sensing datasets. It is independent of spatial or spectral characteristics and robust against high degrees of cloud coverage and spectral and temporal land cover dynamics. The co-registration is based on phase correlation for sub-pixel shift estimation in the frequency domain, utilizing the Fourier shift theorem in a moving-window manner. A dense grid of spatial shift vectors can be created and automatically filtered by combining various validation and quality estimation metrics. Additionally, the software supports the masking of, e.g., clouds and cloud shadows to exclude such areas from spatial shift detection. The software has been tested on more than 9000 satellite images acquired by different sensors. The results are evaluated exemplarily for two inter-sensoral and two intra-sensoral use cases and show registration results in the sub-pixel range, with root-mean-square errors around 0.3 pixels or better.
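    The window-wise shift estimation rests on phase correlation and the Fourier shift theorem; a minimal integer-pixel sketch (AROSICS itself refines the peak to sub-pixel precision) might look like this:

```python
import numpy as np

def phase_corr_shift(ref, tgt):
    """Estimate the (dy, dx) shift that aligns tgt to ref via phase
    correlation: the normalized cross-power spectrum of two shifted images
    is a pure phase ramp whose inverse FFT peaks at the displacement.
    The returned shift can be passed directly to np.roll."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(tgt))
    F /= np.abs(F) + 1e-12                    # keep phase information only
    corr = np.abs(np.fft.ifft2(F))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peaks past the centre around to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

    Because only the phase is kept, the peak is sharp even under strong radiometric differences between the two acquisitions, which is why the method tolerates spectral and temporal land cover dynamics.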

  4. Pixel-based CTE Correction of ACS/WFC: Modifications To The ACS Calibration Pipeline (CALACS)

    Science.gov (United States)

    Smith, Linda J.; Anderson, J.; Armstrong, A.; Avila, R.; Bedin, L.; Chiaberge, M.; Davis, M.; Ferguson, B.; Fruchter, A.; Golimowski, D.; Grogin, N.; Hack, W.; Lim, P. L.; Lucas, R.; Maybhate, A.; McMaster, M.; Ogaz, S.; Suchkov, A.; Ubeda, L.

    2012-01-01

    The Advanced Camera for Surveys (ACS) was installed on the Hubble Space Telescope (HST) nearly ten years ago. Over the last decade, continuous exposure to the harsh radiation environment has degraded the charge transfer efficiency (CTE) of the CCDs. The worsening CTE impacts the science that can be obtained by altering the photometric, astrometric and morphological characteristics of sources, particularly those farthest from the readout amplifiers. To ameliorate these effects, Anderson & Bedin (2010, PASP, 122, 1035) developed a pixel-based empirical approach to correcting ACS data by characterizing the CTE profiles of trails behind warm pixels in dark exposures. The success of this technique means that it is now possible to correct full-frame ACS/WFC images for CTE degradation in the standard data calibration and reduction pipeline CALACS. Over the past year, the ACS team at STScI has developed, refined and tested the new software. The details of this work are described in separate posters. The new code is more effective at low flux levels, and the updated pipeline combines corrections for the repaired ACS electronics with the pixel-based CTE correction. In addition to the standard cosmic ray corrected, flat-fielded and drizzled data products (crj, flt and drz files) there are three new equivalent files (crc, flc and drc) which contain the CTE-corrected data products. The user community will be able to choose whether to use the standard or CTE-corrected products.

  5. A DSP-based neural network non-uniformity correction algorithm for IRFPA

    Science.gov (United States)

    Liu, Chong-liang; Jin, Wei-qi; Cao, Yang; Liu, Xiu

    2009-07-01

    An effective neural network non-uniformity correction (NUC) algorithm based on DSP is proposed in this paper. The non-uniform response of infrared focal plane array (IRFPA) detectors produces corrupted images with fixed-pattern noise (FPN). We introduce and analyze the artificial neural network scene-based non-uniformity correction (SBNUC) algorithm, and describe the design of a DSP-based NUC development platform for IRFPAs. The hardware platform is of low power consumption, with a 32-bit fixed-point DSP (TMS320DM643) as the kernel processor. The dependability and expansibility of the software have been improved by the DSP/BIOS real-time operating system and Reference Framework 5. To achieve real-time performance, the calibration-parameter update runs at a lower task priority than video input and output in DSP/BIOS, so updating the calibration parameters does not affect the video streams. The work flow of the system and the strategy for real-time operation are introduced. Experiments on real infrared imaging sequences demonstrate that this algorithm requires only a few frames to obtain high-quality corrections. It is computationally efficient and suitable for all kinds of non-uniformity.
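    A single update of a Scribner-style neural-network SBNUC, of the kind implemented on the DSP, can be sketched in NumPy; the 4-neighbourhood, the learning rate, and the use of np.roll at the edges are illustrative assumptions, not details from the paper:

```python
import numpy as np

def nuc_lms_step(frame, gain, offset, eta=1e-3):
    """One frame of a neural-network scene-based NUC: each pixel's corrected
    output is driven toward the average of its 4-neighbourhood by an LMS
    update of its per-pixel gain and offset."""
    y = gain * frame + offset                        # corrected frame
    # desired output: mean of the four neighbours (edges wrap via roll)
    d = 0.25 * (np.roll(y, 1, 0) + np.roll(y, -1, 0) +
                np.roll(y, 1, 1) + np.roll(y, -1, 1))
    e = y - d                                        # error vs. local average
    gain -= eta * e * frame                          # LMS gradient steps
    offset -= eta * e
    return y, gain, offset
```

    Iterating this over incoming frames suppresses the high-frequency fixed-pattern noise while the scene content, shared by neighbouring pixels, is preserved.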

  6. The Software Invention Cube: A classification scheme for software inventions

    NARCIS (Netherlands)

    Bergstra, J.A.; Klint, P.

    2008-01-01

    The patent system protects inventions. The requirement that a software invention should make ‘a technical contribution’ turns out to be untenable in practice, and this raises the question of what constitutes an invention in the realm of software. The authors developed the Software Invention Cube

  7. Microarray background correction: maximum likelihood estimation for the normal-exponential convolution

    DEFF Research Database (Denmark)

    Silver, Jeremy D; Ritchie, Matthew E; Smyth, Gordon K

    2009-01-01

    exponentially distributed, representing background noise and signal, respectively. Using a saddle-point approximation, Ritchie and others (2007) found normexp to be the best background correction method for 2-color microarray data. This article develops the normexp method further by improving the estimation...... is developed for exact maximum likelihood estimation (MLE) using high-quality optimization software and using the saddle-point estimates as starting values. "MLE" is shown to outperform heuristic estimators proposed by other authors, both in terms of estimation accuracy and in terms of performance on real data...
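    The normexp model has a closed-form background correction: with background B ~ N(mu, sigma^2) and signal S ~ Exp(mean alpha), the corrected intensity is the posterior mean E[S | X = x]. A sketch of that formula follows (the maximum likelihood fitting of mu, sigma and alpha, the subject of the article, is not shown):

```python
import numpy as np
from scipy.stats import norm

def normexp_signal(x, mu, sigma, alpha):
    """Expected true signal E[S | X = x] under the normexp convolution
    X = B + S, with B ~ N(mu, sigma^2) and S ~ Exp(mean alpha). This is
    the standard closed form of the posterior mean; the parameters would
    come from an MLE fit of the observed intensities."""
    mu_sf = x - mu - sigma**2 / alpha       # mean of S given X = x, before truncation
    z = mu_sf / sigma
    # posterior mean of a normal truncated at zero
    return mu_sf + sigma * norm.pdf(z) / norm.cdf(z)
```

    The truncation term guarantees a strictly positive corrected signal even when the observed intensity falls below the estimated background, which is the practical advantage of normexp over plain background subtraction.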

  8. Development of decision tree software and protein profiling using surface enhanced laser desorption/ionization-time of flight-mass spectrometry (SELDI-TOF-MS) in papillary thyroid cancer

    International Nuclear Information System (INIS)

    Yoon, Joon Kee; An, Young Sil; Park, Bok Nam; Yoon, Seok Nam; Lee, Jun

    2007-01-01

    The aim of this study was to develop bioinformatics software and to test it on serum samples of papillary thyroid cancer using mass spectrometry (SELDI-TOF-MS). The 'Protein analysis' software, which performs decision tree analysis, was developed by customizing C4.5. Sixty-one serum samples from 27 papillary thyroid cancer patients, 17 autoimmune thyroiditis patients and 17 controls were applied to 2 types of protein chips, CM10 (weak cation exchange) and IMAC3 (metal binding - Cu). Mass spectrometry was performed to reveal the protein expression profiles. Decision trees were generated using the 'Protein analysis' software, which can perform training and validation from profiling data and automatically detects biomarker candidates. Validation analysis was performed for the CM10 chip by random sampling. For the CM10 and IMAC3 chips, 23 of 113 and 8 of 41 protein peaks, respectively, were significantly different among the 3 groups (p < 0.05). The decision tree correctly classified the 3 groups with an error rate of 3.3% for CM10 and 2.0% for IMAC3, and 4 and 7 biomarker candidates were detected, respectively. In 2-group comparisons, all cancer samples were correctly discriminated from non-cancer samples (error rate = 0%), for CM10 by a single node and for IMAC3 by multiple nodes. Validation results from 5 test sets revealed that SELDI-TOF-MS and the decision tree correctly differentiated cancers from non-cancers (54/55, 98%), while predictability was moderate for the 3-group classification (36/55, 65%). Our in-house software was able to successfully build decision trees and detect biomarker candidates; it could therefore be useful for biomarker discovery and clinical follow-up of papillary thyroid cancer.
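    The single-node tree that separated cancer from non-cancer on one peak can be illustrated with a brute-force decision stump over the peak-intensity matrix; this is a toy reimplementation for intuition, not the customized C4.5 code:

```python
import numpy as np

def best_stump(X, y):
    """Find the single (feature, threshold) split minimising misclassification
    over a samples-by-peaks intensity matrix X and binary labels y -- the kind
    of one-node tree that can flag a single m/z peak as a biomarker candidate."""
    best = (None, None, len(y) + 1)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            pred = (X[:, f] > t).astype(int)
            err = min(np.sum(pred != y), np.sum(pred == y))  # either polarity
            if err < best[2]:
                best = (f, t, err)
    return best
```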

  9. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all the advantages of models: they implicitly support a good separation of concerns; they are self-documenting, which improves understandability and maintainability; and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  10. Molecular radiotherapy: The NUKFIT software for calculating the time-integrated activity coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Kletting, P.; Schimmel, S.; Luster, M. [Klinik für Nuklearmedizin, Universität Ulm, Ulm 89081 (Germany); Kestler, H. A. [Research Group Bioinformatics and Systems Biology, Institut für Neuroinformatik, Universität Ulm, Ulm 89081 (Germany); Hänscheid, H.; Fernández, M.; Lassmann, M. [Klinik für Nuklearmedizin, Universität Würzburg, Würzburg 97080 (Germany); Bröer, J. H.; Nosske, D. [Bundesamt für Strahlenschutz, Fachbereich Strahlenschutz und Gesundheit, Oberschleißheim 85764 (Germany); Glatting, G. [Medical Radiation Physics/Radiation Protection, Medical Faculty Mannheim, Heidelberg University, Mannheim 68167 (Germany)

    2013-10-15

    Purpose: Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. Methods: The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the data used. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. Results: To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit
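    The final step, analytic integration of the fitted exponentials, can be sketched for the mono-exponential case; NUKFIT itself selects among several sums of exponentials via the corrected Akaike information criterion and propagates the standard error, which this one-term sketch (with assumed starting values) omits:

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_tiac(t, a):
    """Fit a mono-exponential A(t) = A0 * exp(-lam * t) to activity data and
    return the time-integrated activity coefficient by analytic integration:
    integral from 0 to infinity of A(t) dt = A0 / lam."""
    f = lambda t, A0, lam: A0 * np.exp(-lam * t)
    (A0, lam), _ = curve_fit(f, t, a, p0=(a[0], 1.0 / t[-1]))
    return A0 / lam
```

    Analytic integration avoids the truncation error a numerical quadrature would incur from extrapolating the activity curve beyond the last measurement.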

  11. Software engineering the current practice

    CERN Document Server

    Rajlich, Vaclav

    2011-01-01

    INTRODUCTION: History of Software Engineering; Software Properties; Origins of Software; Birth of Software Engineering; Third Paradigm: Iterative Approach; Software Life Span Models; Staged Model; Variants of Staged Model; Software Technologies: Programming Languages and Compilers; Object-Oriented Technology; Version Control System; Software Models: Class Diagrams; UML Activity Diagrams; Class Dependency Graphs and Contracts. SOFTWARE CHANGE: Introduction to Software Change; Characteristics of Software Change; Phases of Software Change; Requirements and Their Elicitation; Requirements Analysis and Change Initiation; Concepts and Concept

  12. A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects

    Science.gov (United States)

    Parker, Linda L.

    2016-01-01

    The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…

  13. Possibilities and Limitations of Applying Software Reliability Growth Models to Safety- Critical Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Jang, Seung Cheol; Ha, Jae Joo

    2006-01-01

    As digital systems are gradually introduced to nuclear power plants (NPPs), the need to quantitatively analyze the reliability of digital systems is also increasing. Kang and Sung identified (1) software reliability, (2) common-cause failures (CCFs), and (3) fault coverage as the three most critical factors in the reliability analysis of digital systems. For estimating the reliability of safety-critical software (the software used in safety-critical digital systems), Bayesian Belief Networks (BBNs) seem to be most widely used. The use of BBNs in reliability estimation of safety-critical software is basically a process of indirectly assigning a reliability based on various observed information and experts' opinions. When software testing results or software failure histories are available, we can instead directly estimate the reliability of the software using software reliability growth models such as the Jelinski-Moranda model and Goel-Okumoto's nonhomogeneous Poisson process (NHPP) model. Even though it is generally held that software reliability growth models cannot be applied to safety-critical software, due to the small number of failures expected from testing such software, we try to find the possibilities, and corresponding limitations, of applying software reliability growth models to safety-critical software.
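    As an illustration of the direct estimation route, the Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-bt}) can be fitted to cumulative failure counts; the least-squares fit below is a sketch (maximum likelihood is the more common estimation method, and the starting values are assumptions):

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_goel_okumoto(t, cum_failures):
    """Fit the Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-bt})
    to cumulative failure counts. The parameter a estimates the total
    number of latent faults, so a - m(t_now) is the expected number of
    faults still remaining -- the quantity of interest for reliability."""
    m = lambda t, a, b: a * (1.0 - np.exp(-b * t))
    (a, b), _ = curve_fit(m, t, cum_failures,
                          p0=(cum_failures[-1] * 1.5, 1.0 / t[-1]))
    return a, b
```

    The abstract's caveat shows up directly here: with only a handful of observed failures, the fitted a and b carry very wide confidence intervals, which is what limits these models for safety-critical software.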

  14. Software Assurance Competency Model

    Science.gov (United States)

    2013-03-01

    COTS) software, and software as a service (SaaS). L2: Define and analyze risks in the acquisition of contracted software, COTS software, and SaaS ... [2010a]: Application of technologies and processes to achieve a required level of confidence that software systems and services function in the...

  15. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  16. ESTSC - Software Best Practices

    Science.gov (United States)

    DOE Scientific and Technical Software Best Practices, December 2010. Table of Contents: 1.0 Introduction; 2.0 Responsibilities; 2.1 OSTI/ESTSC; 2.2 SIACs; 2.3 Software Submitting Sites/Creators; 2.4 Software Sensitivity Review; 3.0 Software Announcement and Submission; 3.1 STI Software Appropriate for Announcement; 3.2

  17. Agile deployment and code coverage testing metrics of the boot software on-board Solar Orbiter's Energetic Particle Detector

    Science.gov (United States)

    Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián

    2018-02-01

    In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification must be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statements and decision paths. This leads to complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the exacting code coverage demands on the boot software.
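    The 100% decision-coverage requirement, and why fault injection is needed to satisfy it, can be illustrated with a toy boot-selection function (entirely hypothetical, not the EPD ICU logic):

```python
def load_image(primary_crc_ok, backup_crc_ok):
    """Toy boot-selection logic: full decision coverage requires test cases
    driving every decision both ways, including the double-fault path that
    only injected corruption of both images can reach."""
    if primary_crc_ok:
        return "boot primary"
    if backup_crc_ok:           # reached only when the primary image is corrupt
        return "boot backup"
    return "safe mode"          # reached only via injected double corruption

# decision coverage demands all three outcomes be exercised:
cases = [(True, True), (False, True), (False, False)]
```

    The last case cannot occur on healthy hardware, which is exactly why fault injection must be part of the unit-test procedure from the start.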

  18. Factors that motivate software developers in Nigerian's software ...

    African Journals Online (AJOL)

    It was also observed that courtesy, good reward systems, regular training, recognition, tolerance of mistakes and good leadership were strong motivators of software developers. Keywords: Software developers, information technology, project managers, Nigeria. International Journal of Natural and Applied Sciences, 6(4): ...

  19. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom [Konkuk University, Seoul (Korea, Republic of)

    2014-10-15

    Although FPGAs give higher performance than PLCs (Programmable Logic Controllers), the platform change from PLC to FPGA forces PLC software engineers to give up the experience, knowledge and practices accumulated over decades and to start FPGA-based hardware development from scratch. We have researched a solution to this problem that reduces this risk and preserves the accumulated experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates FBD programs into behavior-preserving Verilog programs. In general, PLCs are programmed with an FBD (Function Block Diagram), while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designers have designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve their accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly, since the software is used in nuclear power plants, among the most safety-critical systems. While the designer develops the FPGA software from the FBD program produced by the translator, other translation tools such as the synthesis tool and the place-and-route tool are also involved; this paper therefore also aims to verify them rigorously and thoroughly. There are several verification techniques for the correctness of a translator, but they are hard to apply because of their prohibitive cost and execution time. Instead, this paper uses an indirect verification technique to demonstrate the correctness of the translator, based on co-simulation. We intend to prove correctness only for the specific inputs under development for a target I&C system, not for all possible input cases.

  20. A Translator Verification Technique for FPGA Software Development in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eui Sub; Yoo, Jun Beom

    2014-01-01

    Although FPGAs give higher performance than PLCs (Programmable Logic Controllers), the platform change from PLC to FPGA forces PLC software engineers to give up the experience, knowledge and practices accumulated over decades and to start FPGA-based hardware development from scratch. We have researched a solution to this problem that reduces this risk and preserves the accumulated experience and knowledge. One solution is to use the FBDtoVerilog translator, which translates FBD programs into behavior-preserving Verilog programs. In general, PLCs are programmed with an FBD (Function Block Diagram), while FPGAs are described with an HDL (Hardware Description Language) such as Verilog or VHDL. Once the PLC designers have designed the FBD programs, FBDtoVerilog translates the FBD into Verilog mechanically. The designers, therefore, need not consider the rest of the FPGA development process (e.g., synthesis and place-and-route) and can preserve their accumulated experience and knowledge. Even if we assure that the translation from FBD to Verilog is correct, it must be verified rigorously and thoroughly, since the software is used in nuclear power plants, among the most safety-critical systems. While the designer develops the FPGA software from the FBD program produced by the translator, other translation tools such as the synthesis tool and the place-and-route tool are also involved; this paper therefore also aims to verify them rigorously and thoroughly. There are several verification techniques for the correctness of a translator, but they are hard to apply because of their prohibitive cost and execution time. Instead, this paper uses an indirect verification technique to demonstrate the correctness of the translator, based on co-simulation. We intend to prove correctness only for the specific inputs under development for a target I&C system, not for all possible input cases.

  1. Software for Optimizing Quality Assurance of Other Software

    Science.gov (United States)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
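    Viewed as the optimization problem described above, a greedy sketch conveys the idea: rank assurance activities by risk reduction per unit cost and select within the budget. The activity names, costs and risk-reduction values are illustrative, and the actual tool solves a richer formulation than this heuristic:

```python
def plan_assurance(activities, budget):
    """Greedy sketch of assurance-activity selection: given (name, cost,
    risk_reduction) triples, pick activities by risk reduction per unit
    cost until the budget is exhausted."""
    chosen, spent = [], 0.0
    for name, cost, risk_red in sorted(activities,
                                       key=lambda a: a[2] / a[1], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen, spent
```

    The greedy ratio rule is a classic knapsack heuristic; it is not guaranteed optimal, which is one reason a dedicated optimizer is worthwhile for the real problem.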

  2. Gammasphere software development

    International Nuclear Information System (INIS)

    Piercey, R.B.

    1994-01-01

    This report describes the activities of the nuclear physics group at Mississippi State University which were performed during 1993. Significant progress has been made in the focus areas: chairing the Gammasphere Software Working Group (SWG); assisting with the porting and enhancement of the ORNL UPAK histogramming software package; and developing standard formats for Gammasphere data products. In addition, they have established a new public ftp archive to distribute software and software development tools and information

  3. Neutron Scattering Software

    Science.gov (United States)

    A new portal for neutron scattering software has just been established. Available data-analysis packages include KUPLOT (data plotting and fitting software) and ILL/TAS (Matlab programs for analyzing triple-axis data).

  4. Techniques for developing reliable and functional materials control and accounting software

    International Nuclear Information System (INIS)

    Barlich, G.

    1988-01-01

    The media has increasingly focused on failures of computer systems that result in financial, material, and other losses, and on systems failing to function as advertised. Unfortunately, such failures with equally disturbing losses are possible in computer systems providing materials control and accounting (MC&A) functions. Major improvements in the reliability and correctness of systems are possible with disciplined design and development techniques applied during software development. This paper describes some of the techniques used in the Safeguard Systems Group at Los Alamos National Laboratory for various MC&A systems.

  5. Comparison of monitor units calculated by radiotherapy treatment planning system and an independent monitor unit verification software.

    Science.gov (United States)

    Sellakumar, P; Arun, C; Sanjay, S S; Ramesh, S B

    2011-01-01

    In radiation therapy, the monitor units (MU) needed to deliver a treatment plan are calculated by treatment planning systems (TPS). The essential part of quality assurance is to verify the MU with independent monitor unit calculation to correct any potential errors prior to the start of treatment. In this study, we have compared the MU calculated by TPS and by independent MU verification software. The MU verification software was commissioned and tested for the data integrity to ensure that the correct beam data was considered for MU calculations. The accuracy of the calculations was tested by creating a series of test plans and comparing them with ion chamber measurements. The results show that there is good agreement between the two. The MU difference (MUdiff) between the monitor unit calculations of TPS and independent MU verification system was calculated for 623 fields from 245 patients and was analyzed by treatment site for head & neck, thorax, breast, abdomen and pelvis. The mean MUdiff of -0.838% with a standard deviation of 3.04% was observed for all 623 fields. The site specific standard deviation of MUdiff was as follows: abdomen and pelvis (<1.75%), head & neck (2.5%), thorax (2.32%) and breast (6.01%). The disparities were analyzed and different correction methods were used to reduce the disparity. © 2010 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
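    The per-field statistic reported here can be reproduced in a few lines; the sign convention (TPS minus the independent calculation, relative to the latter) is an assumption, since the abstract does not state it:

```python
import numpy as np

def mu_diff_stats(mu_tps, mu_indep):
    """Per-field percentage MU difference between the TPS and the independent
    check, plus the mean and sample standard deviation summarised per
    treatment site in the study."""
    mu_tps, mu_indep = np.asarray(mu_tps), np.asarray(mu_indep)
    d = 100.0 * (mu_tps - mu_indep) / mu_indep
    return d.mean(), d.std(ddof=1)
```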

  6. Novel approaches to assess the quality of fertility data stored in dairy herd management software.

    Science.gov (United States)

    Hermans, K; Waegeman, W; Opsomer, G; Van Ranst, B; De Koster, J; Van Eetvelde, M; Hostens, M

    2017-05-01

    Scientific journals and popular press magazines are littered with articles in which the authors use data from dairy herd management software. Almost none of such papers include data cleaning and data quality assessment in their study design despite this being a very critical step during data mining. This paper presents 2 novel data cleaning methods that permit identification of animals with good and bad data quality. The first method is a deterministic or rule-based data cleaning method. Reproduction and mutation or life-changing events such as birth and death were converted to a symbolic (alphabetical letter) representation and split into triplets (3-letter code). The triplets were manually labeled as physiologically correct, suspicious, or impossible. The deterministic data cleaning method was applied to assess the quality of data stored in dairy herd management from 26 farms enrolled in the herd health management program from the Faculty of Veterinary Medicine Ghent University, Belgium. In total, 150,443 triplets were created, 65.4% were labeled as correct, 17.4% as suspicious, and 17.2% as impossible. The second method, a probabilistic method, uses a machine learning algorithm (random forests) to predict the correctness of fertility and mutation events in an early stage of data cleaning. The prediction accuracy of the random forests algorithm was compared with a classical linear statistical method (penalized logistic regression), outperforming the latter substantially, with a superior receiver operating characteristic curve and a higher accuracy (89 vs. 72%). From those results, we conclude that the triplet method can be used to assess the quality of reproduction data stored in dairy herd management software and that a machine learning technique such as random forests is capable of predicting the correctness of fertility data. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
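    The triplet encoding can be sketched as follows; the letter alphabet and the single rule shown are illustrative stand-ins for the paper's manually labeled rule set:

```python
def triplets(events):
    """Encode a chronological event sequence as overlapping 3-letter codes,
    the unit the deterministic cleaning method labels as correct, suspicious
    or impossible. Letters here are illustrative: B=birth, I=insemination,
    C=calving, D=death."""
    codes = [''.join(events[i:i + 3]) for i in range(len(events) - 2)]
    # example rule: any event following a death is physiologically impossible
    labels = ['impossible' if 'D' in c[:2] else 'correct' for c in codes]
    return list(zip(codes, labels))
```

    An animal whose triplets are all labeled correct passes the deterministic filter; triplets flagged impossible mark records to exclude or to hand to the probabilistic (random forests) stage.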

  7. The use of controlling-training software in civil engineering bachelors’ educational process

    Directory of Open Access Journals (Sweden)

    Rekunov Sergey

    2017-01-01

    The paper considers the current state of the higher education system of the Russian Federation in the context of teaching strength-of-materials disciplines with the use of information educational technologies. The educational process in the discipline "Structural Mechanics" is shown using the Controlling-Training Software on the sample program "Statically Determinate Plane Truss". The software was developed by employees of the Department of Structural Mechanics of the Institute of Architecture and Civil Engineering of Volgograd State Technical University to heighten students' interest in calculations of building structures while using computer-based teaching methods, as well as to simplify the existing monitoring procedure. The Controlling-Training Software makes it possible not only to consolidate but also to independently assess the acquired theoretical knowledge. The program displays all necessary information: the calculation scheme, the question area, and an area with commentary on the entered answer. Help files contain a sufficient amount of theoretical material with examples of solutions. Students have several attempts to enter their answer; if unsuccessful, the screen displays the correct answer with a visual graphic illustration and an explanation of the issue. Carrying out the educational process with the Controlling-Training Software saved a significant amount of academic time.

  8. Software To Go: A Catalog of Software Available for Loan.

    Science.gov (United States)

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…

  9. Evaluation of attenuation correction, scatter correction and resolution recovery in myocardial Tc-99m MIBI SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Larcos, G.; Hutton, B.F.; Farlow, D.C.; Campbell-Rodgers, N.; Gruenewald, S.M.; Lau, Y.H. [Westmead Hospital, Westmead, Sydney, NSW (Australia). Departments of Nuclear Medicine and Ultrasound and Medical Physics

    1998-06-01

    Full text: The introduction of transmission-based attenuation correction (AC) has increased the diagnostic accuracy of Tc-99m MIBI myocardial perfusion SPECT. The aim of this study is to evaluate recent developments, including scatter correction (SC) and resolution recovery (RR). We reviewed 13 patients who underwent Tc-99m MIBI SPECT (two-day protocol) and coronary angiography, and 4 manufacturer-supplied studies assigned a low pretest likelihood of coronary artery disease (CAD). Patients had a mean age of 59 years (range: 41-78). Data were reconstructed using filtered backprojection (FBP; method 1), maximum likelihood (ML) incorporating AC (method 2), ADAC software using sinogram-based SC+RR followed by ML with AC (method 3), and ordered-subset ML incorporating AC, SC and RR (method 4). Images were reported by two of three blinded, experienced physicians using a standard semiquantitative scoring scheme. Fixed or reversible perfusion defects were considered abnormal; CAD was considered present with stenoses > 50%. Patients had normal coronary anatomy (n=9), single (n=4) or two-vessel CAD (n=4) (four in each of LAD, RCA and LCX). There were no statistically significant differences for any combination. Normalcy rate = 100% for all methods. Physicians graded 3/17 (methods 2, 4) and 1/17 (method 3) images as fair or poor in quality. Thus, AC or AC+SC+RR produces good-quality images in most patients; there is potential for improvement in sensitivity over standard FBP with no significant change in normalcy or specificity.

  10. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator

    Science.gov (United States)

    Bolen, Kenny; Greenlaw, Ronald

    2010-01-01

    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  11. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  12. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising physical, logical, and cognitive frames, was adopted to derive and analyze digital I and C failure events for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual occurrences of non-ABWR digital I and C software failure events reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  13. Evaluation of Software Quality to Improve Application Performance Using Mc Call Model

    Directory of Open Access Journals (Sweden)

    Inda D Lestantri

    2018-04-01

    Full Text Available The existence of software should add value to improve the performance of the organization, in addition to its primary function of automation. Before being implemented in an operational environment, software must pass tests gradually to ensure that it functions properly, meets user needs, and is convenient to use. This test was performed on a web-based application, taking as a test case the e-SAP application. e-SAP is an application used to monitor teaching and learning activities at a university in Jakarta. To measure software quality, testing can be done on randomly selected users. The user sample selected in this test consists of users aged 18 to 25 years with an information technology background. The test was conducted on 30 respondents using the Mc Call model. The Mc Call testing model consists of 11 dimensions grouped into 3 categories. This paper describes testing with reference to the product operation category, which includes 5 dimensions: correctness, usability, efficiency, reliability, and integrity. This paper discusses testing on each dimension to measure software quality as an effort to improve performance. The result of the research is that the e-SAP application has good quality, with a product operation value of 85.09%. This indicates that the e-SAP application deserves to be examined in the next stage in the operational environment.
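Aggregating the five product-operation dimensions into one score, as in evaluations of this kind, might look like the sketch below; the per-dimension scores and the equal weighting are hypothetical and do not reproduce the paper's questionnaire data:

```python
# Sketch of aggregating the five product-operation dimensions into a single
# quality percentage. Scores and the equal weighting are hypothetical.
dimensions = {
    "correctness": 0.88,
    "usability":   0.86,
    "efficiency":  0.84,
    "reliability": 0.83,
    "integrity":   0.85,
}

def product_operation_score(scores):
    """Equal-weight average of the dimension scores, as a percentage."""
    return 100.0 * sum(scores.values()) / len(scores)

print(f"{product_operation_score(dimensions):.2f}%")  # → 85.20%
```

Real Mc Call evaluations typically weight dimensions by importance; the equal weighting here is only the simplest choice.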

  14. Software quality assurance and software safety in the Biomed Control System

    International Nuclear Information System (INIS)

    Singh, R.P.; Chu, W.T.; Ludewigt, B.A.; Marks, K.M.; Nyman, M.A.; Renner, T.R.; Stradtner, R.

    1989-01-01

    The Biomed Control System is a hardware/software system used for the delivery, measurement and monitoring of heavy-ion beams in the patient treatment and biology experiment rooms in the Bevalac at the Lawrence Berkeley Laboratory (LBL). This paper describes some aspects of this system, including historical background and philosophy, configuration management, hardware features that facilitate software testing, software testing procedures, the release of new software, quality assurance, safety and operator monitoring. 3 refs

  15. Hadronic energy resolution of a highly granular scintillator-steel hadron calorimeter using software compensation techniques

    CERN Document Server

    Adloff, C.; Blaising, J.J.; Drancourt, C.; Espargiliere, A.; Gaglione, R.; Geffroy, N.; Karyotakis, Y.; Prast, J.; Vouters, G.; Francis, K.; Repond, J.; Smith, J.; Xia, L.; Baldolemar, E.; Li, J.; Park, S.T.; Sosebee, M.; White, A.P.; Yu, J.; Buanes, T.; Eigen, G.; Mikami, Y.; Watson, N.K.; Goto, T.; Mavromanolakis, G.; Thomson, M.A.; Ward, D.R.; Yan, W.; Benchekroun, D.; Hoummada, A.; Khoulaki, Y.; Benyamna, M.; Carloganu, C.; Fehr, F.; Gay, P.; Manen, S.; Royer, L.; Blazey, G.C.; Dyshkant, A.; Lima, J.G.R.; Zutshi, V.; Hostachy, J.Y.; Morin, L.; Cornett, U.; David, D.; Falley, G.; Gadow, K.; Gottlicher, P.; Gunter, C.; Hermberg, B.; Karstensen, S.; Krivan, F.; Lucaci-Timoce, A.I.; Lu, S.; Lutz, B.; Morozov, S.; Morgunov, V.; Reinecke, M.; Sefkow, F.; Smirnov, P.; Terwort, M.; Vargas-Trevino, A.; Feege, N.; Garutti, E.; Marchesini, I.; Ramilli, M.; Eckert, P.; Harion, T.; Kaplan, A.; Schultz-Coulon, H.Ch; Shen, W.; Stamen, R.; Tadday, A.; Bilki, B.; Norbeck, E.; Onel, Y.; Wilson, G.W.; Kawagoe, K.; Dauncey, P.D.; Magnan, A.M.; Wing, M.; Salvatore, F.; Calvo Alamillo, E.; Fouz, M.C.; Puerta-Pelayo, J.; Balagura, V.; Bobchenko, B.; Chadeeva, M.; Danilov, M.; Epifantsev, A.; Markin, O.; Mizuk, R.; Novikov, E.; Rusinov, V.; Tarkovsky, E.; Kirikova, N.; Kozlov, V.; Smirnov, P.; Soloviev, Y.; Buzhan, P.; Dolgoshein, B.; Ilyin, A.; Kantserov, V.; Kaplin, V.; Karakash, A.; Popova, E.; Smirnov, S.; Kiesling, C.; Pfau, S.; Seidel, K.; Simon, F.; Soldner, C.; Szalay, M.; Tesar, M.; Weuste, L.; Bonis, J.; Bouquet, B.; Callier, S.; Cornebise, P.; Doublet, Ph; Dulucq, F.; Faucci Giannelli, M.; Fleury, J.; Li, H.; Martin-Chassard, G.; Richard, F.; de la Taille, Ch.; Poschl, R.; Raux, L.; Seguin-Moreau, N.; Wicek, F.; Anduze, M.; Boudry, V.; Brient, J.C.; Jeans, D.; Mora de Freitas, P.; Musat, G.; Reinhard, M.; Ruan, M.; Videau, H.; Bulanek, B.; Zacek, J.; Cvach, J.; Gallus, P.; Havranek, M.; Janata, M.; Kvasnicka, J.; Lednicky, D.; Marcisovsky, M.; Polak, I.; Popule, J.; 
Tomasek, L.; Tomasek, M.; Ruzicka, P.; Sicho, P.; Smolik, J.; Vrba, V.; Zalesak, J.; Belhorma, B.; Ghazlane, H.; Takeshita, T.; Uozumi, S.; Sauer, J.; Weber, S.; Zeitnitz, C.

    2012-01-01

    SPS. The energy resolution for single hadrons is determined to be approximately 58%/√(E/GeV). This resolution is improved to approximately 45%/√(E/GeV) with software compensation techniques. These techniques take advantage of the event-by-event information about the substructure of hadronic showers provided by the imaging capabilities of the calorimeter. The energy reconstruction is improved either with corrections based on the local energy density or by applying a single correction factor to the event energy sum, derived from a global measure of the shower energy density. The application of the compensation algorithms to GEANT4 simulations yields resolution improvements comparable to those observed for real data.
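The local compensation approach, in which hits are reweighted according to their local energy density, can be sketched as follows; the bin edges and weights are invented for illustration (in practice they are obtained from fits to data or simulation):

```python
# Sketch of local software compensation: reweight calorimeter hits according
# to their local energy density, so dense (electromagnetic-like) deposits are
# weighted down relative to diffuse (hadronic-like) ones. Bin edges and
# weights below are invented placeholders.

DENSITY_BINS = [2.0, 5.0, float("inf")]  # hypothetical upper bin edges
WEIGHTS = [1.2, 1.0, 0.8]                # hypothetical weight per bin

def weight_for(density):
    """Look up the compensation weight for a hit's local energy density."""
    for upper, w in zip(DENSITY_BINS, WEIGHTS):
        if density < upper:
            return w
    return WEIGHTS[-1]

def compensated_energy(hits):
    """hits: list of (energy, local_density) pairs."""
    return sum(e * weight_for(rho) for e, rho in hits)

hits = [(1.0, 6.0), (0.5, 1.0), (2.0, 3.0)]
print(compensated_energy(hits))  # 1.0*0.8 + 0.5*1.2 + 2.0*1.0
```

The global variant mentioned in the abstract would instead compute a single event-level density measure and scale the whole energy sum by one factor.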

  16. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon is coined "The Software Crisis", and poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  17. Software architecture 2

    CERN Document Server

    Oussalah, Mourad Chabanne

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural templa

  18. Software architecture 1

    CERN Document Server

    Oussalah , Mourad Chabane

    2014-01-01

    Over the past 20 years, software architectures have significantly contributed to the development of complex and distributed systems. Nowadays, it is recognized that one of the critical problems in the design and development of any complex software system is its architecture, i.e. the organization of its architectural elements. Software Architecture presents the software architecture paradigms based on objects, components, services and models, as well as the various architectural techniques and methods, the analysis of architectural qualities, models of representation of architectural template

  19. NASA's Software Safety Standard

    Science.gov (United States)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft launched that does not have a computer on board providing command and control services. There have been recent incidents in which software played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused by, or contributed to by, software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those

  20. Software requirements

    CERN Document Server

    Wiegers, Karl E

    2003-01-01

    Without formal, verifiable software requirements - and an effective system for managing them - the programs that developers think they've agreed to build often will not be the same products their customers are expecting. In SOFTWARE REQUIREMENTS, Second Edition, requirements engineering authority Karl Wiegers amplifies the best practices presented in his original award-winning text, now a mainstay for anyone participating in the software development process. In this book, you'll discover effective techniques for managing the requirements engineering process all the way through the development cy

  1. Software measurement standards for areal surface texture parameters: part 2—comparison of software

    International Nuclear Information System (INIS)

    Harris, P M; Smith, I M; Giusca, C; Leach, R K; Wang, C

    2012-01-01

    A companion paper in this issue describes reference software for the evaluation of areal surface texture parameters, focusing on the definitions of the parameters and giving details of the numerical algorithms employed in the software to implement those definitions. The reference software is used as a benchmark against which software in a measuring instrument can be compared. A data set is used as input to both the software under test and the reference software, and the results delivered by the software under test are compared with those provided by the reference software. This paper presents a comparison of the results returned by the reference software with those reported by proprietary software for surface texture measurement. Differences between the results can be used to identify where algorithms and software for evaluating the parameters differ. They might also be helpful in identifying where parameters are not sufficiently well-defined in standards. (paper)
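The benchmarking procedure described above can be illustrated with a toy comparison; Sa (arithmetical mean height) serves as the example parameter, and the deliberately flawed "software under test" implementation is hypothetical, existing only to show how a discrepancy surfaces:

```python
# Toy version of the comparison: run the same data set through a reference
# implementation and the software under test, then inspect the difference.
# The flawed test implementation is hypothetical.

def sa_reference(heights):
    """Sa: mean absolute deviation from the mean plane (reference algorithm)."""
    mean = sum(heights) / len(heights)
    return sum(abs(z - mean) for z in heights) / len(heights)

def sa_under_test(heights):
    """A flawed hypothetical implementation that forgets to remove the mean."""
    return sum(abs(z) for z in heights) / len(heights)

data = [0.1, -0.2, 0.3, -0.1, 0.4]
ref, test = sa_reference(data), sa_under_test(data)
print(ref, test, abs(ref - test))  # a nonzero difference flags a discrepancy
```

In the paper's setting, such differences point either to divergent algorithms in the proprietary software or to parameter definitions that the standards leave underspecified.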

  2. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal - high-quality software products. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  3. Software Architecture Evolution

    Science.gov (United States)

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  4. Design, implementation and verification of software code for radiation dose assessment based on simple generic environmental model

    International Nuclear Information System (INIS)

    I Putu Susila; Arif Yuniarto

    2017-01-01

    Radiation dose assessment to determine the potential radiological impacts of the various installations within a nuclear facility complex is necessary to ensure environmental and public safety. A simple generic model-based method for calculating radiation doses caused by the release of radioactive substances into the environment has been published by the International Atomic Energy Agency (IAEA) as Safety Report Series No. 19 (SRS-19). To assist the application of the assessment method and to provide a basis for the development of more complex assessment methods, an open-source software code has been designed and implemented. The software comes with maps and is very easy to use because assessment scenarios can be built through diagrams. Software verification was performed by comparing its results to SRS-19 and CROM software calculation results. Doses estimated by SRS-19 are higher than those of the developed software. However, this is still acceptable, since dose estimation in SRS-19 is based on a conservative approach. On the other hand, compared to the CROM software, the same results were obtained for three scenarios, with a non-significant difference of 2.25% in another scenario. These results indicate the correctness of our implementation and imply that the developed software is ready for use in real scenarios. In the future, various features and new models need to be added to improve the capability of the software. (author)
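A single SRS-19-style exposure pathway can be sketched as below; the structure (concentration times intake rate times dose coefficient) follows the generic screening approach, but all numerical values are placeholders, not values from SRS-19 or the paper:

```python
# Minimal sketch of one generic exposure pathway (inhalation):
# annual dose = air concentration x annual breathing rate x dose coefficient.
# All numerical values are placeholders.

def inhalation_dose(c_air_bq_m3, breathing_m3_y, dose_coeff_sv_bq):
    """Annual effective dose (Sv/y) from inhalation of contaminated air."""
    return c_air_bq_m3 * breathing_m3_y * dose_coeff_sv_bq

dose = inhalation_dose(c_air_bq_m3=1.0e-3,       # Bq/m^3 (placeholder)
                       breathing_m3_y=8400.0,    # m^3/y (placeholder)
                       dose_coeff_sv_bq=7.1e-9)  # Sv/Bq (placeholder)
print(f"{dose:.3e} Sv/y")
```

A full assessment sums such terms over radionuclides and pathways (inhalation, ingestion, external exposure), each with its own conservative screening parameters.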

  5. Validation procedures of software applied in nuclear instruments. Proceedings of a technical meeting

    International Nuclear Information System (INIS)

    2007-09-01

    The IAEA has supported the availability of well-functioning nuclear instruments in Member States for more than three decades. Some older or aged instruments are still being used and are still in good working condition. However, those instruments may not in all cases meet modern software requirements of the end-user. Therefore, Member States, mostly those with emerging economies, modernize/refurbish such instruments to meet end-user demands. New advanced software is applied not only in new instrumentation, but often also for new and improved applications of modernized and/or refurbished instruments in many Member States, for which in a few cases the IAEA also provided support. Modern software applied in nuclear instrumentation plays a key role in safe operation and execution of commands in a user-friendly manner. Correct data handling and transfer has to be ensured. Additional features such as data visualization and interfacing to a PC for control and data storage are often included. To finalize the task, where new instrumentation which is not commercially available is used, or aged instruments are modernized/refurbished, the applied software has to be verified and validated. A Technical Meeting on 'Validation Procedures of Software Applied in Nuclear Instruments' was organized in Vienna, 20-23 November 2006, to discuss the verification and validation process of software applied to the operation and use of nuclear instruments. The presentations at the technical meeting included valuable information, which has been compiled and summarized in this publication; it should be useful for technical staff in Member States when modernizing/refurbishing nuclear instruments. 22 experts in the field of modernization/refurbishment of nuclear instruments, as well as users of applied software, presented their latest results. Discussion sessions followed the presentations. This publication is the outcome of deliberations during the meeting.

  6. Software Engineering Laboratory Series: Proceedings of the Twentieth Annual Software Engineering Workshop

    Science.gov (United States)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  7. Model-driven software engineering

    NARCIS (Netherlands)

    Amstel, van M.F.; Brand, van den M.G.J.; Protic, Z.; Verhoeff, T.; Hamberg, R.; Verriet, J.

    2014-01-01

    Software plays an important role in designing and operating warehouses. However, traditional software engineering methods for designing warehouse software are not able to cope with the complexity, size, and increase of automation in modern warehouses. This chapter describes Model-Driven Software

  8. Improving Software Developer's Competence

    DEFF Research Database (Denmark)

    Abrahamsson, Pekka; Kautz, Karlheinz; Sieppi, Heikki

    2002-01-01

    Emerging agile software development methods are people oriented development approaches to be used by the software industry. The personal software process (PSP) is an accepted method for improving the capabilities of a single software engineer. Five original hypotheses regarding the impact...

  9. Computing and software

    Directory of Open Access Journals (Sweden)

    White, G. C.

    2004-06-01

    Full Text Available The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them. In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented. Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004). Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years. Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine
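The shared likelihood behind the compared nest-survival packages can be sketched in simplified form; constant daily survival and the crude grid search below are illustrative simplifications (real implementations support covariates, nest-age effects, and proper optimizers):

```python
import math

# Simplified sketch of the nest-survival likelihood: with a constant daily
# survival rate s, a nest surviving an observation interval of t days
# contributes s**t, and a nest failing within it contributes 1 - s**t.

def log_likelihood(s, intervals):
    """intervals: list of (days, survived) observations."""
    ll = 0.0
    for days, survived in intervals:
        p = s ** days
        ll += math.log(p if survived else 1.0 - p)
    return ll

obs = [(3, True), (5, True), (4, False)]  # hypothetical visit histories
# Crude grid search for the maximum-likelihood daily survival rate.
best = max((i / 1000 for i in range(1, 1000)),
           key=lambda s: log_likelihood(s, obs))
print(best)
```

The same likelihood can be maximized by SAS logistic regression, SAS non-linear mixed models, or Program MARK, which is why the three packages agree on the estimate while differing in convenience and extensibility.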

  10. Software Defined Networking Demands on Software Technologies

    DEFF Research Database (Denmark)

    Galinac Grbac, T.; Caba, Cosmin Marius; Soler, José

    2015-01-01

    Software Defined Networking (SDN) is a networking approach based on a centralized control plane architecture with standardised interfaces between control and data planes. SDN enables fast configuration and reconfiguration of the network to enhance resource utilization and service performance. This new approach enables a more dynamic and flexible network, which may adapt to user needs and application requirements. To this end, systemized solutions must be implemented in network software, aiming to provide secure network services that meet the required service performance levels. In this paper, we review this new approach to networking from an architectural point of view, and identify and discuss some critical quality issues that require new developments in software technologies. These issues we discuss along with use case scenarios. Here in this paper we aim to identify challenges

  11. Software quality assurance

    CERN Document Server

    Laporte, Claude Y

    2018-01-01

    This book introduces Software Quality Assurance (SQA) and provides an overview of standards used to implement SQA. It defines ways to assess the effectiveness of how one approaches software quality across key industry sectors such as telecommunications, transport, defense, and aerospace. * Includes supplementary website with an instructor's guide and solutions * Applies IEEE software standards as well as the Capability Maturity Model Integration for Development (CMMI) * Illustrates the application of software quality assurance practices through the use of practical examples, quotes from experts, and tips from the authors

  12. Global Software Engineering

    DEFF Research Database (Denmark)

    Ebert, Christof; Kuhrmann, Marco; Prikladnicki, Rafael

    2016-01-01

    Professional software products and IT systems and services today are developed mostly by globally distributed teams, projects, and companies. Successfully orchestrating Global Software Engineering (GSE) has become the major success factor both for organizations and practitioners. Yet, more than… and experience reported at the IEEE International Conference on Global Software Engineering (ICGSE) series. The outcomes of our analysis show GSE as a field highly attached to industry and, thus, a considerable share of ICGSE papers address the transfer of Software Engineering concepts and solutions to the global stage

  13. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    Science.gov (United States)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.

  14. Software architecture evolution

    DEFF Research Database (Denmark)

    Barais, Olivier; Le Meur, Anne-Francoise; Duchien, Laurence

    2008-01-01

    Software architectures must frequently evolve to cope with changing requirements, and this evolution often implies integrating new concerns. Unfortunately, when the new concerns are crosscutting, existing architecture description languages provide little or no support for this kind of evolution. The software architect must modify multiple elements of the architecture manually, which risks introducing inconsistencies. This chapter provides an overview, comparison and detailed treatment of the various state-of-the-art approaches to describing and evolving software architectures. Furthermore, we discuss one particular framework named Tran SAT, which addresses the above problems of software architecture evolution. Tran SAT provides a new element in the software architecture description language, called an architectural aspect, for describing new concerns and their integration into an existing...

  15. A Formal Approach to the Provably Correct Synthesis of Mission Critical Embedded Software for Multi Core Embedded Platforms

    Science.gov (United States)

    2014-04-01

    synchronization primitives based on preset templates can result in over-synchronization if unchecked, possibly creating deadlock situations. Further...inputs rather than enforcing synchronization with a global clock. MRICDF models software as a network of communicating actors. Four primitive actors...control wants to send an interrupt or not. Since this is a shared buffer, a semaphore mechanism is assumed to synchronize the read/write of this buffer. The
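The truncated passage above assumes a semaphore mechanism guarding a shared interrupt buffer. A minimal sketch of that classic pattern using Python's threading primitives (the class and names are illustrative, not taken from the MRICDF toolchain):

```python
import threading

class SharedBuffer:
    """Single-slot buffer guarded by two semaphores (illustrative sketch;
    the class and method names are hypothetical, not from MRICDF)."""

    def __init__(self):
        self._slot = None
        self._empty = threading.Semaphore(1)  # writer may fill the slot
        self._full = threading.Semaphore(0)   # reader may drain the slot

    def write(self, value):
        self._empty.acquire()  # block until the slot is free
        self._slot = value
        self._full.release()   # signal that data is available

    def read(self):
        self._full.acquire()   # block until data is present
        value, self._slot = self._slot, None
        self._empty.release()  # free the slot for the next write
        return value

# The interrupt-controller thread writes a flag; the consumer reads it.
buf = SharedBuffer()
producer = threading.Thread(target=buf.write, args=("interrupt",))
producer.start()
print(buf.read())  # prints "interrupt"
producer.join()
```

The pair of semaphores makes each write/read a rendezvous, which is exactly the kind of enforced ordering the abstract warns can become over-synchronization if applied indiscriminately.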

  16. Secure Software Configuration Management Processes for nuclear safety software development environment

    International Nuclear Information System (INIS)

    Chou, I.-Hsin

    2011-01-01

    Highlights: → The proposed method emphasizes platform-independent security processes. → A hybrid process based on the nuclear SCM and security regulations is proposed. → Detailed descriptions and a Process Flow Diagram are useful for software developers. - Abstract: The main difference between nuclear and generic software is that the risk factor is infinitely greater in nuclear software - if there is a malfunction in the safety system, it can result in significant economic loss, physical damage or threat to human life. However, secure software development environments have often been ignored in the nuclear industry. In response to the terrorist attacks of September 11, 2001, the US Nuclear Regulatory Commission (USNRC) revised the Regulatory Guide (RG 1.152-2006) 'Criteria for use of computers in safety systems of nuclear power plants' to provide specific security guidance throughout the software development life cycle. Software Configuration Management (SCM) is an essential discipline in the software development environment. SCM involves identifying configuration items, controlling changes to those items, and maintaining their integrity and traceability. To secure nuclear safety software, this paper proposes Secure SCM Processes (S²CMP), which infuse regulatory security requirements into the proposed SCM processes. Furthermore, a Process Flow Diagram (PFD) is adopted to describe S²CMP, which is intended to enhance communication between regulators and developers.

  17. Exploring the organizational impact of software-as-a-Service on software vendors the role of organizational integration in software-as-a-Service development and operation

    CERN Document Server

    Stuckenberg, Sebastian

    2014-01-01

    Software-as-a-Service has gained momentum as a software delivery and pricing model within the software industry. Existing practices of software vendors are challenged by a potential paradigm shift. This book analyzes the implications of Software-as-a-Service on software vendors using a business model and value chain perspective. The analysis of qualitative data from software vendors highlights the role of organizational integration within software vendors. By providing insights regarding the impact of Software-as-a-Service on organizational structures and processes of software vendors, this st

  18. Package-based software development

    NARCIS (Netherlands)

    Jonge, de M.; Chroust, G.; Hofer, C.

    2003-01-01

    The main goal of component-based software engineering is to decrease development time and development costs of software systems, by reusing prefabricated building blocks. Here we focus on software reuse within the implementation of such component-based applications, and on the corresponding software

  19. ACTS: from ATLAS software towards a common track reconstruction software

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00349786; The ATLAS collaboration; Salzburger, Andreas; Kiehn, Moritz; Hrdinka, Julia; Calace, Noemi

    2017-01-01

    Reconstruction of charged particles' trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is de...

  20. Computer software configuration description, 241-AY and 241-AZ tank farm MICON automation system

    International Nuclear Information System (INIS)

    Winkelman, W.D.

    1998-01-01

    This document describes the configuration process, choices and conventions used during the Micon DCS configuration activities, and issues involved in making changes to the configuration. It includes the master listings of the Tag definitions, which should be revised to authorize any changes. Revision 3 provides additional information on the software used to provide communications with the W-320 project and incorporates minor changes to ensure the documented alarm setpoint priorities correctly match operational expectations.

  1. Filter Paper: Solution to High Self-Attenuation Corrections in HEPA Filter Measurements

    International Nuclear Information System (INIS)

    Oberer, R.B.; Harold, N.B.; Gunn, C.A.; Brummett, M.; Chaing, L.G.

    2005-01-01

    An 8 by 8 by 6 inch High Efficiency Particulate Air (HEPA) filter was measured as part of a uranium holdup survey in June of 2005, as it has been routinely measured every two months since 1998. Although the survey relies on gross gamma count measurements, this was one of a few measurements that had been converted to a quantitative measurement in 1998. The measurement was analyzed using the traditional Generalized Geometry Holdup (GGH) approach, using HMS3 software, with an area calibration and self-attenuation corrected with an empirical correction factor of 1.06. A result of 172 grams of ²³⁵U was reported. The actual quantity of ²³⁵U in the filter was approximately 1700 g. Because of this unusually large discrepancy, the measurement of HEPA filters will be discussed. Various techniques for measuring HEPA filters will be described using the measurement of a 24 by 24 by 12 inch HEPA filter as an example. A new method to correct for self-attenuation will be proposed for this measurement. Following the discussion of the 24 by 24 by 12 inch HEPA filter, the measurement of the 8 by 8 by 6 inch filter will be discussed in detail.
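The record applies an empirical self-attenuation correction factor of 1.06. For context, a common textbook model for gamma self-attenuation in a uniform slab (far-field geometry) gives CF = x / (1 − e⁻ˣ) with x = μT; the sketch below illustrates that standard model, not the new method the record proposes:

```python
import math

def slab_self_attenuation_cf(mu, thickness):
    """Self-attenuation correction factor for a uniform slab:
    CF = x / (1 - exp(-x)) with x = mu * thickness.
    A textbook model for illustration only -- not the empirical 1.06
    factor or the new correction method proposed in the record."""
    x = mu * thickness
    if x == 0:
        return 1.0
    return x / (1.0 - math.exp(-x))

# A thin, lightly attenuating deposit needs almost no correction...
print(round(slab_self_attenuation_cf(0.01, 1.0), 3))  # -> 1.005
# ...while a thick or dense deposit hides much of its activity.
print(round(slab_self_attenuation_cf(2.0, 1.0), 3))   # -> 2.313
```

The rapid growth of CF with μT is why a factor of 1.06 can be wildly wrong for a heavily loaded filter, consistent with the 172 g vs. ~1700 g discrepancy reported above.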

  2. Development of decision tree software and protein profiling using surface enhanced laser desorption/ionization-time of flight-mass spectrometry (SELDI-TOF-MS) in papillary thyroid cancer

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Joon Kee; An, Young Sil; Park, Bok Nam; Yoon, Seok Nam [Ajou University School of Medicine, Suwon (Korea, Republic of); Lee, Jun [Konkuk University, Seoul (Korea, Republic of)

    2007-08-15

    The aim of this study was to develop bioinformatics software and to test it on serum samples of papillary thyroid cancer using mass spectrometry (SELDI-TOF-MS). The 'Protein analysis' software, which performs decision tree analysis, was developed by customizing C4.5. Sixty-one serum samples from 27 papillary thyroid cancer patients, 17 autoimmune thyroiditis patients, and 17 controls were applied to 2 types of protein chips, CM10 (weak cation exchange) and IMAC3 (metal binding - Cu). Mass spectrometry was performed to reveal the protein expression profiles. Decision trees were generated using the 'Protein analysis' software, which automatically detected biomarker candidates. Validation analysis was performed for the CM10 chip by random sampling. For the CM10 and IMAC3 chips, 23 of 113 and 8 of 41 protein peaks, respectively, were significantly different among the 3 groups (p < 0.05). The decision tree correctly classified the 3 groups with an error rate of 3.3% for CM10 and 2.0% for IMAC3, and 4 and 7 biomarker candidates were detected, respectively. In 2-group comparisons, all cancer samples were correctly discriminated from non-cancer samples (error rate = 0%), for CM10 by a single node and for IMAC3 by multiple nodes. Validation results from 5 test sets revealed that SELDI-TOF-MS and the decision tree correctly differentiated cancers from non-cancers (54/55, 98%), while predictability was moderate in the 3-group classification (36/55, 65%). Our in-house software was able to successfully build decision trees and detect biomarker candidates; therefore, it could be useful for biomarker discovery and the clinical follow-up of papillary thyroid cancer.
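The decision trees here were produced by customized C4.5 software. As a rough illustration of the root-node decision such a tree makes (e.g., the "single node" that separated cancer from non-cancer on CM10), here is a pure-Python search for the best single (feature, threshold) split; the data and names are invented, not the study's:

```python
def best_stump(samples, labels):
    """Search for the single (feature, threshold) split that best separates
    two classes -- the root-node decision a C4.5-style tree makes.
    Pure-Python sketch on invented data, not the study's software."""
    best = None  # (errors, feature, threshold)
    for f in range(len(samples[0])):
        for thr in sorted({s[f] for s in samples}):
            # polarity A: predict class 1 when the feature exceeds thr
            mismatches = sum(
                (s[f] > thr) != bool(lab) for s, lab in zip(samples, labels)
            )
            # allow the flipped polarity as well
            errors = min(mismatches, len(samples) - mismatches)
            if best is None or errors < best[0]:
                best = (errors, f, thr)
    return best

# Toy profiles: two 'peak intensities' per serum sample.
samples = [(5.1, 0.2), (4.8, 0.3), (1.2, 0.9), (0.9, 1.1)]
labels = [1, 1, 0, 0]  # 1 = cancer-like, 0 = control-like
errors, feature, threshold = best_stump(samples, labels)
print(errors, feature, threshold)  # 0 0 1.2 -> a perfect split on peak 0
```

C4.5 differs in detail (it uses information-gain ratio and grows the tree recursively), but the thresholding step is the same basic operation applied to each candidate protein peak.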

  3. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    Science.gov (United States)

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants that investigated the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. This study makes the following contributions: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical problem-solving skills in empirical software engineering, and (3) raising the need to study the human factors of software engineering by employing a multidisciplinary viewpoint.

  4. Predicting the sparticle spectrum from GUTs via SUSY threshold corrections with SusyTC

    Energy Technology Data Exchange (ETDEWEB)

    Antusch, Stefan [Department of Physics, University of Basel,Klingelbergstr. 82, CH-4056 Basel (Switzerland); Max-Planck-Institut für Physik (Werner-Heisenberg-Institut),Föhringer Ring 6, D-80805 München (Germany); Sluka, Constantin [Department of Physics, University of Basel,Klingelbergstr. 82, CH-4056 Basel (Switzerland)

    2016-07-21

    Grand Unified Theories (GUTs) can feature predictions for the ratios of quark and lepton Yukawa couplings at high energy, which can be tested against the increasingly precise results for the fermion masses, given at low energies. To perform such tests, the renormalization group (RG) running has to be performed with sufficient accuracy. In supersymmetric (SUSY) theories, the one-loop threshold corrections (TC) are of particular importance and, since they affect the quark-lepton mass relations, link a given GUT flavour model to the sparticle spectrum. To accurately study such predictions, we extend and generalize various formulas in the literature which are needed for a precision analysis of SUSY flavour GUT models. We introduce the new software tool SusyTC, a major extension to the Mathematica package REAP (http://dx.doi.org/10.1088/1126-6708/2005/03/024), where these formulas are implemented. SusyTC extends the functionality of REAP by a full inclusion of the (complex) MSSM SUSY sector and a careful calculation of the one-loop SUSY threshold corrections for the full down-type quark, up-type quark and charged lepton Yukawa coupling matrices in the electroweak-unbroken phase. Among other useful features, SusyTC calculates the one-loop corrected pole mass of the charged (or the CP-odd) Higgs boson and provides output in SLHA conventions, i.e. the necessary input for external software, e.g. for performing a two-loop Higgs mass calculation. We apply SusyTC to study the predictions for the parameters of the CMSSM (mSUGRA) SUSY scenario from the set of GUT scale Yukawa relations y_e/y_d = −1/2, y_μ/y_s = 6, and y_τ/y_b = −3/2, which has been proposed recently in the context of SUSY GUT flavour models.

  5. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  6. Software Innovation

    DEFF Research Database (Denmark)

    Rose, Jeremy

      Innovation is the forgotten key to modern systems development - the element that defines the enterprising engineer, the thriving software firm and the cutting edge software application.  Traditional forms of technical education pay little attention to creativity - often encouraging overly...

  7. The laws of software process a new model for the production and management of software

    CERN Document Server

    Armour, Phillip G

    2003-01-01

    The Nature of Software and The Laws of Software Process; A Brief History of Knowledge; The Characteristics of Knowledge Storage Media; The Nature of Software Development; The Laws of Software Process and the Five Orders of Ignorance; The Laws of Software Process; The First Law of Software Process; The Corollary to the First Law of Software Process; The Reflexive Creation of Systems and Processes; The Lemma of Eternal Lateness; The Second Law of Software Process; The Rule of Process Bifurcation; The Dual Hypotheses of Knowledge Discovery; Armour's Observation on Software Process; The Third Law of Software Process (also kn

  8. Analysis of mice tumor models using dynamic MRI data and a dedicated software platform

    Energy Technology Data Exchange (ETDEWEB)

    Alfke, H.; Maurer, E.; Klose, K.J. [Philipps Univ. Marburg (Germany). Dept. of Radiology; Kohle, S.; Rascher-Friesenhausen, R.; Behrens, S.; Peitgen, H.O. [MeVis - Center for Medical Diagnostic Systems and Visualization, Bremen (Germany); Celik, I. [Philipps Univ. Marburg (Germany). Inst. for Theoretical Surgery; Heverhagen, J.T. [Philipps Univ. Marburg (Germany). Dept. of Radiology; Ohio State Univ., Columbus (United States). Dept. of Radiology

    2004-09-01

    Purpose: To implement a software platform (DynaVision) dedicated to analyzing data from functional imaging of tumors with different mathematical approaches, and to test the software platform in pancreatic carcinoma xenografts in mice with severe combined immunodeficiency disease (SCID). Materials and Methods: A software program was developed for the extraction and visualization of tissue perfusion parameters from dynamic contrast-enhanced images. This includes regional parameter calculation from enhancement curves, parametric images (e.g., blood flow), animation, 3D visualization, two-compartment modeling, a mode for comparing different datasets (e.g., therapy monitoring), and motion correction. We analyzed xenograft tumors from two pancreatic carcinoma cell lines (BxPC3 and AsPC1) implanted in 14 SCID mice after injection of Gd-DTPA into the tail vein. These data were correlated with histopathological findings. Results: Image analysis was completed in approximately 15 minutes per data set. The possibility of drawing and editing ROIs within the whole data set makes it easy to obtain quantitative data from the intensity-time curves. In one animal, motion artifacts markedly reduced the image quality, but data analysis was still possible after motion correction. Dynamic MRI of mice tumor models revealed a highly heterogeneous distribution of the contrast-enhancement curves and derived parameters, which correlated with differences in histopathology. AsPC1 tumors showed a more hypervascular type of curve, with a faster and higher signal enhancement rate (wash-in) and a faster signal decrease (wash-out). BxPC3 tumors showed a more hypovascular type, with slower wash-in and wash-out. This correlated with the biological properties of the tumors. (orig.)
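The abstract describes extracting wash-in and wash-out rates from intensity-time enhancement curves. A minimal sketch of such curve-parameter extraction (DynaVision's actual algorithms are not given in this abstract; the estimator below is a plausible stand-in, with invented sample data):

```python
def washin_washout(times, signal):
    """Estimate wash-in (max upslope before the peak) and wash-out (mean
    downslope after the peak) from a signal-time enhancement curve.
    A plausible stand-in for DynaVision's curve parameters, not its code."""
    peak = max(range(len(signal)), key=signal.__getitem__)
    slopes = [
        (signal[i + 1] - signal[i]) / (times[i + 1] - times[i])
        for i in range(len(signal) - 1)
    ]
    wash_in = max(slopes[:peak]) if peak > 0 else 0.0
    post_peak = slopes[peak:]
    wash_out = sum(post_peak) / len(post_peak) if post_peak else 0.0
    return wash_in, wash_out

# Hypervascular-type curve: fast enhancement, then steady decay.
t = [0, 1, 2, 3, 4, 5]
s = [0.0, 0.5, 2.0, 1.6, 1.2, 0.8]
wi, wo = washin_washout(t, s)
print(wi, wo)  # wash-in 1.5, wash-out about -0.4 per time unit
```

A hypovascular curve like the BxPC3 type would show a smaller wash-in and a wash-out closer to zero under the same estimator.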

  9. Analysis of mice tumor models using dynamic MRI data and a dedicated software platform

    International Nuclear Information System (INIS)

    Alfke, H.; Maurer, E.; Klose, K.J.; Celik, I.; Heverhagen, J.T.; Ohio State Univ., Columbus

    2004-01-01

    Purpose: To implement a software platform (DynaVision) dedicated to analyzing data from functional imaging of tumors with different mathematical approaches, and to test the software platform in pancreatic carcinoma xenografts in mice with severe combined immunodeficiency disease (SCID). Materials and Methods: A software program was developed for the extraction and visualization of tissue perfusion parameters from dynamic contrast-enhanced images. This includes regional parameter calculation from enhancement curves, parametric images (e.g., blood flow), animation, 3D visualization, two-compartment modeling, a mode for comparing different datasets (e.g., therapy monitoring), and motion correction. We analyzed xenograft tumors from two pancreatic carcinoma cell lines (BxPC3 and AsPC1) implanted in 14 SCID mice after injection of Gd-DTPA into the tail vein. These data were correlated with histopathological findings. Results: Image analysis was completed in approximately 15 minutes per data set. The possibility of drawing and editing ROIs within the whole data set makes it easy to obtain quantitative data from the intensity-time curves. In one animal, motion artifacts markedly reduced the image quality, but data analysis was still possible after motion correction. Dynamic MRI of mice tumor models revealed a highly heterogeneous distribution of the contrast-enhancement curves and derived parameters, which correlated with differences in histopathology. AsPC1 tumors showed a more hypervascular type of curve, with a faster and higher signal enhancement rate (wash-in) and a faster signal decrease (wash-out). BxPC3 tumors showed a more hypovascular type, with slower wash-in and wash-out. This correlated with the biological properties of the tumors. (orig.)

  10. Software Engineering Education Directory

    Science.gov (United States)

    1990-04-01

    and Engineering (CMSC 735) Codes: GPEV2 * Textbooks: IEEE Tutorial on Models and Metrics for Software Management and Engineering by Basili, Victor R...Software Engineering (Comp 227) Codes: GPRY5 Textbooks: IEEE Tutorial on Software Design Techniques by Freeman, Peter and Wasserman, Anthony 1. Software

  11. Agile software development

    CERN Document Server

    Dingsoyr, Torgeir; Moe, Nils Brede

    2010-01-01

    Agile software development has become an umbrella term for a number of changes in how software developers plan and coordinate their work, how they communicate with customers and external stakeholders, and how software development is organized in small, medium, and large companies, from the telecom and healthcare sectors to games and interactive media. Still, after a decade of research, agile software development is the source of continued debate due to its multifaceted nature and insufficient synthesis of research results. Dingsoyr, Dyba, and Moe now present a comprehensive snapshot of the kno

  12. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Science.gov (United States)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety- and mission-critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any fault in software or hardware, or in their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for the detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated, and often contradictory, diagnosis data. To demonstrate the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
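At the core of the ISWHM approach is probabilistic reasoning over sensor observations. A single-node version of that update, computing the posterior fault probability from one monitor reading, can be sketched as follows (all probabilities are illustrative, not taken from the paper's models):

```python
def posterior_fault(p_fault, p_alarm_if_fault, p_alarm_if_ok, alarm_seen):
    """Bayes update for one health variable given one monitor reading --
    the elementary step an ISWHM-style Bayesian network performs across
    many sensors. All probabilities here are illustrative."""
    p_alarm = p_alarm_if_fault * p_fault + p_alarm_if_ok * (1.0 - p_fault)
    if not alarm_seen:  # condition on the complementary observation
        p_alarm_if_fault = 1.0 - p_alarm_if_fault
        p_alarm = 1.0 - p_alarm
    return p_alarm_if_fault * p_fault / p_alarm

# Rare fault (1% prior), sensitive but imperfect monitor:
print(round(posterior_fault(0.01, 0.95, 0.05, True), 3))   # -> 0.161
print(round(posterior_fault(0.01, 0.95, 0.05, False), 5))  # near zero
```

A full ISWHM network chains many such updates over interdependent software and hardware sensors; compiling it to an Arithmetic Circuit makes the evaluation a fixed sequence of multiply-adds suitable for real-time budgets.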

  13. Evolving software products, the design of a water-related modeling software ecosystem

    DEFF Research Database (Denmark)

    Manikas, Konstantinos

    2017-01-01

    more than 50 years ago. However, radical changes of software products that evolve the software engineering as well as the organizational and business aspects in a disruptive manner are rather rare. In this paper, we report on the transformation of one of the market-leading product series in water......-related calculation and modeling from a traditional business-as-usual series of products to an evolutionary software ecosystem. We do so by relying on existing concepts of software ecosystem analysis to analyze the future ecosystem. We report and elaborate on the main focus points necessary for this transition. We...... argue for the generalization of our focus points to the transition from traditional business-as-usual software products to software ecosystems....

  14. Views on Software Testability

    OpenAIRE

    Shimeall, Timothy; Friedman, Michael; Chilenski, John; Voas, Jeffrey

    1994-01-01

    The field of testability is an active, well-established part of the engineering of modern computer systems. However, only recently have technologies for software testability begun to be developed. These technologies focus on assessing the aspects of software that improve or degrade the ease of testing. As both the size of implemented software and the amount of effort required to test that software increase, so will the importance of software testability technologies in influencing the softwa...

  15. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
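The three inspection steps in the claim (which entities to generate, how they are linked, and what logic each executes) can be sketched as a small spec-driven generator; the spec format below is hypothetical, not the patent's:

```python
def build_task(spec):
    """Generate an executable task from a declarative specification by
    (1) inspecting which entities to generate, (2) how they are linked,
    and (3) what logic each executes -- mirroring the abstract's three
    inspection steps. The spec format is hypothetical."""
    entities = {name: spec["logic"][name] for name in spec["entities"]}

    def task(value):
        # The link order defines the dataflow between generated entities.
        for name in spec["links"]:
            value = entities[name](value)
        return value

    return task

spec = {
    "entities": ["scale", "shift"],
    "links": ["scale", "shift"],
    "logic": {"scale": lambda x: 2 * x, "shift": lambda x: x + 3},
}
run = build_task(spec)
print(run(5))  # (5 * 2) + 3 = 13
```

The point of the pattern is that the executable task is never hand-written: changing the spec (entities, links, or logic) changes the generated behavior.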

  16. Developing Software Simulations

    Directory of Open Access Journals (Sweden)

    Tom Hall

    2007-06-01

    Programs in education and business often require learners to develop and demonstrate competence in specified areas and then be able to effectively apply this knowledge. One method to aid in developing a skill set in these areas is the use of software simulations. These simulations can be used for learner demonstrations of competencies in a specified course as well as a review of the basic skills at the beginning of subsequent courses. The first section of this paper discusses ToolBook, the software used to develop our software simulations. The second section discusses the process of developing software simulations. The third section discusses how we have used software simulations to assess student knowledge of research design by providing simulations that allow the student to practice using SPSS and Excel.

  17. How does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    a systematic mapping study on the state-of-the-art in SPI from a general perspective, we observed Global Software Engineering (GSE) becoming a topic of interest in recent years. Therefore, in this paper, we provide a detailed investigation of those papers from the overall systematic mapping study that were......For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...... experience are available. SPI addresses many aspects ranging from individual developer skills to entire organizations. It comprises for instance the optimization of specific activities in the software lifecycle as well as the creation of organization awareness and project culture. In the course of conducting...

  18. FY1995 study of very flexible software structures based on soft-software components; 1995 nendo yawarankana software buhin ni motozuku software no choju kozo ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The purpose of this study is to develop methods and tools for changing the software structure flexibly along with the continuous change of its environment and conditions of use. The goal is software of very high adaptability, achieved by using soft-software components and flexible assembly. The CASE tool platform Sapid, based on a fine-grained repository, was developed and employed for raising the abstraction level of program code and for mining potentially flexible components. To reconstruct the software so that it adapts to a required environment, the SQM (Software Quark Model) was used to manage interconnectivity and other semantic relationships among components. On these two basic systems, we developed various methods and tools, such as those for static and dynamic analysis of very flexible software structures, program transformation description, program pattern extraction and composition, component optimization by partial evaluation, component extraction by function slicing, code encapsulation, and component navigation and application. (NEDO)

  19. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  20. Evidence synthesis software.

    Science.gov (United States)

    Park, Sophie Elizabeth; Thomas, James

    2018-06-07

    It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways, and a range of review conditions and software solutions exists: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer has text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released. The EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise
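The abstract mentions combining machine learning with traditional information retrieval to rank new publications by relevance. A toy version of the IR side, term-frequency cosine similarity between a review topic and candidate abstracts (not EPPI-Reviewer's actual pipeline; the texts are invented):

```python
import math

def tf_vector(text):
    """Term-frequency vector of a text (no stemming or stop-word removal)."""
    words = text.lower().split()
    return {w: words.count(w) / len(words) for w in set(words)}

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(u[w] * v.get(w, 0.0) for w in u)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

review_topic = "randomised trial of exercise therapy for depression"
candidates = [
    "exercise therapy improves depression outcomes in a randomised trial",
    "a survey of compiler optimisation techniques",
]
scores = [cosine(tf_vector(review_topic), tf_vector(c)) for c in candidates]
print(scores[0] > scores[1])  # True: the on-topic abstract ranks higher
```

A production system would add inverse-document-frequency weighting and a trained classifier on top, but the ranking idea is the same: new publications are scored against the review's topic profile.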

  1. Software Validation in ATLAS

    International Nuclear Information System (INIS)

    Hodgkinson, Mark; Seuster, Rolf; Simmons, Brinick; Sherwood, Peter; Rousseau, David

    2012-01-01

    The ATLAS collaboration operates an extensive set of protocols to validate the quality of the offline software in a timely manner. This is essential in order to process the large amounts of data being collected by the ATLAS detector in 2011 without complications on the offline software side. We will discuss a number of different strategies used to validate the ATLAS offline software; running the ATLAS framework software, Athena, in a variety of configurations daily on each nightly build via the ATLAS Nightly System (ATN) and Run Time Tester (RTT) systems; the monitoring of these tests and checking the compilation of the software via distributed teams of rotating shifters; monitoring of and follow up on bug reports by the shifter teams and periodic software cleaning weeks to improve the quality of the offline software further.

  2. Software - Naval Oceanography Portal

    Science.gov (United States)


  3. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    Science.gov (United States)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  4. Correction of 157-nm lens based on phase ring aberration extraction method

    Science.gov (United States)

    Meute, Jeff; Rich, Georgia K.; Conley, Will; Smith, Bruce W.; Zavyalova, Lena V.; Cashmore, Julian S.; Ashworth, Dominic; Webb, James E.; Rich, Lisa

    2004-05-01

    Early manufacture and use of 157nm high NA lenses has presented significant challenges including: intrinsic birefringence correction, control of optical surface contamination, and the use of relatively unproven materials, coatings, and metrology. Many of these issues were addressed during the manufacture and use of International SEMATECH's 0.85NA lens. Most significantly, we were the first to employ 157nm phase measurement interferometry (PMI) and birefringence modeling software for lens optimization. These efforts yielded significant wavefront improvement and produced one of the best wavefront-corrected 157nm lenses to date. After applying the best practices to the manufacture of the lens, we still had to overcome the difficulties of integrating the lens into the tool platform at International SEMATECH instead of at the supplier facility. After lens integration, alignment, and field optimization were complete, conventional lithography and phase ring aberration extraction techniques were used to characterize system performance. These techniques suggested a wavefront error of approximately 0.05 waves RMS--much larger than the 0.03 waves RMS predicted by 157nm PMI. In-situ wavefront correction was planned for in the early stages of this project to mitigate risks introduced by the use of development materials and techniques and field integration of the lens. In this publication, we document the development and use of a phase ring aberration extraction method for characterizing imaging performance and a technique for correcting aberrations with the addition of an optical compensation plate. Imaging results before and after the lens correction are presented and differences between actual and predicted results are discussed.

  5. Interface-based software testing

    OpenAIRE

    Aziz Ahmad Rais

    2016-01-01

    Software quality is determined by assessing the characteristics that specify how it should work, which are verified through testing. If it were possible to touch, see, or measure software, it would be easier to analyze and prove its quality. Unfortunately, software is an intangible asset, which makes testing complex. This is especially true when software quality is not a question of particular functions that can be tested through a graphical user interface. The primary objective of softwar...

  6. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. In vitro porcine blood-brain barrier model for permeability studies: pCEL-X software pKa(FLUX) method for aqueous boundary layer correction and detailed data analysis.

    Science.gov (United States)

    Yusof, Siti R; Avdeef, Alex; Abbott, N Joan

    2014-12-18

    In vitro blood-brain barrier (BBB) models from primary brain endothelial cells can closely resemble the in vivo BBB, offering valuable models to assay BBB functions and to screen potential central nervous system drugs. We have recently developed an in vitro BBB model using primary porcine brain endothelial cells. The model shows expression of tight junction proteins and high transendothelial electrical resistance, evidence for a restrictive paracellular pathway. Validation studies using small drug-like compounds demonstrated functional uptake and efflux transporters, showing the suitability of the model to assay drug permeability. However, one limitation of in vitro model permeability measurement is the presence of the aqueous boundary layer (ABL) resulting from inefficient stirring during the permeability assay. The ABL can be a rate-limiting step in permeation, particularly for lipophilic compounds, causing underestimation of the permeability. If the ABL effect is ignored, the permeability measured in vitro will not reflect the permeability in vivo. To address the issue, we explored the combination of in vitro permeability measurement using our porcine model with the pKa(FLUX) method in pCEL-X software to correct for the ABL effect and allow a detailed analysis of in vitro (transendothelial) permeability data, Papp. Published Papp using porcine models generated by our group and other groups are also analyzed. From the Papp, intrinsic transcellular permeability (P0) is derived by simultaneous refinement using a weighted nonlinear regression, taking into account permeability through the ABL, paracellular permeability and filter restrictions on permeation. The in vitro P0 derived for 22 compounds (35 measurements) showed good correlation with P0 derived from in situ brain perfusion data (r(2)=0.61). The analysis also gave evidence for carrier-mediated uptake of naloxone, propranolol and vinblastine. The combination of the in vitro porcine model and the software
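The aqueous boundary layer correction described above treats the ABL, the filter, and the cell monolayer as permeability barriers in series. A minimal Python sketch of that relation follows; it omits the paracellular term and the weighted nonlinear regression that pCEL-X actually performs, and the function name is illustrative:

```python
def intrinsic_permeability(p_app, p_abl, p_filter):
    """Solve 1/Papp = 1/PABL + 1/Pfilter + 1/P0 for the intrinsic
    transcellular permeability P0 (barriers in series; the
    paracellular route is omitted in this simplified sketch)."""
    inv_p0 = 1.0 / p_app - 1.0 / p_abl - 1.0 / p_filter
    if inv_p0 <= 0:
        raise ValueError("Papp is ABL/filter-limited; P0 not identifiable")
    return 1.0 / inv_p0
```

For a lipophilic compound with a low P_ABL, the apparent permeability can sit far below the intrinsic value, which is exactly the underestimation the record warns about.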

  8. Software ecosystems analyzing and managing business networks in the software industry

    CERN Document Server

    Jansen, S; Cusumano, MA

    2013-01-01

    This book describes the state-of-the-art of software ecosystems. It constitutes a fundamental step towards an empirically based, nuanced understanding of the implications for management, governance, and control of software ecosystems. This is the first book of its kind dedicated to this emerging field and offers guidelines on how to analyze software ecosystems; methods for managing and growing; methods on transitioning from a closed software organization to an open one; and instruments for dealing with open source, licensing issues, product management and app stores. It is unique in bringing t

  9. Software reengineering

    Science.gov (United States)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta vision of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  10. Specialized software utilities for gamma ray spectrometry. Final report of a co-ordinated research project 1996-2000

    International Nuclear Information System (INIS)

    2002-03-01

    A Co-ordinated Research Project (CRP) on Software Utilities for Gamma Ray Spectrometry was initiated by the International Atomic Energy Agency in 1996 for a three year period. In the CRP several basic applications of nuclear data handling were assayed which also dealt with the development of PC computer codes for various spectrometric purposes. The CRP produced several software packages: for the analysis of low level NaI spectra; user controlled analysis of gamma ray spectra from HPGe detectors; a set of routines for the definition of the detector resolution function and for the unfolding of experimental annihilation spectra; a program for the generation of gamma ray libraries for specific applications; a program to calculate true coincidence corrections; a program to calculate full-energy peak efficiency calibration curve for homogenous cylindrical sample geometries including self-attenuation correction; and a program for the library driven analysis of gamma ray spectra and for the quantification of radionuclide content in samples. In addition, the CRP addressed problems of the analysis of naturally occurring radioactive soil material gamma ray spectra, questions of quality assurance and quality control in gamma ray spectrometry, and verification of the expert system SHAMAN for the analysis of air filter spectra obtained within the framework of the Comprehensive Nuclear Test Ban Treaty. This TECDOC contains 10 presentations delivered at the meeting with the description of the software developed. Each of the papers has been indexed separately

  11. Third-Party Software's Trust Quagmire.

    Science.gov (United States)

    Voas, J; Hurlburt, G

    2015-12-01

    Current software development has trended toward the idea of integrating independent software sub-functions to create more complete software systems. Software sub-functions are often not homegrown - instead they are developed by unknown third-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions carry plausible concerns in terms of quality, origins, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.

  12. Dtest Testing Software

    Science.gov (United States)

    Jain, Abhinandan; Cameron, Jonathan M.; Myint, Steven

    2013-01-01

    This software runs a suite of arbitrary software tests spanning various software languages and types of tests (unit level, system level, or file comparison tests). The dtest utility can be set to automate periodic testing of large suites of software, as well as running individual tests. It supports distributing multiple tests over multiple CPU cores, if available. The dtest tool is a utility program (written in Python) that scans through a directory (and its subdirectories) and finds all directories that match a certain pattern and then executes any tests in that directory as described in simple configuration files.
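A directory-scanning test runner of the kind described can be sketched in a few lines of Python. The configuration-file name `DTESTDEFS.cfg` and the runner interface are assumptions for illustration, not dtest's actual conventions:

```python
import fnmatch
import os
from concurrent.futures import ThreadPoolExecutor

def find_test_dirs(root, pattern="DTESTDEFS*"):
    """Walk `root` and collect directories containing a test
    configuration file whose name matches `pattern`."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if any(fnmatch.fnmatch(name, pattern) for name in filenames):
            hits.append(dirpath)
    return sorted(hits)

def run_all(dirs, runner, workers=4):
    """Run one test job per matching directory, distributed over a
    small worker pool; `runner` maps a directory to its result."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(dirs, pool.map(runner, dirs)))
```

A real runner would parse each configuration file for the command to execute; here the runner callable stands in for that step.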

  13. Preliminary experience with SpineEOS, a new software for 3D planning in AIS surgery.

    Science.gov (United States)

    Ferrero, Emmanuelle; Mazda, Keyvan; Simon, Anne-Laure; Ilharreborde, Brice

    2018-04-24

    Preoperative planning of scoliosis surgery is essential in the effective treatment of spine pathology. Thus, precontoured rods have recently been developed to avoid iatrogenic sagittal misalignment and rod breakage. Some specific issues exist in adolescent idiopathic scoliosis (AIS), such as a less distal lower instrumented level, great variability in the location of the inflection point (the transition from lumbar lordosis to thoracic kyphosis), and sagittal correction limited by the bone-implant interface. Since 2007, a stereoradiographic imaging system has been in use that allows for 3D reconstructions. Therefore, a software package was developed to perform preoperative 3D surgical planning and to provide rod shape and length. The goal of this preliminary study was to assess the feasibility, reliability, and clinical relevance of this new software. Retrospective study on 47 AIS patients operated on with the same surgical technique: posteromedial translation through a posterior approach with lumbar screws and thoracic sublaminar bands. Pre- and postoperatively, 3D reconstructions were performed on stereoradiographic images (EOS system, Paris, France) and compared. Then, the software was used to plan the surgical correction and determine rod shape and length. The simulated spine and rods were compared to the real postoperative 3D reconstructions. 3D reconstructions and planning were performed by an independent observer. 3D simulations were performed on the 47 patients. No difference was found between the simulated model and the postoperative 3D reconstructions in terms of sagittal parameters. Postoperatively, 21% of LL were not within reference values. Postoperative SVA was 20 mm anterior in 2/3 of the cases. Postoperative rods were significantly longer than the precontoured rods planned with the software (mean 10 mm). Inflection points differed between the rods used and the planned rods (2.3 levels on average). In this preliminary study, the software based on 3D stereoradiography low

  14. Systematic Software Development

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Méndez Fernández, Daniel

    2015-01-01

    The speed of innovation and the global allocation of resources to accelerate development or to reduce cost put pressure on the software industry. In the global competition, especially so-called high-price countries have to present arguments why the higher development cost is justified and what...... makes these countries an attractive host for software companies. Often, high-quality engineering and excellent quality of products, e.g., machinery and equipment, are mentioned. Yet, the question is: Can such arguments be also found for the software industry? We aim at investigating the degree...... of professionalism and systematization of software development to draw a map of strengths and weaknesses. To this end, we conducted as a first step an exploratory survey in Germany, presented in this paper. In this survey, we focused on the perceived importance of the two general software engineering process areas...

  15. Test documentation for the GENII Software Version 1.485

    International Nuclear Information System (INIS)

    Rittmann, P.D.

    1994-01-01

    Version 1.485 of the GENII software was released by the PNL GENII custodian in December of 1990. At that time the WHC GENII custodian performed several tests to verify that the advertised revisions were indeed present and that these changes had not introduced errors in the calculations normally done by WHC. These tests were not documented at that time. The purpose of this document is to summarize suitable acceptance tests of GENII and compare them with a few hand calculations. The testing is not as thorough as that used by the PNL GENII Custodian, but is sufficient to establish that the GENII program appears to work correctly on WHC managed personal computers

  16. Software for safety critical applications

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Jurickova, M.; Chudy, R.

    2001-01-01

    The contribution gives an overview of the project of the software development for safety critical applications. This project has been carried out since 1997. The principal goal of the project was to establish a research laboratory for the development of the software with the highest requirements for quality and reliability. This laboratory was established at the department, equipped with proper hardware and software to support software development. A research team of predominantly young researchers for software development was created. The activities of the research team started with studying and proposing the software development methodology. In addition, this methodology was applied to the real software development. The verification and validation process followed the software development. The validation system for the integrated hardware and software tests was brought into being and its control software was developed. The quality of the software tools was also observed, and the SOSAT tool was used during these activities. National and international contacts were established and maintained during the project solution.(author)

  17. Towards an Ontology of Software

    OpenAIRE

    Wang, Xiaowei

    2016-01-01

    Software is permeating every aspect of our personal and social life. And yet, the cluster of concepts around the notion of software, such as the notions of a software product, software requirements, software specifications, are still poorly understood with no consensus on the horizon. For many, software is just code, something intangible best defined in contrast with hardware, but it is not particularly illuminating. This erroneous notion, software is just code, presents both in the ontology ...

  18. IMS software developments for the detection of chemical warfare agent

    Science.gov (United States)

    Klepel, ST.; Graefenhain, U.; Lippe, R.; Stach, J.; Starrock, V.

    1995-01-01

    Interference compounds like gasoline, diesel, burning wood or fuel, etc. are present in common battlefield situations. These compounds can cause detectors to respond as a false positive or interfere with the detector's ability to respond to target compounds such as chemical warfare agents. To ensure proper response of the ion mobility spectrometer to chemical warfare agents, two special software packages were developed and incorporated into the Bruker RAID-1. The programs suppress interfering signals caused by car exhaust or smoke gases resulting from burning materials, and correct for the influence of variable sample gas humidity, which is important for the detection and quantification of blister agents like mustard gas or lewisite.

  19. How Does Software Process Improvement Address Global Software Engineering?

    DEFF Research Database (Denmark)

    Kuhrmann, Marco; Diebold, Philipp; Münch, Jürgen

    2016-01-01

    For decades, Software Process Improvement (SPI) programs have been implemented, inter alia, to improve quality and speed of software development. To set up, guide, and carry out SPI projects, and to measure SPI state, impact, and success, a multitude of different SPI approaches and considerable...

  20. Software vulnerability: Definition, modelling, and practical evaluation for E-mail transfer software

    International Nuclear Information System (INIS)

    Kimura, Mitsuhiro

    2006-01-01

    This paper proposes a method of assessing software vulnerability quantitatively. By expanding the concept of the IPO (input-program-output) model, we first define the software vulnerability and construct a stochastic model. Then we evaluate the software vulnerability of the sendmail system by analyzing the actual security-hole data, which were collected from its release note. Also we show the relationship between the estimated software reliability and vulnerability of the analyzed system

  1. A graphics processing unit accelerated motion correction algorithm and modular system for real-time fMRI.

    Science.gov (United States)

    Scheinost, Dustin; Hampson, Michelle; Qiu, Maolin; Bhawnani, Jitendra; Constable, R Todd; Papademetris, Xenophon

    2013-07-01

    Real-time functional magnetic resonance imaging (rt-fMRI) has recently gained interest as a possible means to facilitate the learning of certain behaviors. However, rt-fMRI is limited by processing speed and available software, and continued development is needed for rt-fMRI to progress further and become feasible for clinical use. In this work, we present an open-source rt-fMRI system for biofeedback powered by a novel Graphics Processing Unit (GPU) accelerated motion correction strategy as part of the BioImage Suite project ( www.bioimagesuite.org ). Our system contributes to the development of rt-fMRI by presenting a motion correction algorithm that provides an estimate of motion with essentially no processing delay as well as a modular rt-fMRI system design. Using empirical data from rt-fMRI scans, we assessed the quality of motion correction in this new system. The present algorithm performed comparably to standard (non real-time) offline methods and outperformed other real-time methods based on zero order interpolation of motion parameters. The modular approach to the rt-fMRI system allows the system to be flexible to the experiment and feedback design, a valuable feature for many applications. We illustrate the flexibility of the system by describing several of our ongoing studies. Our hope is that continuing development of open-source rt-fMRI algorithms and software will make this new technology more accessible and adaptable, and will thereby accelerate its application in the clinical and cognitive neurosciences.

  2. The software and hardware design of a 16 channel online dose rate monitoring system

    International Nuclear Information System (INIS)

    Tang Wenjuan; Yan Yonghong; Yang Shiming; Li Xiaonan; Min Jian

    2011-01-01

    The software and hardware design of a 16 channel online dose rate monitoring system is presented. After being amplified and A/D converted, the output signal of the sensors was sent to a microprocessor through an FPGA, where low-frequency filtering, calculation, temperature compensation and pedestal subtraction were performed. These steps corrected for the variation of dark current with temperature fluctuations in an effective way, and sufficiently precise instantaneous dose rate results were finally obtained. (authors)
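The per-channel correction chain (pedestal subtraction, temperature compensation, filtering) can be illustrated with a small Python sketch. The coefficient names and the linear compensation model are assumptions, since the record does not give the exact algorithm:

```python
def correct_sample(adc_counts, temp_c, pedestal, temp_coeff, gain, ref_temp=25.0):
    """One-channel correction: subtract the dark-current pedestal,
    apply a linear temperature compensation around ref_temp, then
    scale counts to a dose rate (all coefficients illustrative)."""
    compensated = (adc_counts - pedestal) * (1.0 - temp_coeff * (temp_c - ref_temp))
    return gain * compensated

def low_pass(samples, alpha=0.1):
    """Exponential low-pass filter as a stand-in for the FPGA's
    low-frequency filter stage."""
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1.0 - alpha) * y
        out.append(y)
    return out
```

In a real instrument the pedestal and temperature coefficient would come from calibration, not constants in code.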

  3. Facilitating the analysis of the multifocal electroretinogram using the free software environment R.

    Science.gov (United States)

    Bergholz, Richard; Rossel, Mirjam; Dutescu, Ralf M; Vöge, Klaas P; Salchow, Daniel J

    2018-01-01

    The large amount of data rendered by the multifocal electroretinogram (mfERG) can be analyzed and visualized in various ways. The evaluation and comparison of more than one examination is time-consuming and prone to create errors. Using the free software environment R we developed a solution to average the data of multiple examinations and to allow a comparison of different patient groups. Data of single mfERG recordings as exported in .csv format from a RETIport 21 system (version 7/03, Roland Consult) or manually compiled .csv files are the basis for the calculations. The R software extracts response densities and implicit times of N1 and P1 for the sum response, each ring eccentricity, and each single hexagon. Averages can be calculated for as many subjects as needed. The mentioned parameters can then be compared to another group of patients or healthy subjects. Application of the software is illustrated by comparing 11 patients with chloroquine maculopathy to a control group of 7 healthy subjects. The software scripts display response density and implicit time 3D plots of each examination as well as of the group averages. Differences of the group averages are presented as 3D and grayscale 2D plots. Both groups are compared using the t-test with Bonferroni correction. The group comparison is furthermore illustrated by the average waveforms and by boxplots of each eccentricity. This software solution on the basis of the programming language R facilitates the clinical and scientific use of the mfERG and aids in interpretation and analysis.
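The group-level processing described (per-hexagon averaging across subjects and a Bonferroni-corrected t-test) can be sketched as follows. The original scripts are written in R; this is an illustrative Python translation of the statistics only, with hypothetical function names:

```python
from math import sqrt
from statistics import mean, stdev

def group_average(recordings):
    """Average per-hexagon response densities (or implicit times)
    across subjects; `recordings` is a list of equal-length lists,
    one value per hexagon."""
    n_hex = len(recordings[0])
    return [mean(r[i] for r in recordings) for i in range(n_hex)]

def welch_t(a, b):
    """Two-sample Welch t statistic for one hexagon."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / len(a) + vb / len(b))

def bonferroni_alpha(alpha, n_tests):
    """Per-hexagon significance level after Bonferroni correction,
    e.g. n_tests = 103 for a standard 103-hexagon mfERG array."""
    return alpha / n_tests
```

Each hexagon is one test, so comparing two groups over a 103-hexagon array means dividing the nominal alpha by 103 before declaring any location significant.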

  4. Licensing safety critical software

    International Nuclear Information System (INIS)

    Archinoff, G.H.; Brown, R.A.

    1990-01-01

    Licensing difficulties with the shutdown system software at the Darlington Nuclear Generating Station contributed to delays in starting up the station. Even though the station has now been given approval by the Atomic Energy Control Board (AECB) to operate, the software issue has not disappeared - Ontario Hydro has been instructed by the AECB to redesign the software. This article attempts to explain why software based shutdown systems were chosen for Darlington, why there was so much difficulty licensing them, and what the implications are for other safety related software based applications

  5. Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop

    Science.gov (United States)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  6. Software Engineering Laboratory Series: Proceedings of the Twenty-Second Annual Software Engineering Workshop

    Science.gov (United States)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  7. Improved volumetric measurement of brain structure with a distortion correction procedure using an ADNI phantom.

    Science.gov (United States)

    Maikusa, Norihide; Yamashita, Fumio; Tanaka, Kenichiro; Abe, Osamu; Kawaguchi, Atsushi; Kabasawa, Hiroyuki; Chiba, Shoma; Kasahara, Akihiro; Kobayashi, Nobuhisa; Yuasa, Tetsuya; Sato, Noriko; Matsuda, Hiroshi; Iwatsubo, Takeshi

    2013-06-01

    Serial magnetic resonance imaging (MRI) images acquired from multisite and multivendor MRI scanners are widely used in measuring longitudinal structural changes in the brain. Precise and accurate measurements are important in understanding the natural progression of neurodegenerative disorders such as Alzheimer's disease. However, geometric distortions in MRI images decrease the accuracy and precision of volumetric or morphometric measurements. To solve this problem, the authors suggest a commercially available phantom-based distortion correction method that accommodates the variation in geometric distortion within MRI images obtained with multivendor MRI scanners. The authors' method is based on image warping using a polynomial function. The method detects fiducial points within a phantom image using phantom analysis software developed by the Mayo Clinic and calculates warping functions for distortion correction. To quantify the effectiveness of the authors' method, the authors corrected phantom images obtained from multivendor MRI scanners and calculated the root-mean-square (RMS) of fiducial errors and the circularity ratio as evaluation values. The authors also compared the performance of the authors' method with that of a distortion correction method based on a spherical harmonics description of the generic gradient design parameters. Moreover, the authors evaluated whether this correction improves the test-retest reproducibility of voxel-based morphometry in human studies. A Wilcoxon signed-rank test comparing uncorrected and corrected images was performed. The root-mean-square errors and circularity ratios for all slices improved significantly, as did the test-retest reproducibility. The results showed that distortion was corrected significantly using the authors' method. In human studies, the reproducibility of voxel-based morphometry analysis for the whole gray matter significantly improved after distortion correction using the authors' method.
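The core of a phantom-based polynomial distortion correction can be sketched as follows: fit a polynomial warp that maps the detected fiducial positions onto their known phantom positions, then score the residual with an RMS fiducial error. The second-order basis and plain least-squares fit are illustrative choices, not necessarily the polynomial order the authors used:

```python
import numpy as np

def poly_design(pts):
    """Second-order 2D polynomial basis for each (x, y) point."""
    x, y = pts[:, 0], pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])

def fit_warp(measured, true):
    """Least-squares polynomial warp mapping measured fiducial
    positions onto the known phantom positions."""
    A = poly_design(measured)
    coeffs, *_ = np.linalg.lstsq(A, true, rcond=None)
    return coeffs

def apply_warp(coeffs, pts):
    """Apply the fitted warp to a set of points."""
    return poly_design(pts) @ coeffs

def rms_error(a, b):
    """Root-mean-square fiducial error between two point sets."""
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))
```

The same RMS metric, computed before and after warping, is exactly the kind of evaluation value the record describes.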

  8. Ensuring Software IP Cleanliness

    OpenAIRE

    Mahshad Koohgoli; Richard Mayer

    2007-01-01

    At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.

  9. SOFTWARE PROCESS IMPROVEMENT: AWARENESS, USE, AND BENEFITS IN CANADIAN SOFTWARE DEVELOPMENT FIRMS

    OpenAIRE

    CHEVERS, DELROY

    2017-01-01

    ABSTRACT Since 1982, the software development community has been concerned with the delivery of quality systems. Software process improvement (SPI) is an initiative to avoid the delivery of low quality systems. However, the awareness and adoption of SPI is low. Thus, this study examines the rate of awareness, use, and benefits of SPI initiatives in Canadian software development firms. Using SPSS as the analytical tool, this study found that 59% of Canadian software development firms are aware...

  10. SLDAssay: A software package and web tool for analyzing limiting dilution assays.

    Science.gov (United States)

    Trumble, Ilana M; Allmon, Andrew G; Archin, Nancie M; Rigdon, Joseph; Francis, Owen; Baldoni, Pedro L; Hudgens, Michael G

    2017-11-01

    Serial limiting dilution (SLD) assays are used in many areas of infectious disease related research. This paper presents SLDAssay, a free and publicly available R software package and web tool for analyzing data from SLD assays. SLDAssay computes the maximum likelihood estimate (MLE) for the concentration of target cells, with corresponding exact and asymptotic confidence intervals. Exact and asymptotic goodness of fit p-values, and a bias-corrected (BC) MLE are also provided. No other publicly available software currently implements the BC MLE or the exact methods. For validation of SLDAssay, results from Myers et al. (1994) are replicated. Simulations demonstrate the BC MLE is less biased than the MLE. Additionally, simulations demonstrate that exact methods tend to give better confidence interval coverage and goodness-of-fit tests with lower type I error than the asymptotic methods. Additional advantages of using exact methods are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
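The single-hit Poisson model underlying SLD assay analysis, and a maximum likelihood estimate of the target-cell concentration, can be sketched in Python. This is not SLDAssay's implementation (which adds a bias-corrected MLE, exact confidence intervals, and goodness-of-fit tests); it only illustrates the basic likelihood:

```python
from math import exp, log

def neg_log_lik(theta, assay):
    """Negative log-likelihood under the single-hit Poisson model,
    P(well negative) = exp(-theta * cells_per_well).
    `assay` is a list of (cells_per_well, wells, positive_wells)."""
    nll = 0.0
    for cells, wells, pos in assay:
        p_neg = max(exp(-theta * cells), 1e-300)  # guard log(0)
        p_pos = max(1.0 - p_neg, 1e-300)
        nll -= pos * log(p_pos) + (wells - pos) * log(p_neg)
    return nll

def mle_concentration(assay, lo=1e-9, hi=1e-2, iters=200):
    """Golden-section search for the concentration minimizing the
    negative log-likelihood (assumes the NLL is unimodal on [lo, hi])."""
    g = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if neg_log_lik(c, assay) < neg_log_lik(d, assay):
            b = d
        else:
            a = c
    return (a + b) / 2
```

For a single dilution level the MLE has a closed form, which makes a handy sanity check on the numeric search.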

  11. What Counts in Software Process?

    DEFF Research Database (Denmark)

    Cohn, Marisa

    2009-01-01

    Documentation of software requirements is a major concern among software developers and software researchers. Agile software development denotes a different relationship to documentation, one that warrants investigation. A qualitative field study at two Agile software development companies was conducted to investigate the role of artifacts in the software development work and the relationship between these artifacts ... and the Software Process ... and conversations in negotiating between prescriptions from a model and the contingencies that arise in an enactment. Empirical findings are presented which suggest a new understanding ...

  12. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    ...), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects, as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
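    The iterate-inject-detect loop described above can be sketched in a few lines. The rates, step granularity, and names below are invented for illustration; they are not PATT's actual parameters or structure.

```python
def simulate_spiral(iterations, loc_per_iter, inject_per_kloc,
                    detect_eff, hours_per_defect):
    """Toy spiral-process simulation: each iteration adds code, injects
    defects in proportion to the new code, detects a fraction of the
    currently latent defects, and charges rework effort per detection."""
    size = 0
    latent = 0.0
    injected = detected = 0.0
    rework_hours = 0.0
    for _ in range(iterations):
        size += loc_per_iter
        new_defects = loc_per_iter / 1000.0 * inject_per_kloc
        injected += new_defects
        latent += new_defects
        found = latent * detect_eff
        detected += found
        latent -= found
        rework_hours += found * hours_per_defect
    return {"size": size, "injected": injected, "detected": detected,
            "latent": latent, "rework_hours": rework_hours}

result = simulate_spiral(iterations=4, loc_per_iter=5000,
                         inject_per_kloc=10, detect_eff=0.8,
                         hours_per_defect=2.0)
```

    Running the same steps once, with all code added up front, reproduces a waterfall pass, which is what makes the spiral-vs-waterfall comparison in the record direct.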

  13. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements.

  14. Preliminary Studies for a CBCT Imaging Protocol for Offline Organ Motion Analysis: Registration Software Validation and CTDI Measurements

    International Nuclear Information System (INIS)

    Falco, Maria Daniela; Fontanarosa, Davide; Miceli, Roberto; Carosi, Alessandra; Santoni, Riccardo; D'Andrea, Marco

    2011-01-01

    Cone-beam X-ray volumetric imaging in the treatment room allows online correction of set-up errors and offline assessment of residual set-up errors and organ motion. In this study the registration algorithm of the X-ray volume imaging software (XVI, Elekta, Crawley, United Kingdom), which manages a commercial cone-beam computed tomography (CBCT)-based positioning system, has been tested using a homemade and an anthropomorphic phantom to: (1) assess its performance in detecting known translational and rotational set-up errors and (2) transfer the transformation matrix of its registrations into a commercial treatment planning system (TPS) for offline organ motion analysis. Furthermore, the CBCT dose index has been measured for a particular site (prostate: 120 kV, 1028.8 mAs, approximately 640 frames) using a standard Perspex cylindrical body phantom (diameter 32 cm, length 15 cm) and a 10-cm-long pencil ionization chamber. We have found that known displacements were correctly calculated by the registration software to within 1.3 mm and 0.4°. For the anthropomorphic phantom, only translational displacements have been considered. Both studies have shown errors within the intrinsic uncertainty of our system for translational displacements (estimated as 0.87 mm) and rotational displacements (estimated as 0.22°). The resulting table translations proposed by the system to correct the displacements were also checked with portal images and found to place the isocenter of the plan on the linac isocenter within an error of 1 mm, which is the dimension of the spherical lead marker inserted at the center of the homemade phantom. The registration matrix translated into the TPS image fusion module correctly reproduced the alignment between planning CT scans and CBCT scans. Finally, measurements of the CBCT dose index indicate that CBCT acquisition delivers less dose than conventional CT scans and electronic portal imaging device portals. The registration software was found to be
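    The consistency check at the heart of such a validation — apply a known rigid displacement to a phantom, then compare the transform the registration reports against it — can be sketched generically as follows. This is an illustration with numpy, not the XVI algorithm; all names and the example numbers are invented.

```python
import numpy as np

def rigid_transform(tx, ty, tz, yaw_deg):
    """4x4 homogeneous matrix: rotation about z by yaw_deg (degrees),
    followed by translation (tx, ty, tz) in mm."""
    a = np.radians(yaw_deg)
    m = np.eye(4)
    m[:3, :3] = [[np.cos(a), -np.sin(a), 0.0],
                 [np.sin(a),  np.cos(a), 0.0],
                 [0.0,        0.0,       1.0]]
    m[:3, 3] = [tx, ty, tz]
    return m

def registration_error(applied, recovered):
    """Residual transform between the known applied displacement and the
    one reported by registration: returns (translation_mm, rotation_deg)."""
    residual = np.linalg.inv(recovered) @ applied
    trans_err = np.linalg.norm(residual[:3, 3])
    # Rotation angle recovered from the trace of the residual rotation block.
    cos_t = (np.trace(residual[:3, :3]) - 1.0) / 2.0
    rot_err = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))
    return trans_err, rot_err

applied = rigid_transform(5.0, -3.0, 2.0, 1.5)    # known couch shift
recovered = rigid_transform(5.2, -3.1, 2.0, 1.4)  # what registration reports
t_err, r_err = registration_error(applied, recovered)
```

    Because a rigid rotation preserves vector norms, the translational residual equals the plain Euclidean distance between the two translation vectors.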

  15. Calibration and correction of LA-ICP-MS and LA-MC-ICP-MS analyses for element contents and isotopic ratios

    Directory of Open Access Journals (Sweden)

    Jie Lin

    2016-06-01

    Full Text Available LA-ICP-MS and LA-MC-ICP-MS have become the techniques of choice for obtaining accurate and precise element contents and isotopic ratios. These state-of-the-art techniques combine low detection limits with high spatial resolution; however, analysis accuracy and precision are restricted by many factors, such as sensitivity drift, elemental/isotopic fractionation, matrix effects, interferences, and the lack of sufficiently matrix-matched reference materials. Thus, rigorous and suitable calibration and correction methods are needed to obtain quantitative data. This review systematically summarizes and evaluates interference-correction, quantitative-calculation, and sensitivity-correction strategies, in order to provide analysts with calibration and correction strategies suited to the sample type and the analyzed elements. The functions and features of the data reduction software ICPMSDataCal are also outlined; it provides real-time, on-line reduction of element content and isotopic ratio data acquired by LA-ICP-MS and LA-MC-ICP-MS.
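    The core of most such quantification schemes is internal-standard normalization against a reference material, which cancels any sensitivity drift common to all masses. The sketch below illustrates that idea only; it is not ICPMSDataCal's implementation, and the function and parameter names are hypothetical.

```python
def internal_standard_conc(cps_el_sam, cps_is_sam, conc_is_sam,
                           cps_el_rm, cps_is_rm, conc_el_rm, conc_is_rm):
    """Element concentration in a sample from raw count rates (cps),
    normalized to an internal standard (is) measured in both the sample
    (sam) and a reference material (rm). A uniform drift factor applied
    to all sample count rates cancels in the cps ratio."""
    # Relative sensitivity of the element vs. the internal standard,
    # determined on the reference material.
    rel_sens = (cps_el_rm / conc_el_rm) / (cps_is_rm / conc_is_rm)
    return (cps_el_sam / cps_is_sam) * conc_is_sam / rel_sens

# Synthetic check: scaling every sample count rate by a drift factor
# leaves the computed concentration unchanged.
drift = 0.85
conc = internal_standard_conc(
    cps_el_sam=1200.0 * drift, cps_is_sam=5000.0 * drift, conc_is_sam=35.0,
    cps_el_rm=1000.0, cps_is_rm=4000.0, conc_el_rm=50.0, conc_is_rm=40.0)
```

    What this simple ratio cannot remove — matrix effects, fractionation that differs between element and internal standard, and spectral interferences — is exactly what the correction strategies surveyed in the review address.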

  16. Patterns for Parallel Software Design

    CERN Document Server

    Ortega-Arjona, Jorge Luis

    2010-01-01

    Essential reading to understand patterns for parallel programming Software patterns have revolutionized the way we think about how software is designed, built, and documented, and the design of parallel software requires you to consider other particular design aspects and special skills. From clusters to supercomputers, success heavily depends on the design skills of software developers. Patterns for Parallel Software Design presents a pattern-oriented software architecture approach to parallel software design. This approach is not a design method in the classic sense, but a new way of managin

  17. Software testing concepts and operations

    CERN Document Server

    Mili, Ali

    2015-01-01

    Explores and identifies the main issues, concepts, principles and evolution of software testing, including software quality engineering and testing concepts, test data generation, test deployment analysis, and software test managementThis book examines the principles, concepts, and processes that are fundamental to the software testing function. This book is divided into five broad parts. Part I introduces software testing in the broader context of software engineering and explores the qualities that testing aims to achieve or ascertain, as well as the lifecycle of software testing. Part II c

  18. The Art of Software Testing

    CERN Document Server

    Myers, Glenford J; Badgett, Tom

    2011-01-01

    The classic, landmark work on software testing The hardware and software of computing have changed markedly in the three decades since the first edition of The Art of Software Testing, but this book's powerful underlying analysis has stood the test of time. Whereas most books on software testing target particular development techniques, languages, or testing methods, The Art of Software Testing, Third Edition provides a brief but powerful and comprehensive presentation of time-proven software testing approaches. If your software development project is mission critical, this book is an investme

  19. A Software for soil quality conservation at organic waste disposal areas: The case of olive mill and pistachio wastes.

    Science.gov (United States)

    Doula, Maria; Sarris, Apostolos; Papadopoulos, Nikos; Hliaoutakis, Aggelos; Kydonakis, Aris; Argyriou, Lemonia; Theocharopoulos, Sid; Kolovos, Chronis

    2016-04-01

    For the sustainable reuse of organic wastes on agricultural land, it is important not only to evaluate waste properties and characteristics extensively but also, in order to protect soil quality, to assess land suitability and estimate the correct application doses prior to waste landspreading. With this precondition in mind, software was developed that integrates GIS maps of land suitability for waste reuse (wastewater and solid waste) with an algorithm that estimates waste doses from soil analysis and, in the case of reuse for fertilization, from soil analysis, irrigation water quality, and plant needs. EU and Member State legislation frameworks are also considered when assessing waste suitability for landspreading and estimating the correct doses, so that no adverse effects are caused to soil or to groundwater (e.g. the Nitrates Directive). Two examples of the software's functionality are presented in this study, using data collected during two LIFE projects: Prosodol, for landspreading of olive mill wastes, and AgroStrat, for pistachio wastes.
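    The dose-estimation step reduces, in its simplest form, to a nutrient balance capped by a regulatory ceiling. The sketch below assumes a nitrogen-based calculation and the Nitrates Directive limit of 170 kg N/ha/yr; the parameter names are hypothetical, and the actual algorithm in the software is considerably more elaborate.

```python
def waste_dose_t_per_ha(plant_n_need, soil_available_n,
                        waste_n_content, n_limit=170.0):
    """Waste application dose (t/ha) chosen so that the nitrogen supplied
    covers the crop's unmet need without exceeding the regulatory ceiling.

    plant_n_need, soil_available_n, n_limit: kg N per ha
    waste_n_content: kg N per tonne of waste
    """
    unmet_need = max(plant_n_need - soil_available_n, 0.0)
    allowed_n = min(unmet_need, n_limit)
    return allowed_n / waste_n_content

# Crop needs 220 kg N/ha, soil supplies 60, waste carries 8 kg N/t:
dose = waste_dose_t_per_ha(220.0, 60.0, 8.0)
```

    In practice the binding constraint may also be phosphorus, salinity, or polyphenol load rather than nitrogen, which is why the software couples the dose algorithm to full soil and waste analyses.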

  20. Ensuring Software IP Cleanliness

    Directory of Open Access Journals (Sweden)

    Mahshad Koohgoli

    2007-12-01

    Full Text Available At many points in the life of a software enterprise, determination of intellectual property (IP) cleanliness becomes critical. The value of an enterprise that develops and sells software may depend on how clean the software is from the IP perspective. This article examines various methods of ensuring software IP cleanliness and discusses some of the benefits and shortcomings of current solutions.
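    Code-matching tools of the kind surveyed in such articles typically fingerprint source fragments and look them up in an index built from known third-party code. The toy sketch below shows only that core idea, not any particular commercial tool; real scanners use far more robust normalization and fingerprint selection.

```python
import hashlib

def fingerprints(source, k=4):
    """Hash every k-line window of whitespace-normalized source."""
    lines = [ln.strip() for ln in source.splitlines() if ln.strip()]
    return {hashlib.sha256("\n".join(lines[i:i + k]).encode()).hexdigest()
            for i in range(max(len(lines) - k + 1, 1))}

def match_ratio(candidate, known_index):
    """Fraction of the candidate's fingerprints found in the index of
    known third-party code; a high ratio flags the file for IP review."""
    fp = fingerprints(candidate)
    return len(fp & known_index) / len(fp) if fp else 0.0

third_party = "a = 1\nb = 2\nc = a + b\nprint(c)\nd = c * 2\n"
index = fingerprints(third_party)
copied = match_ratio(third_party, index)                       # identical file
original = match_ratio("x = 9\ny = 8\nz = 7\nw = 6\nq = 5\n", index)  # unrelated
```

    A nonzero but sub-unity ratio is the interesting case in practice: it suggests partial copying, which then requires human review of provenance and license terms.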