WorldWideScience

Sample records for analysis software applications

  1. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to software architecture. • A template of failure modes for a specific software language is established. • A detailed-level software FMEA analysis of nuclear safety software is presented. - Abstract: A method of software safety analysis for safety-related application software is described in this paper. The target software system is the software code installed in the Automatic Test and Interface Processor (ATIP) of a digital reactor protection system (DRPS). For the ATIP software safety analysis, an overall safety or hazard analysis is first performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA is carried out based on a so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) of the ATIP software. The software FMEA, applied to the ATIP software code after it had been integrated and passed through a very rigorous system test procedure, proved able to provide valuable results (i.e., software defects) that could not be identified during the various system tests.
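
    To illustrate the mechanics of such a failure-mode template, the sketch below expands FBD block instances into FMEA worksheet rows, one row per templated failure mode. The block types, failure modes, and field names are invented assumptions for illustration, not the paper's actual template.

```python
# Illustrative sketch (not the paper's template): a failure-mode template
# keyed to hypothetical FBD function-block types, applied to block instances
# to produce one FMEA worksheet row per failure mode.
FAILURE_MODE_TEMPLATE = {
    "COMPARE_GE": ["output stuck TRUE", "output stuck FALSE", "wrong setpoint input"],
    "AND":        ["output stuck TRUE", "output stuck FALSE"],
    "TIMER_ON":   ["premature timeout", "no timeout"],
}

def fmea_rows(block_instances):
    """Expand each (name, type) FBD block instance into worksheet rows."""
    rows = []
    for name, block_type in block_instances:
        for mode in FAILURE_MODE_TEMPLATE.get(block_type, ["unspecified failure"]):
            rows.append({"block": name, "type": block_type,
                         "failure_mode": mode,
                         "local_effect": "TBD", "system_effect": "TBD"})
    return rows

for row in fmea_rows([("TripLogic1", "AND"), ("PressHi", "COMPARE_GE")]):
    print(row)
```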

  2. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors, which redundantly compare process variables with their corresponding setpoints, and a group of coincidence processors, which generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability study (Software HAZOP), the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA).

  3. Software applications for flux balance analysis.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and its growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and interoperability. Overall, most of the applications are able to perform the basic tasks of model creation and FBA simulation. COBRA toolbox, OptFlux and FASIMU are versatile enough to support advanced in silico algorithms for identifying environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are distinguished by user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux offer flexible environments, as their plug-in/add-on mechanisms aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance for collaborative efforts, was observed among the web-based applications, with the help of advanced web technologies. Furthermore, the most recent applications, such as Model SEED, FAME, MetaFlux and MicrobesFlux, have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, the future directions of FBA applications are briefly discussed for the benefit of potential tool developers.
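
    At its core, FBA is a linear program: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. The toy three-reaction network below, solved with scipy, is an invented illustration of that formulation, not an example taken from any of the reviewed tools.

```python
# Toy flux balance analysis: maximize biomass flux v3 subject to
# steady state S·v = 0 and flux bounds. Invented network for illustration.
import numpy as np
from scipy.optimize import linprog

S = np.array([[1, -1,  0],   # metabolite A: produced by v1, consumed by v2
              [0,  1, -1]])  # metabolite B: produced by v2, consumed by v3
bounds = [(0, 10), (0, 1000), (0, 1000)]  # v1 is the limiting uptake
c = [0, 0, -1]  # linprog minimizes, so negate the biomass objective v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)     # optimal flux distribution, here [10, 10, 10]
print(-res.fun)  # maximal biomass flux = 10
```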

  4. Software safety analysis application in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requests licensees to perform software safety analysis (SSA) and software verification and validation (SV and V) in each phase of the software development life cycle, per Branch Technical Position (BTP) 7-14. In this work, 37 safety-grade digital instrumentation and control (I and C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The FMEA showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  5. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  6. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
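
    As an illustration of the kind of automated dot quantification described above, the sketch below thresholds an image and counts connected bright regions. It is a generic Python/scipy stand-in with invented threshold and size parameters, not IFDOTMETER's actual (Java) algorithm.

```python
# Minimal puncta-counting sketch: threshold, label connected regions,
# filter by size. Illustrative only; not IFDOTMETER's implementation.
import numpy as np
from scipy import ndimage

def count_puncta(image, threshold, min_size=4):
    """Count bright spots at least min_size pixels in area."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [s for s in sizes if s >= min_size]
    return len(keep), keep  # puncta count and their areas in pixels

rng = np.random.default_rng(0)
img = rng.random((64, 64))   # stand-in for a fluorescence image
print(count_puncta(img, 0.995))
```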

  7. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods, a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods are complementary, and further research on combining them, so that software reliability analysis benefits from this complementary effect, is therefore recommended.
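
    To make the state-transition idea concrete: if the software's control flow is modeled as an absorbing Markov chain whose transitions can fail, reliability is the probability of absorption in the success state. The chain below is invented for illustration and is not the report's actual model of the ATM software.

```python
# Absorbing Markov chain sketch of state-transition reliability analysis.
# Transition probabilities are invented for illustration.
import numpy as np

# Transient states: 0=read card, 1=check PIN, 2=dispense cash
Q = np.array([[0.0, 0.98, 0.0],   # transitions among transient states
              [0.0, 0.0, 0.97],
              [0.0, 0.0, 0.0]])
R = np.array([[0.02, 0.0],        # transitions to [failure, success]
              [0.03, 0.0],
              [0.01, 0.99]])

N = np.linalg.inv(np.eye(3) - Q)  # fundamental matrix
B = N @ R                         # absorption probabilities
print("reliability (P[success] from initial state):", B[0, 1])  # ~0.941
```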

  8. Development and applications of Kramers-Kronig PEELS analysis software

    International Nuclear Information System (INIS)

    Fan, X. D.; Peng, J.L.; Bursill, L.A.

    1997-01-01

    A Kramers-Kronig analysis program is developed as a custom function for the GATAN parallel electron energy loss spectroscopy (PEELS) software package EL/P. When used with a JEOL 4000EX high-resolution transmission electron microscope, this program makes it possible to measure the dielectric functions of materials with an energy resolution of approximately 1.4 eV. The imaginary part of the dielectric function is particularly useful, since it allows the magnitude of the band gap to be determined for relatively wide-gap materials. More importantly, changes in the gap may be monitored at high spatial resolution when used in conjunction with HRTEM images. The principles of the method are described and applications are presented for Type-1a gem quality diamond, before and after neutron irradiation. The former shows a band gap of about 5.8 eV, as expected, whereas for the latter the gap appears to be effectively collapsed. The core-loss spectra confirm that Type-1a diamond has pure sp³ tetrahedral bonding, whereas the neutron-irradiated diamond has mixed sp²/sp³ bonding. Analysis of the low-loss spectra for the neutron-irradiated specimen yielded a density of 1.6 g/cm³, approximately half that of diamond. 10 refs., 2 figs
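
    The analysis rests on the standard Kramers-Kronig relation, which recovers the real part of the dielectric function from the measured imaginary part (textbook form, given here for orientation; the paper's numerical implementation is not reproduced):

```latex
% Kramers-Kronig relation (\mathcal{P} denotes the Cauchy principal value)
\varepsilon_1(\omega) = 1 + \frac{2}{\pi}\,
  \mathcal{P}\!\int_0^{\infty}
  \frac{\omega'\,\varepsilon_2(\omega')}{\omega'^{2} - \omega^{2}}\,
  \mathrm{d}\omega'
```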

  9. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three...

  10. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
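
    Two of the simpler measures in this family are easy to state concretely: the Gini coefficient for inequality and Pielou's evenness derived from the Shannon index. The sketch below uses the generic textbook formulas; it is not the authors' code, and the toy inputs are invented.

```python
# Generic inequality/evenness measures of the kind the paper evaluates.
import numpy as np

def gini(x):
    """Gini coefficient via the mean absolute difference."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    return np.abs(x[:, None] - x[None, :]).sum() / (2 * n * n * x.mean())

def shannon_evenness(counts):
    """Shannon index H normalized by its maximum ln(k) over observed classes."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    H = -(p * np.log(p)).sum()
    return H / np.log(p.size)

print(gini([1, 1, 1, 1]))        # 0.0 -- perfect equality
print(shannon_evenness([5, 5]))  # 1.0 -- perfectly even
```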

  11. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    [...] unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of [...]

  12. Software for tomographic analysis: application in ceramic filters

    International Nuclear Information System (INIS)

    Figuerola, W.B.; Assis, J.T.; Oliveira, L.F.; Lopes, R.T.

    2001-01-01

    New methods for acquiring data have been developed with the technological advances. With this, it has been possible to obtain more precise data and, consequently, produce results with greater reliability. Among the variety of acquisition methods available, those that give a volume description, such as CT (Computerized Tomography) and NMR (Nuclear Magnetic Resonance), stand out. Models of volumetric data (groups of data that describe a solid object in three-dimensional space) are being widely used in a diversity of areas as a way of inspecting, modeling and simulating objects in three-dimensional space. Applications of this model are already found in Mechanical Engineering, Geosciences, Medicine and other areas. In engineering it is sometimes necessary to use industrial CT as the only non-invasive way of inspecting the interior of pieces without destroying them. 3D micro focus X-ray tomography is a non-destructive testing technique used in the most diverse areas of science and technology, given its capacity to generate clean images (practically free of the unsharpness effect) and high-resolution reconstructions. The unsharpness effect minimization and spatial resolution improvement are consequences of reducing the focal spot size of the X-ray micro focus tube to dimensions smaller than 50 µm. Ceramic filters are used widely in the metallurgical industry, particularly for cast aluminum, where they are used to remove waste carried in the liquid aluminum. The ceramic filters used in this work are manufactured by FUSICO (a German company) and are constructed from foams. They are manufactured in three models: 10, 20 and 30 ppi (pores per inch). In this paper we present the development of software to analyze and characterize ceramic filters, which can be divided into four stages. This software was developed in the C++ language, using object-oriented programming. It is also capable of being executed on multiple platforms (Windows [...]
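
    As a small example of the kind of measurement such software performs, the sketch below computes the porosity (void fraction) of a binarized voxel volume. The data and threshold are invented; the paper's software is C++ and works on real micro-CT reconstructions.

```python
# Porosity of a binarized tomographic volume: fraction of pore voxels.
# Toy data; illustrative of the measurement, not the paper's code.
import numpy as np

def porosity(volume, solid_threshold):
    """Fraction of voxels below the solid threshold (i.e., pore space)."""
    pores = volume < solid_threshold
    return pores.sum() / volume.size

rng = np.random.default_rng(1)
vol = rng.random((50, 50, 50))   # stand-in for reconstructed voxel data
print(f"porosity: {porosity(vol, 0.3):.2%}")  # ~30% for uniform noise
```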

  13. Analysis of Software Development Methodologies to Build Safety Software Applications for the SATEX-II: A Mexican Experimental Satellite

    Science.gov (United States)

    Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel

    2013-09-01

    Mexico is a country where experience in building software for satellite applications is just beginning. This is a delicate situation, because in the near future we will need to develop software for the SATEX-II (Mexican Experimental Satellite). SATEX-II is a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and SCRUM, in other areas. We analyzed these methodologies and concluded that they can be applied to develop software for the SATEX-II, supported by the ESA PSS-05-0 Standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and on how these methodologies could be used together with the ESA PSS-05-0 Standard. Our outcomes may be of general use to teams who need to build small satellites, but in particular they will be used when we build the on-board software applications for the SATEX-II.

  14. Progress in Addressing DNFSB Recommendation 2002-1 Issues: Improving Accident Analysis Software Applications

    International Nuclear Information System (INIS)

    VINCENT, ANDREW

    2005-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 ("Quality Assurance for Safety-Related Software") identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and for designing and operating controls to prevent or mitigate potential accidents. Over the last year, DOE has begun several processes and programs as part of the Implementation Plan commitments, and in particular has made significant progress in addressing several sets of issues particularly important in the application of software for performing hazard and accident analysis. The work discussed here demonstrates that through these actions, Software Quality Assurance (SQA) guidance and software tools are available that can be used to improve the resulting safety analysis. Specifically, five of the primary actions corresponding to the commitments made in the Implementation Plan to Recommendation 2002-1 are identified and discussed in this paper. Included are the web-based DOE SQA Knowledge Portal and the Central Registry, guidance and gap analysis reports, an electronic bulletin board and discussion forum, and a DOE safety software guide. These SQA products can benefit DOE safety contractors in the development of hazard and accident analyses by precluding inappropriate software applications and by utilizing best practices when incorporating software results into safety basis documentation. The improvement actions discussed here mark a beginning in establishing stronger, standard-compliant programs, practices, and processes in SQA among safety software users, managers, and reviewers throughout the DOE Complex. Additional effort is needed, however, particularly in: (1) processes to add new software applications to the DOE Safety Software Toolbox; (2) improving the effectiveness of software issue communication; and (3) promoting a safety software quality assurance culture.

  15. Software and package applicating for network meta-analysis: A usage-based comparative study.

    Science.gov (United States)

    Xu, Chang; Niu, Yuming; Wu, Junyi; Gu, Huiyun; Zhang, Chao

    2017-12-21

    To compare and analyze the characteristics and functions of software applications for network meta-analysis (NMA). PubMed, EMbase, The Cochrane Library, the official websites of Bayesian inference Using Gibbs Sampling (BUGS), Stata and R, and Google were searched to collect software and packages for performing NMA; software and packages published up to March 2016 were included. After collecting the software, the packages, and their user guides, we used them to calculate a typical example. All characteristics, functions, and computed results were compared and analyzed. Ten types of software were included, comprising programming and non-programming software, developed mainly on Bayesian or frequentist theory. Most types of software are easy to operate and master, calculate exactly, or graph excellently. However, no single software performed accurate calculations with superior graphing; this could only be achieved by combining two or more types of software. This study suggests that users should choose the appropriate software according to their programming background, operational habits, and financial resources, and then consider the combination of BUGS and R (or Stata) to perform the NMA. © 2017 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  16. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enable oil and gas producers to characterize reservoirs, estimate reserves, forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted, along with their sophisticated software tools, the applications, and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provide control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, FAST Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization, modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  17. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    Science.gov (United States)

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) satellite for astrometry observations in Japan, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to the Hipparcos mission, Gaia's predecessor some 20 years ago. Performing real scientific observations with such a small satellite is challenging. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail, along with the practical steps necessary to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.

  18. To the problem of regulating software applicability for the analysis of domestic reactor accidents

    International Nuclear Information System (INIS)

    Kim, V.V.; Skalozubov, V.I.

    1999-01-01

    Based on consideration and generalization of the results of verification and validation research, the necessity of developing objective criteria for evaluating the applicability of software (calculation codes) to particular types of domestic reactor accidents is justified. These criteria should be used in a normative document on the certification, or on the order of application, of calculation codes for reactor safety analysis.

  19. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods were built upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  20. PROMETHEE Method and Sensitivity Analysis in the Software Application for the Support of Decision-Making

    Directory of Open Access Journals (Sweden)

    Petr Moldrik

    2008-01-01

    PROMETHEE is one of the methods of multi-criteria analysis (MCA). MCA, as the name indicates, deals with the evaluation of particular variants according to several criteria. The software application developed for the support of multi-criteria decision-making (MCA8) was upgraded with the PROMETHEE method and with a graphic tool that enables the execution of a sensitivity analysis. This analysis is used to ascertain how a given model output depends upon the input parameters. The MCA8 software application with the mentioned graphic upgrade was developed for the purpose of solving multi-criteria decision tasks. In MCA8 it is possible to perform sensitivity analysis in a simple form, through column graphs: criteria significances (weights) can be changed directly in these column graphs while the resulting changes in the order of variants are watched immediately.
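
    For orientation, the sketch below implements PROMETHEE II with the simplest ("usual") preference function and re-ranks three variants under two weight vectors, mimicking the weight-change sensitivity analysis MCA8 offers through its column graphs. Scores and weights are invented examples, not MCA8 code.

```python
# Compact PROMETHEE II sketch with the "usual" preference function,
# plus a weight-change re-ranking as a toy sensitivity analysis.
import numpy as np

def promethee_net_flows(scores, weights):
    """scores[i, j]: value of variant i on criterion j (higher = better)."""
    n = scores.shape[0]
    d = scores[:, None, :] - scores[None, :, :]  # pairwise differences
    pref = (d > 0).astype(float)                 # "usual" preference function
    pi = (pref * weights).sum(axis=2)            # weighted preference index
    phi_plus = pi.sum(axis=1) / (n - 1)          # positive outranking flow
    phi_minus = pi.sum(axis=0) / (n - 1)         # negative outranking flow
    return phi_plus - phi_minus                  # net flow, used for ranking

scores = np.array([[7.0, 3.0, 9.0],
                   [8.0, 5.0, 4.0],
                   [6.0, 8.0, 6.0]])
for w in ([0.5, 0.3, 0.2], [0.2, 0.3, 0.5]):     # sensitivity: change weights
    print(w, "ranking:", np.argsort(-promethee_net_flows(scores, np.array(w))))
```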

  1. Study of gamma-ray analysis software. Application to activation analysis of geological samples

    International Nuclear Information System (INIS)

    Silva, Luiz Roberto Nogueira da

    1998-01-01

    A comparative evaluation of the gamma-ray analysis software VISPECT against two commercial gamma-ray analysis software packages, OMNIGAM (EG and G Ortec) and SAMPO 90 (Canberra), was performed. For this evaluation, artificial gamma-ray spectra were created, presenting peaks of different intensities located at four different regions of the spectrum. Multiplet peaks with equal and different intensities, but with different channel separations, were also created. The results obtained showed a good performance of VISPECT in detecting and analysing single and multiplet peaks of different intensities in the gamma-ray spectrum. Neutron activation analysis of the geological reference material GS-N (IWG-GIT) and of the granite G-94, used in a Proficiency Testing Trial of Analytical Geochemistry Laboratories, was also performed, in order to evaluate the VISPECT software in the analysis of real samples. The results obtained using VISPECT were as good as or better than those obtained using the other programs. (author)
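
    A flavor of the benchmark task can be given in a few lines: build a synthetic spectrum with a doublet and a single peak, then run an automated peak search. This uses a generic scipy routine and invented peak parameters; it is not the algorithm of VISPECT, OMNIGAM, or SAMPO 90.

```python
# Synthetic gamma-ray spectrum (smooth background + Gaussian peaks,
# including a doublet) and a generic automated peak search.
import numpy as np
from scipy.signal import find_peaks

channels = np.arange(1024)
spectrum = 50 * np.exp(-channels / 400.0)  # smooth background
for center, area in [(200, 400), (210, 250), (600, 300)]:  # 200/210: doublet
    spectrum += area / (2.5 * np.sqrt(2 * np.pi)) * np.exp(
        -0.5 * ((channels - center) / 2.5) ** 2)
spectrum = np.random.default_rng(2).poisson(spectrum)  # counting statistics

peaks, _ = find_peaks(spectrum, prominence=15, width=2)
print("peaks found near channels:", peaks)
```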

  2. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and inter-institutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that address both issues. A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, combining C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and of the software tool for the analysis of large data sets.
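
    A condensed sketch of that statistical battery, applied to an invented toy data set, is given below: a Youden-index threshold from the ROC construction, then a contingency table with Fisher's exact test, a Welch t-test, and a Kolmogorov-Smirnov test. It illustrates the combination of tests, not the authors' C#.Net/R implementation.

```python
# Toy dose-response screening: ROC-derived threshold, then three tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
dose = np.r_[rng.normal(52, 8, 40), rng.normal(64, 8, 40)]  # invented doses
event = np.r_[np.zeros(40, int), np.ones(40, int)]          # invented outcomes

# ROC: pick the threshold maximizing Youden's J = sensitivity + specificity - 1
thresholds = np.unique(dose)
J = [((dose >= t) & (event == 1)).sum() / 40
     + ((dose < t) & (event == 0)).sum() / 40 - 1 for t in thresholds]
t_star = thresholds[int(np.argmax(J))]

above = dose >= t_star  # 2x2 contingency table: threshold split vs. outcome
table = [[int((above & (event == 1)).sum()), int((above & (event == 0)).sum())],
         [int((~above & (event == 1)).sum()), int((~above & (event == 0)).sum())]]
print("threshold:", round(t_star, 1))
print("Fisher exact p:", stats.fisher_exact(table)[1])
print("Welch t p:", stats.ttest_ind(dose[event == 1], dose[event == 0],
                                    equal_var=False).pvalue)
print("KS p:", stats.ks_2samp(dose[event == 1], dose[event == 0]).pvalue)
```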

  3. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, and of control selection and classification, and to standardize analysis reporting at the Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process, which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads, and hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes, and is being used to support the development of 14 hazard analyses. User responses have been positive, with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures and has significantly improved the efficiency and standardization of the hazard analysis process.

  4. VANESA - A Software Application for the Visualization and Analysis of Networks in Systems Biology Applications

    Directory of Open Access Journals (Sweden)

    Brinkrolf Christoph

    2014-06-01

    VANESA is modeling software for the automatic reconstruction and analysis of biological networks based on life-science database information. Using VANESA, scientists are able to model any kind of biological process or system as a biological network. It is now possible for scientists to automatically reconstruct important molecular systems with information from the databases KEGG, MINT, IntAct, HPRD, and BRENDA. Additionally, experimental results can be expanded with database information to better analyze the investigated elements and processes in an overall context. Users also have the possibility to use graph-theoretical approaches in VANESA to identify regulatory structures and significant actors within the modeled systems. These structures can then be further investigated in the Petri net environment of VANESA. It is platform-independent, free of charge, and available at http://vanesa.sf.net.

  5. On-Orbit Software Analysis

    Science.gov (United States)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.

  6. Application of software quality assurance methods in validation and maintenance of reactor analysis computer codes

    International Nuclear Information System (INIS)

    Reznik, L.

    1994-01-01

    Various computer codes employed at the Israel Electricity Company for preliminary reactor design analysis and fuel cycle scoping calculations have often been subject to program source modifications. Although most changes were due to computer or operating system compatibility problems, a number of significant modifications were due to model improvements and enhancements of algorithm efficiency and accuracy. With the growing acceptance of software quality assurance requirements and methods, a program of extensive testing of modified software has been adopted within the regular maintenance activities. In this work, a survey has been performed of various software testing methods, which belong mainly to the two major categories of implementation-based ('white box') and specification-based ('black box') testing. The results of this survey exhibit a clear preference for specification-based testing. In particular, the equivalence class partitioning method and the boundary value method have been selected as functional methods especially suitable for testing reactor analysis codes. A separate study of software analysis and specification methods has also been performed, with the objective of establishing appropriate pre-test software specification methods. Two methods have been selected as the most suitable for this purpose: data flow diagrams, shown to be particularly valuable for functional/procedural software specification, and entity-relationship diagrams, found to be efficient for specifying the software data/information domain. The feasibility of these two methods has been analyzed, in particular for software uncertainty analysis and overall code accuracy estimation. (author). 14 refs
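
    To make the two selected black-box techniques concrete, the sketch below generates test inputs for a numeric parameter validated to lie in a range: one representative per equivalence class plus the boundary values. The range and representatives are an invented example, not drawn from the surveyed codes.

```python
# Equivalence class partitioning + boundary value analysis for a
# numeric input constrained to [lo, hi]. Invented example values.
def test_inputs(lo, hi):
    """Classes: below range, in range, above range;
    boundaries: lo-1, lo, lo+1, hi-1, hi, hi+1."""
    representatives = [lo - 10, (lo + hi) // 2, hi + 10]
    boundaries = [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]
    return representatives + boundaries

# e.g., a hypothetical reactor-analysis input validated to lie in [0, 100]:
print(test_inputs(0, 100))  # [-10, 50, 110, -1, 0, 1, 99, 100, 101]
```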

  7. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  8. A software engineering process for safety-critical software application

    International Nuclear Information System (INIS)

    Kang, Byung Heon; Kim, Hang Bae; Chang, Hoon Seon; Jeon, Jong Sun

    1995-01-01

    Application of computer software to safety-critical systems is on the increase. To be successful, the software must be designed and constructed to meet the functional and performance requirements of the system. For safety reasons, the software must be demonstrated not only to meet these requirements, but also to operate safely as a component within the system. For longer-term cost considerations, the software must be designed and structured to ease future maintenance and modifications. This paper presents a software engineering process for the production of safety-critical software for a nuclear power plant. The presentation is expository in nature, describing a viable process for high-quality safety-critical software development. It is based on the ideas of a rational design process and on the experience of adapting such a process in the production of the safety-critical software for shutdown system number two of the Wolsung 2, 3 and 4 nuclear power generation plants. This process is significantly different from a conventional process in terms of its rigorous software development phases and software design techniques. The process covers documentation, design, verification and testing, using mathematically precise notations and a highly reviewable tabular format to specify software requirements, and verifying software design against software requirements, and code against software design, using static analysis. The software engineering process described in this paper applies the principle of information-hiding decomposition in software design, using a modular design technique, so that when a change is required or an error is detected, the affected scope can be readily and confidently located. It also facilitates a high degree of confidence in the 'correctness' of the software production, and provides a relatively simple and straightforward code implementation effort. 1 fig., 10 refs. (Author)

  9. Application of Artificial Intelligence technology to the analysis and synthesis of reliable software systems

    Science.gov (United States)

    Wild, Christian; Eckhardt, Dave

    1987-01-01

    The development of a methodology for the production of highly reliable software is one of the greatest challenges facing the computer industry. Meeting this challenge will undoubtedly involve the integration of many technologies. This paper describes the use of Artificial Intelligence technologies in the automated analysis of the formal algebraic specifications of abstract data types. These technologies include symbolic execution of specifications using techniques of automated deduction, and machine learning through the use of examples. Ongoing research into the role of knowledge representation and problem solving in the process of developing software is also discussed.

  10. Software testability and its application to avionic software

    Science.gov (United States)

    Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffery E.

    1993-01-01

    Randomly generated black-box testing is an established yet controversial method of estimating software reliability. Unfortunately, as software applications have required higher reliabilities, practical difficulties with black-box testing have become increasingly problematic. These practical problems are particularly acute in life-critical avionics software, where requirements of 10^-7 failures per hour of system reliability can translate into a probability of failure (POF) of perhaps 10^-9 or less for each individual execution of the software. This paper describes the application of one type of testability analysis called 'sensitivity analysis' to B-737 avionics software; one application of sensitivity analysis is to quantify whether software testing is capable of detecting faults in a particular program, and thus whether we can be confident that a tested program is not hiding faults. We do so by finding the testabilities of the individual statements of the program, and then use those statement testabilities to find the testabilities of the functions and modules. For the B-737 system we analyzed, we were able to isolate those functions that are more prone to hide errors during system/reliability testing.
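
    The flavor of sensitivity analysis can be shown with a toy propagation estimate: perturb the state at one location over many random inputs and count how often the output changes. The program below is invented for illustration (its output discards most of the internal state, so the estimated propagation probability is low, i.e., the location can hide faults); it is not the actual analysis applied to the B-737 software.

```python
# Toy propagation-probability estimate for one program location:
# inject a perturbation and measure how often the output differs.
import random

def program(x, perturb=False):
    y = x * 2
    if perturb:
        y += 1                        # injected perturbation at the location
    return 1 if y > 1000 else 0       # output "loses" most of y's information

random.seed(4)
trials = 10000
propagated = sum(program(x) != program(x, perturb=True)
                 for x in (random.randint(0, 1000) for _ in range(trials)))
# Low value (~0.001 here) means faults at this location rarely reach the
# output -- exactly the situation that makes black-box testing unreliable.
print("estimated propagation probability:", propagated / trials)
```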

  11. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    Science.gov (United States)

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatograph / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously-published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.

  12. Software for safety critical applications

    International Nuclear Information System (INIS)

    Kropik, M.; Matejka, K.; Jurickova, M.; Chudy, R.

    2001-01-01

    The contribution gives an overview of a project on software development for safety-critical applications, carried out since 1997. The principal goal of the project was to establish a research laboratory for the development of software meeting the highest requirements for quality and reliability. This laboratory was established at the department and equipped with proper hardware and software to support software development. A research team of predominantly young researchers for software development was created. The activities of the research team started with studying and proposing a software development methodology. This methodology was then applied to real software development. The verification and validation process followed the software development. The validation system for the integrated hardware and software tests was brought into being, and its control software was developed. The quality of the software tools was also monitored, and the SOSAT tool was used during these activities. National and international contacts were established and maintained during the project. (author)

  13. SOFTWARE FOR DESIGNING PARALLEL APPLICATIONS

    Directory of Open Access Journals (Sweden)

    M. K. Bouza

    2017-01-01

    The objects of research are tools to support the development of parallel programs in C/C++. Methods and software which automate the process of designing parallel applications are proposed.

  14. Software qualification in safety applications

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    2000-01-01

    The developers of safety-critical instrumentation and control systems must qualify the design of the components used, including the software in the embedded computer systems, in order to ensure that the component can be trusted to perform its safety function under the full range of operating conditions. There are well-known ways to qualify analog systems, using the facts that: (1) they are built from standard modules with known properties; (2) design documents are available and described in a well-understood language; (3) the performance of the component is constrained by physics; and (4) physics models exist to predict the performance. These properties are not generally available for qualifying software, and one must fall back on extensive testing and qualification of the design process. Neither of these is completely satisfactory. The research reported here explores an alternative approach that is intended to permit qualification for an important subset of instrumentation software. The research goal is to determine if a combination of static analysis and limited testing can be used to qualify a class of simple, but practical, computer-based instrumentation components for safety application. These components are of roughly the complexity of a motion detector alarm controller. This goal is accomplished by identifying design constraints that enable meaningful analysis and testing. Once such design constraints are identified, digital systems can be designed to allow for analysis and testing, or existing systems may be tested for conformance to the design constraints as a first step in a qualification process. This will considerably reduce the cost and monetary risk involved in qualifying commercial components for safety-critical service.

  15. Software Application for Data Collection and Analysis in Acute Myeloid Leukemia

    Directory of Open Access Journals (Sweden)

    Anca BACÂREA

    2011-03-01

    Aim: In the context of developments in informatics and in medical research, it is important that new software technology be integrated in order to make research easier. The aim of this study was to develop a software application that uses few resources and that enables data collection, primary statistical processing (e.g., mean, median), drawing of survival curves, and Log Rank statistical testing of survival according to the collected parameters. Material and Method: For this purpose, a database was developed in SQLite3. Because the database engine is embedded in the Database Management System (DBMS), the program is fully portable. The graphical interface was made in wxWidgets. Statistical calculations were obtained using the R software (the add-on e1071 was used for descriptive statistics, survival for survival testing, and Northest for the Kaplan-Meier survival curve). Patients were cases admitted and treated in the Hematology Department of the County Emergency Hospital Tîrgu Mureş during 2007-2010. Results: We created a GUI in wxWidgets to collect the desired medical data: age, date of diagnosis, date of death, blood count values, and the CD leukocyte markers detected by flow cytometry. Entwining medical data collection with statistical processing (for acute myeloid leukemia: survival, evaluation of prognostic factors) is a further step in medical research. Conclusion: The tool presented is useful for research. Its application to acute myeloid leukemia derives from the author's interest in the subject; development of this tool in other directions is possible and desirable.
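
    For reference, the Kaplan-Meier estimator that the application obtains from R's survival package reduces to the product S(t) = Π (1 - d_i/n_i) over event times. The from-scratch sketch below uses six invented (time, event) records, not the study's patient data.

```python
# From-scratch Kaplan-Meier estimator (invented toy data).
import numpy as np

def kaplan_meier(times, events):
    """Return [(t, S(t))] with S(t) = prod over event times of (1 - d_i/n_i)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    surv, curve = 1.0, []
    for t in np.unique(times):
        d = int(((times == t) & (events == 1)).sum())  # events at time t
        n = int((times >= t).sum())                    # number still at risk
        if d:
            surv *= 1 - d / n
        curve.append((float(t), surv))
    return curve

# months of follow-up; event=1 is death, event=0 is censored
print(kaplan_meier([5, 8, 12, 12, 20, 30], [1, 1, 1, 0, 1, 0]))
```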

  16. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  17. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, the most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  18. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up from standard software components in the same way as a hardware system is built up from standard hardware components. Such systems are often used in the control of NPPs, including in safety-related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is and what is particular about the reliability assessment of such software. Two techniques very commonly used in traditional reliability analysis, viz. failure mode, effects and criticality analysis (FMECA) and fault tree analysis, are investigated. A real example is used to illustrate the methods discussed. Various aspects relevant to the assessment of software reliability in such systems are discussed. Finally, some models for quantitative software reliability assessment applicable to configurable software systems are described. (author)

  19. APPLICATION OF IMPRECISE MODELS IN ANALYSIS OF RISK MANAGEMENT OF SOFTWARE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-11-01

    An analysis of the functional completeness of existing detection systems was conducted. It made it possible to identify information systems with a similar feature set; to assess their degree of similarity and the degree to which their means match the "standard" model of a risk management system that takes into account the recommended ICAO practices and standards on aviation safety; and to justify the advisability of creating a decision-making support system that uses imprecise models and imprecise logic for risk analysis in aviation activities. Imprecise models have a number of useful features: they can take into account experts' intuition and experience; they can model flight safety management processes more adequately and obtain accurate decisions that correlate with the initial data; they support the rapid development of a safety management system whose functionality can later grow in complexity; and their hardware and software implementation in control and decision-making systems is less sophisticated than that of classical algorithms.

  20. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described, and the most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study: formal methods are needed in the analysis and development of software-based systems; qualitative reliability engineering methods are applicable in connection with PSA; and the requirements for software-based systems and their analyses in the regulatory guides should be made more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  1. Tired of Teaching Software Applications?

    Science.gov (United States)

    Lippert, Susan K.; Granger, Mary J.

    Many university business schools have an instructor-led course introducing computer software application packages. This course is often required for all undergraduates and is a prerequisite to other courses, such as accounting, finance, marketing, and operations management. Knowledge and skills gained in this course should enable students not only…

  2. The Study on Neuro-IE Management Software in Manufacturing Enterprises: The Application of Video Analysis Technology

    Science.gov (United States)

    Bian, Jun; Fu, Huijian; Shang, Qian; Zhou, Xiangyang; Ma, Qingguo

    This paper analyzes the outstanding problems in current industrial production by reviewing the three stages of industrial engineering development. Based on investigations and interviews in enterprises, we propose the new idea of applying computer video analysis technology in new industrial engineering management software, and of adding a "loose coefficient" for each work station to this software in order to arrange production scientifically and humanely. Meanwhile, we suggest utilizing biofeedback technology to promote further research on the rules governing workers' physiological, psychological and emotional changes in production. This new kind of combination will push forward industrial engineering theories and benefit enterprises in progressing towards flexible social production; thus it will be of great value in theoretical innovation, social significance and application.

  3. The application of the SXF lattice description and the UAL software environment to the analysis of the LHC

    CERN Document Server

    Fischer, W; Ptitsyn, V I

    1999-01-01

    A software environment for accelerator modeling has been developed which includes the UAL (Unified Accelerator Library), a collection of accelerator physics libraries with a Perl interface for scripting, and the SXF (Standard eXchange Format), a format for accelerator description which extends the MAD sequence by including deviations from design values. SXF interfaces have been written for several programs, including MAD9 and MAD8 via the doom database, Cosy, TevLat and UAL itself, which includes Teapot++. After an overview of the software, we describe the application of the tools to the analysis of LHC lattice stability in the presence of alignment and coupling errors, and to the correction of the first turn and closed orbit in the machine. (7 refs).

  4. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis with database systems, which makes it a widely applicable system that demands advanced skills. There is a great deal of commercial GIS software that is well advertised and whose functionality is fairly well known, while open source software tends to be overlooked. This diploma work analyzes the open source GIS software available on the Internet, in the scope of different projects interr...

  5. Fault tree analysis of KNICS RPS software

    International Nuclear Information System (INIS)

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase, and its analysis results, for the safety-critical software of a digital reactor protection system, called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, software HAZOP (Hazard and Operability) and software FTA were employed in the SSA in such a way that the software HAZOP was performed first and the software FTA was then applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis
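
    As a rough illustration of what evaluating such a fault tree involves (not the KNICS templates themselves), the sketch below propagates basic-event probabilities through AND/OR gates, assuming independent events; the tree structure and probabilities are invented.

```python
from math import prod

def gate_prob(node):
    """Recursively evaluate a fault tree node.

    A node is either a float (a basic event probability) or a tuple
    ("AND" | "OR", [children]); basic events are assumed independent.
    """
    if isinstance(node, float):
        return node
    kind, children = node
    probs = [gate_prob(c) for c in children]
    if kind == "AND":
        return prod(probs)
    # OR gate: 1 minus the product of complements
    return 1.0 - prod(1.0 - p for p in probs)

# Hypothetical top event: the trip signal fails only if both redundant
# comparison paths fail; each path fails on a logic fault OR a stale input.
tree = ("AND", [("OR", [1e-4, 5e-5]), ("OR", [1e-4, 5e-5])])
print(f"top event probability = {gate_prob(tree):.3e}")  # about 2.2e-08
```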

  6. An Application of Alloy to Static Analysis for Secure Information Flow and Verification of Software Systems

    National Research Council Canada - National Science Library

    Shaffer, Alan B

    2008-01-01

    Within a multilevel secure (MLS) system, flaws in design and implementation can result in overt and covert channels, both of which may be exploited by malicious software to cause unauthorized information flows...

  7. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of the reading, review and analysis of books, journals and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  8. Application of the thermal efficiency analysis software 'EgWin' at existing power plants

    International Nuclear Information System (INIS)

    Koda, E.; Takahashi, T.; Nakao, Y.

    2008-01-01

    'EgWin' is a general-purpose software package developed at CRIEPI for analyzing the thermal efficiency of power systems. The software has been used to analyze more than 30 existing power generation units, and its effectiveness has been confirmed. In thermal power plants it was used to clarify the factors behind decreases in thermal efficiency and to estimate quantitatively the influence of each factor on the thermal efficiency of the plant. It was also used to estimate quantitatively the effects of changes in operating conditions and of facility remodeling in thermal, nuclear and geothermal power plants. (author)

  9. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    Directory of Open Access Journals (Sweden)

    Roger P. Pawlowski

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
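
    The central idea, writing the physical model once while derivative information is accumulated transparently through overloaded operators, can be pictured with the small Python dual-number class below. This is a loose analogue of the C++ template approach the paper describes, not the actual implementation.

```python
class Dual:
    """Forward-mode AD value: val carries f(x), dot carries f'(x)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)  # product rule
    __rmul__ = __mul__

def residual(u):
    # The "physical model" is written once, with no derivative code in it.
    return u * u * u + 2.0 * u

u = Dual(1.5, 1.0)            # seed the derivative du/du = 1
r = residual(u)
print(r.val, r.dot)           # 6.375 and 3*1.5**2 + 2 = 8.75
```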

  10. Characterization of analysis activity in the development of object-oriented software. Application to an examination system in nuclear medicine

    International Nuclear Information System (INIS)

    Bayas, Marcos Raul Cordova.

    1995-01-01

    The object-oriented approach, formerly proposed as an alternative to conventional software coding techniques, has expanded its scope to other phases in software development, including the analysis phase. This work discusses basic concepts and major object oriented analysis methods, drawing comparisons with structured analysis, which has been the dominant paradigm in systems analysis. The comparison is based on three interdependent system aspects, that must be specified during the analysis phase: data, control and functionality. The specification of a radioisotope examination archive system is presented as a case study. (author). 45 refs., 87 figs., 1 tab

  11. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    International Nuclear Information System (INIS)

    Menezes, Mario Olimpio de

    2005-01-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high quality FOSS, available also for free, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB - free alternatives to MATLAB; and Gnuplot - a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of simple programming. (author)

  12. Free/open source software: a study of some applications for scientific data analysis of nuclear experiments

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Mario Olimpio de [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: mario@ipen.br; mo.menezes@gmail.com

    2005-07-01

    Free/Open Source Software (FOSS) has been used in science since long before the formal social movement known as 'Free Software/Open Source Software' came into existence. After the Personal Computer (PC) boom in the 80s, commercial closed source software became widely available to scientists for data analysis on this platform. In this paper, we study some high quality FOSS, available also for free, that can be used for complex data analysis tasks. We show the results and the data analysis process, aiming to expose the high quality and high productivity of both, while highlighting the different approaches used in some of the FOSS. We show that scientists today have in FOSS a viable, high quality alternative to commercial closed source software which, besides being ready to use, also offers the possibility of extensive customization or extension to fit the very particular needs of many fields of scientific data analysis. Among the FOSS, we study in this paper GNU Octave and SCILAB - free alternatives to MATLAB; and Gnuplot - a free alternative to ORIGIN-like software. We also show that scientists have invaluable resources in modern FOSS programming languages such as Python and Perl, which can be used for data analysis and manipulation, allowing very complex tasks to be done automatically with a few lines of simple programming. (author)
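
    As a flavour of the "few lines of simple programming" the author refers to, here is a minimal Python sketch using NumPy (one of the FOSS tools in the ecosystem discussed); the data are synthetic, and the fit is an ordinary least-squares line.

```python
import numpy as np

# Synthetic "experimental" data: y = 2.0*x + 1.0 plus Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.size)

slope, intercept = np.polyfit(x, y, 1)   # least-squares straight-line fit
print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
```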

  13. Transana Video Analysis Software as a Tool for Consultation: Applications to Improving PTA Meeting Leadership

    Science.gov (United States)

    Rush, Craig

    2012-01-01

    The chief aim of this article is to illustrate the potential of using Transana, a qualitative video analysis tool, for effective and efficient school-based consultation. In this illustrative study, the Transana program facilitated analysis of excerpts of video from a representative sample of Parent Teacher Association (PTA) meetings over the…

  14. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  15. Software engineering processes principles and applications

    CERN Document Server

    Wang, Yingxu

    2000-01-01

    Fundamentals of the Software Engineering Process; Introduction; A Unified Framework of the Software Engineering Process; Process Algebra; Process-Based Software Engineering; Software Engineering Process System Modeling; The CMM Model; The ISO 9001 Model; The BOOTSTRAP Model; The ISO/IEC 15504 (SPICE) Model; The Software Engineering Process Reference Model: SEPRM; Software Engineering Process System Analysis; Benchmarking the SEPRM Processes; Comparative Analysis of Current Process Models; Transformation of Capability Levels Between Current Process Models; Software Engineering Process Establishment; Software Process Establish

  16. Portable Medical Laboratory Applications Software

    OpenAIRE

    Silbert, Jerome A.

    1983-01-01

    Portability implies that a program can be run on a variety of computers with minimal software revision. The advantages of portability are outlined and design considerations for portable laboratory software are discussed. Specific approaches for achieving this goal are presented.

  17. GEnomes Management Application (GEM.app): a new software tool for large-scale collaborative genome analysis.

    Science.gov (United States)

    Gonzalez, Michael A; Lebrigio, Rafael F Acosta; Van Booven, Derek; Ulloa, Rick H; Powell, Eric; Speziani, Fiorella; Tekin, Mustafa; Schüle, Rebecca; Züchner, Stephan

    2013-06-01

    Novel genes are now identified at a rapid pace for many Mendelian disorders, and increasingly, for genetically complex phenotypes. However, new challenges have also become evident: (1) effectively managing larger exome and/or genome datasets, especially for smaller labs; (2) direct hands-on analysis and contextual interpretation of variant data in large genomic datasets; and (3) many small and medium-sized clinical and research-based investigative teams around the world are generating data that, if combined and shared, will significantly increase the opportunities for the entire community to identify new genes. To address these challenges, we have developed GEnomes Management Application (GEM.app), a software tool to annotate, manage, visualize, and analyze large genomic datasets (https://genomics.med.miami.edu/). GEM.app currently contains ∼1,600 whole exomes from 50 different phenotypes studied by 40 principal investigators from 15 different countries. The focus of GEM.app is on user-friendly analysis for nonbioinformaticians to make next-generation sequencing data directly accessible. Yet, GEM.app provides powerful and flexible filter options, including single family filtering, across family/phenotype queries, nested filtering, and evaluation of segregation in families. In addition, the system is fast, obtaining results within 4 sec across ∼1,200 exomes. We believe that this system will further enhance identification of genetic causes of human disease. © 2013 Wiley Periodicals, Inc.
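
    A single-family segregation filter of the kind GEM.app offers can be pictured as follows; this is a schematic illustration with made-up variant records and a simple dominant model, not GEM.app's actual query engine.

```python
# Each variant record carries a genotype per family member
# (0 = reference, 1 = heterozygous, 2 = homozygous alternate).
variants = [
    {"gene": "GENE_A", "genotypes": {"mother": 1, "father": 0, "child": 1}},
    {"gene": "GENE_B", "genotypes": {"mother": 0, "father": 0, "child": 1}},
]
affected = {"mother", "child"}
unaffected = {"father"}

def segregates_dominant(variant):
    """Keep variants carried by every affected and no unaffected member."""
    g = variant["genotypes"]
    return (all(g[p] > 0 for p in affected)
            and all(g[p] == 0 for p in unaffected))

hits = [v["gene"] for v in variants if segregates_dominant(v)]
print(hits)  # ['GENE_A']
```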

  18. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this medical department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to incorporate machine learning technology (neural networks) into the software; this will make the calculation of teleroentgenograms easier because the methodological points will be placed automatically.
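
    The geometric core of such a calculation, an angle defined by manually placed landmark points, can be sketched as below; the landmark names and pixel coordinates are invented for illustration and are not from the described software.

```python
import numpy as np

def angle_deg(a, b, c):
    """Angle ABC (at vertex b) in degrees, from 2-D landmark points."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

# Hypothetical cephalometric landmarks (pixels): sella, nasion, A-point
sella, nasion, a_point = (100, 80), (220, 60), (230, 160)
print(f"SNA-like angle: {angle_deg(sella, nasion, a_point):.1f} degrees")
```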

  19. APPLICATION OF THE SPECTRUM ANALYSIS WITH USING BERG METHOD TO DEVELOPED SPECIAL SOFTWARE TOOLS FOR OPTICAL VIBRATION DIAGNOSTICS SYSTEM

    Directory of Open Access Journals (Sweden)

    E. O. Zaitsev

    2016-01-01

    The objective of this paper is the development and experimental verification of special software for spectral analysis of the vibrations of monitored objects. The spectral analysis of vibration is based on the maximum-entropy autoregressive method of spectral analysis using the Berg algorithm. The measured signals first undergo a preliminary analysis based on regression analysis, which eliminates uninformative components such as noise and trend; special software tools were developed for this preliminary analysis. Non-contact measurement of the mechanical vibration parameters of rotating, diffusely-reflecting surfaces is used in circumstances where the use of contact sensors is difficult or impossible for a number of reasons, including lack of access to the object, the small size of the controlled area, or a controlled portion that has a high temperature or is affected by strong electromagnetic fields. For this monitoring a laser measuring system is proposed, which overcomes the shortcomings of interferometric or Doppler optical measuring systems, such as in measuring large-amplitude and inharmonic vibrations. On the basis of the proposed methods, special software tools were developed for use with the laser measuring system, implemented in LabVIEW. Experimental verification of the proposed method of vibration signal processing was carried out by analyzing the diagnostic information obtained by measuring the vibration of a system grinding cold solid tungsten-containing alloy TK8 with a diamond wheel. The result produced by the special software tools was a complex spectrum 'purified' of non-informative parameters, corresponding to the vibration process of the observed object.
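
    The two processing stages the abstract describes, regression-based removal of trend followed by a maximum-entropy autoregressive spectral estimate (the recursion is usually spelled "Burg" in the signal-processing literature), can be sketched compactly in NumPy. This is a generic textbook implementation on a synthetic signal, not the authors' LabVIEW code.

```python
import numpy as np

def detrend(x, deg=1):
    """Preliminary regression analysis: remove a polynomial trend."""
    n = np.arange(x.size)
    return x - np.polyval(np.polyfit(n, x, deg), n)

def burg_ar(x, order):
    """Maximum-entropy AR coefficients via the Burg lattice recursion."""
    f, b = x[1:].astype(float), x[:-1].astype(float)  # fwd/bwd prediction errors
    a = np.array([1.0])                               # AR polynomial, a[0] = 1
    power = np.dot(x, x) / x.size                     # prediction error power
    for _ in range(order):
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        a = np.append(a, 0.0) + k * np.append(0.0, a[::-1])
        power *= 1.0 - k * k
        f, b = (f + k * b)[1:], (b + k * f)[:-1]
    return a, power

def ar_psd(a, power, nfreq=512):
    """Evaluate the AR power spectrum on normalized frequencies [0, 0.5)."""
    w = np.linspace(0.0, 0.5, nfreq, endpoint=False)
    denom = np.abs(np.polyval(a[::-1], np.exp(-2j * np.pi * w))) ** 2
    return w, power / denom

# Synthetic vibration: a 0.1 cycles/sample tone plus linear trend and noise
rng = np.random.default_rng(0)
t = np.arange(2048)
x = np.sin(2 * np.pi * 0.1 * t) + 0.002 * t + 0.3 * rng.standard_normal(t.size)
w, psd = ar_psd(*burg_ar(detrend(x), order=8))
print("spectral peak near", w[np.argmax(psd)])  # expect about 0.1
```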

  20. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  1. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  2. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  3. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressured, static, natural and reproducible smile. The record should then be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After loading records into the software, the operator marks the points and lines displayed in the system's guide and defines the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress.

  4. Application of Formal Methods in Software Engineering

    Directory of Open Access Journals (Sweden)

    Adriana Morales

    2011-12-01

    The purpose of this research work is to examine: (1) why formal methods are necessary for software systems today; (2) high-integrity systems achieved through the Correctness-by-Construction (C-by-C) methodology; and (3) an affordable methodology for applying formal methods in software engineering. The research process included reviews of the literature on the Internet and of publications and presentations at events. Among the results, the research found that: (1) the dependence of nations, companies and people on software systems is increasing; (2) there is growing demand on software engineering to increase social trust in software systems; (3) methodologies exist, such as C-by-C, that can provide that level of trust; (4) formal methods constitute a principle of computer science that can be applied in software engineering to make software development reliable; (5) software users have the responsibility to demand reliable software products; and (6) software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1) more research is needed to identify and analyze other methodologies and tools that provide processes for applying formal software engineering methods; (2) formal methods provide an unprecedented ability to increase trust in the correctness of software products; and (3) through the development of new methodologies and tools, costs are no longer a disadvantage for the application of formal methods.

  5. Mining software specifications methodologies and applications

    CERN Document Server

    Lo, David

    2011-01-01

    An emerging topic in software engineering and data mining, specification mining tackles software maintenance and reliability issues that cost economies billions of dollars each year. The first unified reference on the subject, Mining Software Specifications: Methodologies and Applications describes recent approaches for mining specifications of software systems. Experts in the field illustrate how to apply state-of-the-art data mining and machine learning techniques to address software engineering concerns. In the first set of chapters, the book introduces a number of studies on mining finite
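
    A toy version of specification mining, inferring a temporal rule such as "open is eventually followed by close" from execution traces, might look like the sketch below; the traces and the confidence measure are invented for illustration.

```python
traces = [
    ["open", "read", "close"],
    ["open", "write", "close"],
    ["open", "read"],          # a buggy trace: close is missing
]

def mine_follows(traces, a, b):
    """Confidence that event b eventually follows event a within a trace."""
    holds = total = 0
    for trace in traces:
        if a in trace:
            total += 1
            if b in trace[trace.index(a) + 1:]:
                holds += 1
    return holds / total if total else 0.0

conf = mine_follows(traces, "open", "close")
print(f"open -> eventually close: confidence {conf:.2f}")  # 0.67
```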

  6. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  7. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, describes the overall architecture of the application, and explains the basic principles of construction and operation of the main modules. Furthermore, a description of the developed graph storage, oriented to the storage and processing of large-scale graphs, is presented. The developed community-detection algorithm and the implemented automatic graph-layout algorithms are the main functionality of the product. The main advantage of the developed software is its high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high performance.
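
    Community detection of the kind the product implements can be illustrated with the simple label-propagation algorithm below (the abstract does not disclose the actual algorithm used); the toy graph is invented.

```python
import random

def label_propagation(adj, seed=0, rounds=20):
    """adj: dict node -> list of neighbours. Returns node -> community label."""
    rng = random.Random(seed)
    labels = {n: n for n in adj}    # start: every node is its own community
    nodes = list(adj)
    for _ in range(rounds):
        rng.shuffle(nodes)
        for n in nodes:
            if not adj[n]:
                continue
            # adopt the most frequent label among the neighbours
            counts = {}
            for m in adj[n]:
                counts[labels[m]] = counts.get(labels[m], 0) + 1
            labels[n] = max(counts, key=counts.get)
    return labels

# Two triangles joined by a single bridge edge
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
       3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(label_propagation(adj))  # typically two labels, one per triangle
```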

  8. Statistical Software and Artificial Intelligence: A Watershed in Applications Programming.

    Science.gov (United States)

    Pickett, John C.

    1984-01-01

    AUTOBJ and AUTOBOX are revolutionary software programs which contain the first application of artificial intelligence to statistical procedures used in analysis of time series data. The artificial intelligence included in the programs and program features are discussed. (JN)

  9. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for processing data from radiotracer experiments
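
    The basic quantity extracted from a stimulus-response tracer experiment, the mean residence time, is the first moment of the normalized RTD curve. A minimal sketch, assuming an impulse injection and a uniformly sampled synthetic outlet concentration curve:

```python
import numpy as np

t = np.linspace(0.0, 60.0, 601)            # time, s
dt = t[1] - t[0]
c = (t / 12.0) * np.exp(-t / 12.0)         # synthetic outlet tracer curve

e = c / (c.sum() * dt)                     # normalize to the E(t) curve
mrt = (t * e).sum() * dt                   # first moment: mean residence time
var = ((t - mrt) ** 2 * e).sum() * dt      # second central moment (spread)
print(f"mean residence time = {mrt:.1f} s, variance = {var:.1f} s^2")
```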

  10. Firing Room Remote Application Software Development

    Science.gov (United States)

    Liu, Kan

    2015-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and check out LACC software for Launch Equipment Test Facility (LETF) testing.

  11. Software reliability for safety-critical applications

    International Nuclear Information System (INIS)

    Everett, B.; Musa, J.

    1994-01-01

    In this talk, the authors address the question "Can Software Reliability Engineering measurement and modeling techniques be applied to safety-critical applications?" Quantitative techniques have long been applied in engineering hardware components of safety-critical applications. The authors have seen a growing acceptance and use of quantitative techniques in engineering software systems but a continuing reluctance to use such techniques in safety-critical applications. The general case posed against using quantitative techniques for software components runs along the following lines: safety-critical applications should be engineered such that catastrophic failures occur less frequently than one in a billion hours of operation; current software measurement/modeling techniques rely on using failure history data collected during testing; one would have to accumulate over a billion operational hours to verify a failure rate objective of about one per billion hours
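
    The arithmetic behind that objection is worth making explicit. Under a standard zero-failure demonstration argument (an illustration, not taken from the talk itself), the failure-free test time needed to demonstrate a failure rate lambda at confidence C is T = -ln(1 - C) / lambda:

```python
import math

def zero_failure_test_hours(target_rate, confidence):
    """Hours of failure-free operation needed to demonstrate target_rate."""
    return -math.log(1.0 - confidence) / target_rate

lam = 1e-9   # objective: one failure per billion hours
hours = zero_failure_test_hours(lam, 0.95)
print(f"{hours:.2e} hours")                      # about 3.0e+09 hours
print(f"roughly {hours / 8766:.0f} years of continuous testing")
```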

  12. QInfer: Statistical inference software for quantum applications

    Directory of Open Access Journals (Sweden)

    Christopher Granade

    2017-04-01

    Characterizing quantum systems through experimental data is critical to applications as diverse as metrology and quantum computing. Analyzing this experimental data in a robust and reproducible manner is made challenging, however, by the lack of readily-available software for performing principled statistical analysis. We improve the robustness and reproducibility of characterization by introducing an open-source library, QInfer, to address this need. Our library makes it easy to analyze data from tomography, randomized benchmarking, and Hamiltonian learning experiments either in post-processing, or online as data is acquired. QInfer also provides functionality for predicting the performance of proposed experimental protocols from simulated runs. By delivering easy-to-use characterization tools based on principled statistical analysis, QInfer helps address many outstanding challenges facing quantum technology.

  13. Utility application of simulation software

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1986-01-01

    The purpose of this paper is to discuss dynamic system simulation from the perspective of a successful utility user. In it, four aspects of the issue of utility use of simulation will be addressed: (1) What simulation software is available to utilities which can be of practical assistance with a modest investment in staff and training. (2) To what specific problems can utilities apply the technique of simulation and achieve reasonably cost effective results. (3) What the advantages are of in-house dynamic simulation capability, as opposed to depending on NSSS vendors or consultants. (4) What the prospects are for wider use of dynamic simulation in the utility industry

  14. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.

  15. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    Science.gov (United States)

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with the uneven illumination and noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model
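
    The measurement step downstream of segmentation can be sketched with the scikit-image library; the synthetic elliptical mask stands in for a segmented spheroid, and the L*W^2/2 volume approximation used here is one common convention for tumor volume, not necessarily the published tool's exact formula.

```python
import numpy as np
from skimage import measure

# Synthetic binary mask: an ellipse with semi-axes of 40 and 25 pixels
yy, xx = np.mgrid[0:200, 0:200]
mask = ((xx - 100) / 40.0) ** 2 + ((yy - 100) / 25.0) ** 2 <= 1.0

label_img = measure.label(mask.astype(int))
region = max(measure.regionprops(label_img), key=lambda r: r.area)

length = region.major_axis_length    # pixels
width = region.minor_axis_length
volume = 0.5 * length * width ** 2   # common L*W^2/2 tumor-volume estimate
print(f"L = {length:.1f} px, W = {width:.1f} px, V ~ {volume:.0f} px^3")
```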

  16. Standardization of software application development and governance

    OpenAIRE

    Labbe, Peter P.

    2015-01-01

    Approved for public release; distribution is unlimited A number of Defense Department initiatives focus on how to engineer better systems that directly influence software architecture, including Open Architecture, Enterprise Architecture, and Joint Information Enterprise. Additionally, the Department of Defense (DOD) mandates moving applications to consolidated datacenters and cloud computing. When examined from an application development perspective, the DOD lacks a common approach for in...

  17. Linear algebra applications using Matlab software

    Directory of Open Access Journals (Sweden)

    Cornelia Victoria Anghel

    2005-10-01

    The paper presents two ways of generating special matrices using functions included in the MatLab software package. The MatLab software package contains a set of functions that generate special matrices used in linear algebra applications and in signal processing across different fields of activity. The paper presents two types of special matrices that can be generated by syntaxes written in the MatLab command window, validated by pressing the Enter key. The applications presented in the paper are examples of numerical calculus using the MatLab software and belong to the scientific field "Computer Assisted Mathematics", thus creating a symbiosis between mathematics and informatics.

  18. Interim report on the development and application of environmental mapped data digitization, encoding, analysis, and display software for the ALICE system. Volume II. [MAP, CHAIN, FIX, and DOUT, in FORTRAN IV for PDP-10]

    Energy Technology Data Exchange (ETDEWEB)

    Amiot, L.W.; Lima, R.J.; Scholbrock, S.D.; Shelman, C.B.; Wehman, R.H.

    1979-06-01

    Volume I of An Interim Report on the Development and Application of Environmental Mapped Data Digitization, Encoding, Analysis, and Display Software for the ALICE System provided an overall description of the software developed for the ALICE System and presented an example of its application. The scope of the information presented in Volume I was directed both to the users and developers of digitization, encoding, analysis, and display software. Volume II presents information which is directly related to the actual computer code and operational characteristics (keys and subroutines) of the software. Volume II will be of more interest to developers of software than to users of the software. However, developers of software should be aware that the code developed for the ALICE System operates in an environment where much of the peripheral hardware to the PDP-10 is ANL/AMD built. For this reason, portions of the code may have to be modified for implementation on other computer system configurations. 11 tables.

  19. High Performance Computing Software Applications for Space Situational Awareness

    Science.gov (United States)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative de-convolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order of magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  20. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  1. Technical note on the validation of a semi-automated image analysis software application for estrogen and progesterone receptor detection in breast cancer

    Science.gov (United States)

    2011-01-01

    Background The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole slide digitalization supported by dedicated software tools allows quantization of the image objects (e.g. cell membrane, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens provided strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. Methods The effectiveness of two connected semi-automated image analysis applications (the NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14) for determination of ER and PR status in formalin-fixed paraffin embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First the detection algorithm was calibrated to the scores provided by an independent assessor (pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean number of 195 cells. Each cell was manually marked and scored according to the Allred system, which combines frequency and intensity scores. The performance of the calibrated algorithm was tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. Results The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986), up from slight or moderate agreement at the start of the study with the un-calibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa provided almost perfect agreement (κ = 0.981) among the two
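
    The agreement statistic reported here can be computed with scikit-learn's implementation of Cohen's kappa; the two score vectors below are invented stand-ins for the pathologist's and the algorithm's Allred Total Scores, not data from the study.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical Allred Total Scores (0-8) for 12 regions of interest
pathologist = [0, 2, 3, 4, 5, 6, 7, 8, 8, 7, 3, 5]
algorithm   = [0, 2, 3, 5, 5, 6, 7, 8, 8, 6, 3, 5]

kappa_q = cohen_kappa_score(pathologist, algorithm, weights="quadratic")
print(f"quadratic weighted kappa = {kappa_q:.3f}")
```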

  2. 4th International Conference in Software Engineering for Defence Applications

    CERN Document Server

    Sillitti, Alberto; Succi, Giancarlo; Messina, Angelo

    2016-01-01

    This book presents high-quality original contributions on new software engineering models, approaches, methods, and tools and their evaluation in the context of defence and security applications. In addition, important business and economic aspects are discussed, with a particular focus on cost/benefit analysis, new business models, organizational evolution, and business intelligence systems. The contents are based on presentations delivered at SEDA 2015, the 4th International Conference in Software Engineering for Defence Applications, which was held in Rome, Italy, in May 2015. This conference series represents a targeted response to the growing need for research that reports and debates the practical implications of software engineering within the defence environment and also for software performance evaluation in real settings through controlled experiments as well as case and field studies. The book will appeal to all with an interest in modeling, managing, and implementing defence-related software devel...

  3. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given without any recommendation for a particular software or method for gamma ray spectra analysis
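
    A representative building block of the packages compared, fitting a Gaussian photopeak on a linear background, can be sketched with SciPy; the spectrum below is synthetic and the model is a generic one, not a method from the exercise itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak_model(ch, amp, mu, sigma, b0, b1):
    """Gaussian photopeak sitting on a linear background."""
    return amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2) + b0 + b1 * ch

ch = np.arange(400, 500, dtype=float)                 # channel numbers
rng = np.random.default_rng(1)
counts = peak_model(ch, 900.0, 450.0, 3.0, 50.0, -0.02) + rng.normal(0, 8, ch.size)

p0 = [counts.max(), ch[np.argmax(counts)], 2.0, counts.min(), 0.0]
popt, _ = curve_fit(peak_model, ch, counts, p0=p0)
area = popt[0] * abs(popt[2]) * np.sqrt(2.0 * np.pi)  # net peak area
print(f"centroid = {popt[1]:.2f} ch, FWHM = {2.355 * abs(popt[2]):.2f} ch, "
      f"area = {area:.0f} counts")
```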

  4. Software engineering with application-specific languages

    Science.gov (United States)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.

  5. A Software Phantom : Application in Digital Tomosynthesis

    Energy Technology Data Exchange (ETDEWEB)

    Lazos, D; Kolitsi, Z; Badea, C; Pallikarakis, N [Medical Physics Laboratory, School of Medicine, University of Patras (Greece)

    1999-12-31

    A software phantom intended to be used in radiographic applications has been developed. The application was used for research in the field of Digital Tomosynthesis and specifically for studying tomographic noise removal methods. The application consists of a phantom design and a phantom imaging module. The radiation-matter interaction is based on the exponential relation of attenuation. Projections are formed by simulated irradiation with selectable geometrical parameters, source spectrum and detector response. Phantoms are defined either as sets containing certain geometrical objects or as groups of voxels. Comparison with real projections taken from a physical phantom with geometry and composition identical to the simulated one showed good approximation, with improved contrast due to the absence of scatter in the simulated projections. The software phantom proved to be a very useful tool for DTS investigations. Further development to include scatter is expected to expand the use of the application to more areas in radiological imaging research. (author) 4 refs., 3 figs.

  6. A Software Phantom : Application in Digital Tomosynthesis

    International Nuclear Information System (INIS)

    Lazos, D.; Kolitsi, Z.; Badea, C.; Pallikarakis, N.

    1998-01-01

    A software phantom intended to be used in radiographic applications has been developed. The application was used for research in the field of Digital Tomosynthesis and specifically for studying tomographic noise removal methods. The application consists of a phantom design and a phantom imaging module. The radiation-matter interaction is based on the exponential relation of attenuation. Projections are formed by simulated irradiation with selectable geometrical parameters, source spectrum and detector response. Phantoms are defined either as sets containing certain geometrical objects or as groups of voxels. Comparison with real projections taken from a physical phantom with geometry and composition identical to the simulated one showed good approximation, with improved contrast due to the absence of scatter in the simulated projections. The software phantom proved to be a very useful tool for DTS investigations. Further development to include scatter is expected to expand the use of the application to more areas in radiological imaging research. (author)
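
    The projection step these abstracts describe, exponential attenuation along simulated rays without scatter, reduces to a line integral of attenuation coefficients. A minimal sketch over a voxel phantom (the geometry and attenuation values are invented):

```python
import numpy as np

# 2-D voxel phantom: a uniform background with a denser insert (mu in 1/cm)
mu = np.full((64, 64), 0.02)
mu[24:40, 24:40] = 0.15
voxel_cm = 0.1

# Parallel rays along axis 0: the line integral is a column sum of mu * path
line_integrals = mu.sum(axis=0) * voxel_cm
i0 = 1000.0                                 # unattenuated photons per ray
projection = i0 * np.exp(-line_integrals)   # Beer-Lambert law, no scatter
print(projection[:5].round(1), projection[30].round(1))
```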

  7. Remote Software Application and Display Development

    Science.gov (United States)

    Sanders, Brandon T.

    2014-01-01

    The era of the shuttle program has come to an end, but only to give rise to newer and more exciting projects. Now is the time of the Orion spacecraft, a work of art designed to exceed all previous endeavors of man. NASA is exiting the time of exploration and is entering a new period, a period of pioneering. With this new mission, many of NASA's organizations must undergo a great deal of change and development to support the Orion missions. The Spaceport Command and Control System (SCCS) is the new system that will provide NASA the ability to launch rockets into orbit and thus control Orion and other spacecraft as the goal of populating Mars becomes ever more tangible. Since the previous control system, the Launch Processing System (LPS), was primarily designed to launch the shuttles, SCCS was needed as Kennedy Space Center (KSC) reorganized into a multi-user spaceport for commercial flights, providing more versatile control over rockets. Within SCCS is the Launch Control System (LCS), which is the remote software behind the command and monitoring of flight and ground system hardware. This internship at KSC has involved two main components in LCS, including Remote Software Application and Display development. The display environment provides a graphical user interface for an operator to view and see if any cautions are raised, while the remote applications are the backbone that communicate with hardware and then relay the data back to the displays. These elements go hand in hand as they provide monitoring and control over hardware and software alike from the safety of the Launch Control Center. The remote software applications are written in Application Control Language (ACL), which must undergo unit testing to ensure data integrity. This paper describes both the implementation and writing of unit tests in ACL code for remote software applications, as well as the building of remote displays to be used in the Launch Control Center (LCC).

  8. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  9. Software Defined Radio: Basic Principles and Applications

    Directory of Open Access Journals (Sweden)

    José Raúl Machado-Fernández

    2014-12-01

    The author reviews SDR (Software Defined Radio) technology, including hardware schemes and application fields. A low-performance device is presented and several tests are executed with it using free software. Based on the experience acquired, SDR employment opportunities are identified for low-cost solutions that can solve significant problems. In addition, a list of the most important frameworks related to the technology developed in recent years is offered, with a recommendation to use three of them.

  10. Application of software engineering to development of reactor safety codes

    International Nuclear Information System (INIS)

    Wilburn, N.P.; Niccoli, L.G.

    1981-01-01

    Software Engineering, which is a systematic methodology by which a large scale software development project is partitioned into manageable pieces, has been applied to the development of LMFBR safety codes. The techniques have been applied extensively in the business and aerospace communities and have provided an answer to the drastically increasing cost of developing and maintaining software. The five phases of software engineering (Survey, Analysis, Design, Implementation, and Testing) were applied in turn to development of these codes, along with Walkthroughs (peer review) at each stage. The application of these techniques has resulted in superior software which is well documented, thoroughly tested, easy to modify, easier to use and maintain. The development projects have resulted in lower overall cost. (orig.) [de]

  11. Software defined networking applications in distributed datacenters

    CERN Document Server

    Qi, Heng

    2016-01-01

    This SpringerBrief provides essential insights on the SDN application designing and deployment in distributed datacenters. In this book, three key problems are discussed: SDN application designing, SDN deployment and SDN management. This book demonstrates how to design the SDN-based request allocation application in distributed datacenters. It also presents solutions for SDN controller placement to deploy SDN in distributed datacenters. Finally, an SDN management system is proposed to guarantee the performance of datacenter networks which are covered and controlled by many heterogeneous controllers. Researchers and practitioners alike will find this book a valuable resource for further study on Software Defined Networking.

  12. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch diskettes for setup and a user's manual. GDA software can be installed for use on a personal computer running the Windows 95 or Windows NT 4.0 operating system. The computer may be of the 80486 CPU type with 8 megabytes of memory.

  13. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  14. A software application for the processing of students results | Ukem ...

    African Journals Online (AJOL)

    A software application for the processing of students results. ... In this work, a computer software application was developed to facilitate the automated processing of the results. ...

  15. Application of Surpac and Whittle Software in Open Pit Optimisation ...

    African Journals Online (AJOL)

    Application of Surpac and Whittle Software in Open Pit Optimisation and Design. ... This paper studies the Surpac and Whittle software and their application in designing an optimised pit. ...

  16. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general-purpose fatigue analysis software package for predicting the fatigue lives of mechanical components and structures was developed. The software has several characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system that assists users in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the developed software, fatigue analyses for an SAE keyhole specimen and an automobile knuckle were carried out. It was observed that the results agreed well with those from commercial packages
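
    The life-prediction core of such a tool can be illustrated with a Basquin S-N curve and Palmgren-Miner damage accumulation; the material constants and load spectrum below are invented, not taken from the paper's database.

```python
def cycles_to_failure(stress_amp, sf=1000.0, b=-0.085):
    """Basquin S-N curve, stress_amp = sf * (2N)**b, solved for N.

    sf and b are illustrative constants, not real material data.
    """
    return 0.5 * (stress_amp / sf) ** (1.0 / b)

# Load spectrum: (stress amplitude in MPa, applied cycles) per loading block
blocks = [(400.0, 1.0e4), (300.0, 5.0e4), (200.0, 2.0e5)]

damage = sum(n / cycles_to_failure(s) for s, n in blocks)  # Palmgren-Miner sum
print(f"damage per block = {damage:.2f}; predicted life = {1.0/damage:.1f} blocks")
```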

  17. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem: there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  18. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  19. Application software for new BEPC interlock system

    International Nuclear Information System (INIS)

    Tang Shuming; Na Xiangyin; Chen Jiansong; Yu Yulan

    1997-01-01

    A new BEPC (Beijing Electron Positron Collider) interlock system has been built in order to improve the reliability of personnel safety and interlock functions. Moreover, the system updates the BEPC operation messages once every 6 seconds and displays them on TV screens at the major entrances. Since March of 1996, the new BEPC interlock system has been operating reliably. The hardware of the system is based on Programmable Logic Controllers (PLC). A multimedia IBM/PC-586, as the host computer of the PLCs, monitors the PLC system via serial port COM2. The PC communicates with the central computer VAX-4500 of the BEPC control system and receives operating messages of the accelerator through serial port COM3. The application software on the host computer has been developed. Visual C++ for MS-Windows 3.2 TM was selected as the workbench. It provides nice tools for building programs, such as APP STUDIO, CLASS WIZARD, APP WIZARD and a debugger. The author describes the design idea and the structure of the application software. Error tolerance is taken into consideration. The author also presents a small database and its data structure for the application

  20. Designing SCADA application software a practical approach

    CERN Document Server

    McCrady, Stuart G

    2013-01-01

    Automation systems, often referred to as SCADA systems, involve programming at several levels; these systems include computer-type field controllers that monitor and control plant equipment such as conveyor systems and pumps, and user workstations that allow the user to monitor and control the equipment through color graphic displays. All of the components of these systems are integrated through a network, such as Ethernet, for fast communications. This book provides a practical guide to developing the application software for all aspects of the automation system, from the field controll

  1. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective for identification, there remains a non-trivial sociological challenge to resolve defects in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  2. Static and Dynamic Verification of Critical Software for Space Applications

    Science.gov (United States)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long lifetime of operation and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field with regard to its application to the increasing number of software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: The SoftCare tool for the SFMEA and SFTA
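
    The SoftCare tool itself is not shown in the abstract. A minimal sketch of the dynamic half of the methodology, injecting a stuck-at fault at a chosen interface and observing whether the consumer detects the failure, might look like this (all names and the fault model are illustrative assumptions):

```python
import random

def sensor_reading():
    """Nominal behaviour of a hypothetical onboard sensor driver."""
    return 42.0

def inject_stuck_at(func, stuck_value, probability):
    """Wrap func so it sometimes returns a corrupted value: a simple
    stuck-at fault model, one of many used in fault injection."""
    def faulty(*args, **kwargs):
        if random.random() < probability:
            return stuck_value          # fault activated
        return func(*args, **kwargs)    # nominal path
    return faulty

def control_loop(read):
    """Toy consumer: flags obviously out-of-range readings."""
    value = read()
    return "SAFE" if 0.0 <= value <= 100.0 else "FAILURE DETECTED"

random.seed(1)
faulty_read = inject_stuck_at(sensor_reading, stuck_value=-999.0, probability=0.3)
print([control_loop(faulty_read) for _ in range(10)])  # mix of both outcomes
```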

  3. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, was developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and the related database management, etc. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  4. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, which is suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all the single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)
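
    The FMEA worksheets themselves are not reproduced in the abstract. A toy sketch of the usual FMEA bookkeeping, with severity, occurrence and detection scores combined into a risk priority number (RPN) for ranking, is shown below; all entries are invented:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str
    mode: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (remote) .. 10 (frequent)
    detection: int   # 1 (almost certainly detected) .. 10 (undetectable)
    mitigation: str

    @property
    def rpn(self) -> int:
        """Risk priority number, the classic FMEA ranking metric."""
        return self.severity * self.occurrence * self.detection

worksheet = [
    FailureMode("network switch", "frame loss under load", 7, 3, 4, "redundant channel"),
    FailureMode("I/O module", "stuck output", 8, 2, 5, "redundant division"),
    FailureMode("display unit", "stale data shown", 5, 4, 3, "operator manual action"),
]

for fm in sorted(worksheet, key=lambda f: f.rpn, reverse=True):
    print(f"{fm.component:15s} {fm.mode:22s} RPN={fm.rpn:3d} -> {fm.mitigation}")
```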

  5. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  6. Application of a statistical software package for analysis of large patient dose data sets obtained from RIS

    International Nuclear Information System (INIS)

    Fazakerley, J.; Charnock, P.; Wilde, R.; Jones, R.; Ward, M.

    2010-01-01

    For the purpose of patient dose audit, clinical audit and radiology workload analysis, data from Radiology Information Systems (RIS) at many hospitals were collected using a database, and the analysis was automated using a statistical package and Visual Basic coding. The database is a Structured Query Language database, which can be queried using an off-the-shelf statistical package, Statistica. Macros were created to automatically convert the data to a format consistent between different hospitals, ready for analysis. These macros can also be used to automate further analysis, such as detailing mean kV, mAs and entrance surface dose per room and per gender. Standard deviation and standard error of the mean are also generated. Graphs can also be generated to illustrate the trends in doses between different variables such as room and gender. Collectively, this information can be used to generate a report. A process that once could take up to 1 d to complete now takes around 1 h. A major benefit in providing the service to hospital trusts is that less resource is now required to report on RIS data, making the possibility of continuous dose audit more likely. Time that was spent on sorting through data can now be spent on improving the analysis to provide benefit to the customer. Using data sets from RIS is a good way to perform dose audits, as the huge amount of data available provides the basis for very accurate analysis. Using macros written in Statistica Visual Basic has helped sort and consistently analyse these data. Being able to analyse by exposure factors has provided a more detailed report to the customer. (authors)
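
    The authors' Statistica Visual Basic macros are not shown. An equivalent of the per-room, per-gender summary described above, written as a pandas sketch with hypothetical column names and values, could be:

```python
import pandas as pd

# Hypothetical extract of RIS exposure records.
df = pd.DataFrame({
    "room":    ["R1", "R1", "R2", "R2", "R2"],
    "gender":  ["F",  "M",  "F",  "M",  "F"],
    "kV":      [70,   75,   81,   80,   79],
    "mAs":     [10.0, 12.5, 16.0, 14.0, 15.5],
    "ESD_mGy": [0.9,  1.1,  1.6,  1.4,  1.5],   # entrance surface dose
})

# Mean, standard deviation and standard error of the mean per room and gender.
summary = (df.groupby(["room", "gender"])[["kV", "mAs", "ESD_mGy"]]
             .agg(["mean", "std", "sem"]))
print(summary)
```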

  7. CernVM - a virtual software appliance for LHC applications

    International Nuclear Information System (INIS)

    Buncic, P; Sanchez, C Aguado; Blomer, J; Franco, L; Mato, P; Harutyunian, A; Yao, Y

    2010-01-01

    CernVM is a Virtual Software Appliance capable of running physics applications from the LHC experiments at CERN. It aims to provide a complete and portable environment for developing and running LHC data analysis on any end-user computer (laptop, desktop) as well as on the Grid, independently of Operating System platforms (Linux, Windows, MacOS). The experiment application software and its specific dependencies are built independently from CernVM and delivered to the appliance just in time by means of a CernVM File System (CVMFS) specifically designed for efficient software distribution. The procedures for building, installing and validating software releases remain under the control and responsibility of each user community. We provide a mechanism to publish pre-built and configured experiment software releases to a central distribution point from where they find their way to the running CernVM instances via a hierarchy of proxy servers or content delivery networks. In this paper, we present the current state of the CernVM project, compare the performance of CVMFS to that of traditional network file systems like AFS, and discuss possible scenarios that could further improve its performance and scalability.

  8. A Software Application to Detect Dental Color

    Directory of Open Access Journals (Sweden)

    Dan SÎMPĂLEAN

    2015-09-01

    Choosing dental color for missing teeth or tooth reconstruction is an important step, and it usually raises difficulties for dentists due to a significant number of subjective factors that can influence the color selection. Dental reconstruction presumes the combination of dentistry and chromatics, thus implying important challenges. Purpose: The aim of this study was to develop and implement a software application for detecting dental color, to come to the aid of dentists and largely to remove the inherent subjectiveness of human vision. Basic Methods: The implemented application was named Color Detection and the application's source code is written in the C++ language. During application development, the wxWidgets 2.8 library was used for creating the GUI (graphical user interface). Results: The application displays the average color of the selected area of interest, the reference color from the key collection existing in the program, and also the degree of similarity between the original (the selected area of interest) and the nearest reference key. This degree of similarity is expressed as a percentage. Conclusions: The Color Detection program, by eliminating the subjectivity inherent to human sight, can help the dentist select an appropriate dental color with precision.
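
    The application's C++/wxWidgets source is not given. The core computation it describes, averaging the selected area of interest and reporting the nearest reference key with a percentage similarity, can be sketched in Python; the shade keys, the distance metric and the similarity normalisation below are assumptions:

```python
import numpy as np

# Hypothetical reference key collection (mean RGB per shade).
references = {"A1": (246, 235, 210), "A2": (235, 218, 186), "A3": (222, 200, 165)}

def detect_shade(roi: np.ndarray):
    """roi: HxWx3 RGB array of the selected area of interest."""
    avg = roi.reshape(-1, 3).mean(axis=0)
    # Nearest reference shade by Euclidean distance in RGB space.
    best = min(references, key=lambda k: np.linalg.norm(avg - references[k]))
    dist = np.linalg.norm(avg - references[best])
    similarity = 100.0 * (1.0 - dist / np.linalg.norm([255, 255, 255]))
    return avg, best, similarity

roi = np.full((20, 20, 3), (233, 216, 184), dtype=float)  # synthetic tooth patch
avg, shade, sim = detect_shade(roi)
print(f"average colour {avg.round(1)} -> nearest key {shade} ({sim:.1f}% similar)")
```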

  9. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    Science.gov (United States)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, in its Les Mureaux premises, is responsible for the onboard SW of the French M51 nuclear deterrent missile. It has also developed over 1 million lines of code, mostly in Ada, for the onboard SW of the Automated Transfer Vehicle (ATV) and the flight control SW of the ARIANE 5 launcher which put it into orbit. As part of the ATV SW, ASTRIUM ST has developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed with the highest quality and reliability level, specific development tools have been designed to cover the steps of source code verification, automated validation testing and complete target instruction coverage verification. Three such dedicated tools are presented here.

  10. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of the STA aims is to promote the exchange of technical ideas, and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft with the included orbital propagators. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  11. Development of data acquisition and analysis software for multichannel detectors

    International Nuclear Information System (INIS)

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of outstanding graphics capabilities, an easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and the efficiency of data analysis. 2 refs., 6 figs

  12. Introduction to methodology of dose-response meta-analysis for binary outcome: With application on software.

    Science.gov (United States)

    Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang

    2018-05-01

    Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA. We summarize the commonly used regression models and pooling methods in DRMA, and we use an example to illustrate how to perform a DRMA with these methods. Five regression models, namely linear regression, piecewise regression, natural polynomial regression, fractional polynomial regression, and restricted cubic spline regression, are illustrated in this article to fit the dose-response relationship. Two types of pooling approaches, that is, the one-stage approach and the two-stage approach, are illustrated to pool the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome. However, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
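
    As a concrete illustration of the two-stage approach mentioned above: given per-study linear slopes (log relative risk per unit dose) and their standard errors from stage one, stage two pools them by inverse-variance weighting. All numbers below are synthetic:

```python
import numpy as np

# Stage 1 output: linear dose-response slopes (log RR per unit dose) and
# standard errors estimated within each of three synthetic studies.
slopes = np.array([0.020, 0.035, 0.028])
se     = np.array([0.008, 0.010, 0.006])

# Stage 2: fixed-effect inverse-variance pooling across studies.
w = 1.0 / se**2
pooled = np.sum(w * slopes) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"pooled slope: {pooled:.4f} (95% CI {lo:.4f} to {hi:.4f})")
print(f"implied RR for a 10-unit dose increment: {np.exp(10 * pooled):.2f}")
```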

  13. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of HAZOPs based on design documents and a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. Reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  14. Application of image editing software for forensic detection of image ...

    African Journals Online (AJOL)

    Application of image editing software for forensic detection of image. ... The image editing software available today is apt for creating visually compelling and sophisticated fake images, ...

  15. Failure Mode and Effect Analysis of the Application Software of the Safety-critical I and C System in APR1400

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Koheun; Kim, Yong geul; Choi, Woong seok; Sohn, Se do [KEPCO Engineering and Construction, Daejeon (Korea, Republic of)

    2016-10-15

    In APR1400, the computer software hazard analysis is performed by the hazard and operability analysis (HAZOP) method. Meanwhile, HAZOP has its limitations and cannot be considered better than fault tree analysis (FTA) or failure mode and effects analysis (FMEA). HAZOP assumes that the system has been carefully studied, and that all possible hazards, their effects or consequences, and remedies are incorporated in the system. But incorporating every possible event in the design is impossible. In this light, this paper attempts to use the FMEA method for evaluating the risk of safety-critical instrumentation and control (I and C) system software for NPPs, which is more practical than HAZOP. This is possible because software failures are due to systematic faults that cause simultaneous failures in multiple divisions when the triggering event happens. This analysis is applied to the safety-critical system of Shin-Hanul units 1 and 2 NPP, i.e., APR1400. Through SFMEA, the critical software failure modes and tasks that could result in CCF are identified and also evaluated to determine the associated risk level (e.g. high, intermediate or low) based on the failure effect. The biggest benefit of this analysis compared with HAZOP is that it can reveal possible weak points and provide guidance to the V and V team by helping to generate test cases.

  16. Process mining application in software process assessment

    NARCIS (Netherlands)

    Samalikova, J.

    2012-01-01

    Nowadays, our daily life heavily depends on software. Software is everywhere, from appliances in our homes, to safety-critical systems such as medical equipment. The failure of these software-intensive systems results in high financial losses, environmental or property damages, or even loss of life.

  17. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    The paper presents the potentials for the development of software using agile methodologies. Special consideration is devoted to the potentials and advantages of using the Scrum methodology in the development of software, and to the relationship between the implementation of agile methodologies and software development projects.

  18. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I and C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes a reliability evaluation method based on the V and V defect density characteristics of every stage of the software development life cycle, by which the operating reliability level of the software can be predicted before its delivery, helping to improve the reliability of NPP safety-important software. (authors)

  1. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software package for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python, which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition, we will report on some investigations we have done into alternative weighting strategies for VLBI.

  2. Software reliability models for critical applications

    Energy Technology Data Exchange (ETDEWEB)

    Pham, H.; Pham, M.

    1991-12-01

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  3. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  4. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  5. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is however a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object-oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been used for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people get adapted to the system, and for standardizing and exchanging software, while preserving the flexibility that allows for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail

  6. Harmonic Calculation Software for Industrial Applications with Adjustable Speed Drives

    DEFF Research Database (Denmark)

    Asiminoaei, Lucian; Hansen, S.; Blaabjerg, Frede

    2005-01-01

    This paper describes the evaluation of new harmonic calculation software. By using a combination of a pre-stored database and new interpolation techniques, the software can provide the harmonic data on real applications very fast. The harmonic results obtained with this software have acceptable precision even...

  7. Harmonic calculation software for industrial applications with ASDs

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Asiminoaei, Lucian; Hansen, Steffan

    2007-01-01

    This article describes the evaluation of new harmonic calculation software. By using a combination of a pre-stored database and new interpolation techniques, the software can provide the harmonic data on real applications at a very fast speed. The harmonic results obtained with this software have a...
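
    Neither paper reproduces the database or the interpolation scheme. The general idea, estimating harmonic levels at an arbitrary operating point by interpolating between pre-stored simulated points instead of running a new simulation, can be sketched as follows (all values invented):

```python
import numpy as np

# Pre-stored harmonic currents (% of fundamental) for an adjustable speed
# drive, simulated beforehand at a few load levels (hypothetical values).
load_pct = np.array([25, 50, 75, 100])
h5 = np.array([38.0, 34.0, 31.0, 29.0])   # 5th harmonic
h7 = np.array([12.0, 10.5,  9.5,  9.0])   # 7th harmonic

def harmonics_at(load):
    """Fast estimate by linear interpolation in the pre-stored database."""
    return {5: float(np.interp(load, load_pct, h5)),
            7: float(np.interp(load, load_pct, h7))}

print(harmonics_at(60))  # approx {5: 32.8, 7: 10.1}, with no new simulation
```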

  8. Developing software for safety-critical applications

    International Nuclear Information System (INIS)

    Chudleigh, M.

    1989-01-01

    The effective implementation of many safety-critical systems involves microprocessors running software which needs to be of very high integrity. This article describes some of the problems of producing such software and the place of software within the total system. A development strategy is proposed based on three principles: the goal of defect-free development, the use of mathematical formalism, and the use of an independent team for testing. (author)

  9. INFOS: spectrum fitting software for NMR analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Albert A., E-mail: alsi@nmr.phys.chem.ethz.ch [ETH Zürich, Physical Chemistry (Switzerland)

    2017-02-15

    Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.
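
    INFOS itself is MATLAB code and is not reproduced here. The underlying idea, fitting the spectrum with the Fourier transform of a model FID built from the actual acquisition parameters rather than with an analytic Lorentzian or Gaussian, can be sketched in Python; all parameters and data below are synthetic:

```python
import numpy as np
from scipy.optimize import curve_fit

sw, npts = 5000.0, 1024                    # spectral width (Hz), points
t = np.arange(npts) / sw                   # acquisition time grid
freq = np.fft.fftshift(np.fft.fftfreq(npts, 1.0 / sw))

def model_spectrum(f, amp, f0, r2):
    """Real part of the FT of a decaying FID on the experimental time grid."""
    fid = amp * np.exp(2j * np.pi * f0 * t) * np.exp(-r2 * t)
    spec = np.fft.fftshift(np.fft.fft(fid)).real
    return np.interp(f, freq, spec)

# Synthetic "measured" peak plus noise.
rng = np.random.default_rng(0)
data = model_spectrum(freq, 1.0, 350.0, 40.0) + rng.normal(0, 0.3, npts)

popt, _ = curve_fit(model_spectrum, freq, data, p0=[0.8, 340.0, 30.0])
print("amplitude, frequency (Hz), R2 (1/s):", popt.round(2))
```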

  10. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.
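
    The package's array-processor code is hardware-specific and not shown. The central computation, the 2-D spatial autocorrelation of an interrogation subregion whose off-centre peak gives the particle displacement, is cheap to sketch with FFTs via the Wiener-Khinchin theorem (synthetic image, simplified peak search):

```python
import numpy as np

def autocorrelation_2d(subregion: np.ndarray) -> np.ndarray:
    """2-D spatial autocorrelation via FFT (Wiener-Khinchin theorem)."""
    a = subregion - subregion.mean()
    spectrum = np.fft.fft2(a)
    corr = np.fft.ifft2(spectrum * np.conj(spectrum)).real
    return np.fft.fftshift(corr)          # move zero lag to the centre

# Synthetic doubly exposed PIV subregion: a particle and its shifted image.
img = np.zeros((32, 32))
img[10, 12] = img[14, 17] = 1.0           # true displacement (+4, +5)

corr = autocorrelation_2d(img)
centre = np.array(corr.shape) // 2
corr[tuple(centre)] = 0.0                 # suppress the dominant zero-lag peak
peak = np.unravel_index(np.argmax(corr), corr.shape)
print("displacement estimate:", np.array(peak) - centre)  # (4, 5) or its negative
```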

  11. Applications of ATILA FEM software to smart materials case studies in designing devices

    CERN Document Server

    Uchino, Kenji

    2013-01-01

    ATILA Finite Element Method (FEM) software facilitates the modelling and analysis of applications using piezoelectric, magnetostrictive and shape memory materials. It allows entire designs to be constructed, refined and optimized before production begins. Through a range of instructive case studies, Applications of ATILA FEM software to smart materials provides an indispensable guide to the use of this software in the design of effective products. Part one provides an introduction to ATILA FEM software, beginning with an overview of the software code. New capabilities and loss integratio

  12. Computer, Network, Software, and Hardware Engineering with Applications

    CERN Document Server

    Schneidewind, Norman F

    2012-01-01

    There are many books on computers, networks, and software engineering, but none that integrate the three with applications. Integration is important because, increasingly, software dominates the performance, reliability, maintainability, and availability of complex computer systems. Books on software engineering typically portray software as if it exists in a vacuum with no relationship to the wider system. This is wrong, because a system is more than software. It is comprised of people, organizations, processes, hardware, and software. All of these components must be considered in an integr

  13. Application of SPSS Software in Multivariate Analysis of Variance

    Institute of Scientific and Technical Information of China (English)

    龚江; 石培春; 李春燕

    2012-01-01

    Taking a two-factor completely randomised experiment with replication as an example, the detailed process of analysis of variance in SPSS software is described, including data input, analysis of the sources of variation, the analysis of variance results, and significance testing. Finally, precautions for the analysis of variance are discussed, providing a reference for scientific workers performing analysis of variance with SPSS software.
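
    The article works in SPSS; for comparison, the same kind of two-factor completely randomised ANOVA with replication can be run in Python with statsmodels (the factors and data below are invented):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Two factors (A: variety, B: fertiliser), completely randomised, 3 replicates.
data = pd.DataFrame({
    "A": ["a1"] * 6 + ["a2"] * 6,
    "B": (["b1"] * 3 + ["b2"] * 3) * 2,
    "y": [21, 23, 22, 28, 27, 29, 24, 25, 23, 35, 34, 36],
})

# Full model with interaction, then the two-way ANOVA table.
model = ols("y ~ C(A) * C(B)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))   # sums of squares, F and p per source
```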

  14. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, a new software package for automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, results of the computerised analysis were instead compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance we judged satisfactory, in that the results matched the requirements completely, as proved by tests on artificial signals in which all simulated events were detected by the software. Performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, not so satisfactory: sensitivity equal to 93%, positive predictive value equal to 82% and accuracy equal to 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
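
    For reference, the three performance indexes quoted above are simple functions of the true/false detection counts; a sketch with invented counts:

```python
# Performance indexes from detection counts (all numbers invented).
TP, FP, FN, TN = 90, 15, 10, 25   # true/false positives, false/true negatives

sensitivity = TP / (TP + FN)              # share of real events detected
ppv         = TP / (TP + FP)              # share of detections that were real
accuracy    = (TP + TN) / (TP + FP + FN + TN)

print(f"sensitivity {sensitivity:.0%}, PPV {ppv:.0%}, accuracy {accuracy:.0%}")
```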

  15. Problems in Systematic Application of Software Metrics and Possible Solution

    OpenAIRE

    Rakic, Gordana; Budimac, Zoran

    2013-01-01

    Systematic application of software metric techniques can lead to significant improvements in the quality of a final software product. However, there is still an evident lack of wider utilization of software metrics techniques and tools, for many reasons. In this paper we investigate some limitations of contemporary software metrics tools and then propose the construction of a new tool that would solve some of the problems. We describe the promising prototype, its internal structure, and then f...

  16. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site-use, standard-solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled safety analysis codes, recognized for DOE-broad safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: establish SQA criteria for the safety analysis 'toolbox' codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans/procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, test report, a

  17. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  18. Software Energy Footprint Lab (SEFlab). Towards the use of energy efficient software applications

    Energy Technology Data Exchange (ETDEWEB)

    Merkus, B.; Hoekstra, E.; Van den Hoed, R. [Hogeschool van Amsterdam HvA, Amsterdam (Netherlands)

    2012-12-15

    The Software Energy Footprint Lab (SEFlab) allows researchers to give a detailed overview of the energy consumption of the hardware and software with a combination of advanced techniques in a controlled environment. The project comprised four main activities: (1) professionalization of the measurement setup and test protocol; (2) concrete analysis of two software applications as case studies; (3) an (internal) stakeholder analysis; and (4) dissemination of results

  19. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture model". Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big-picture model" improves the control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  1. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  2. Application of software to development of reactor-safety codes

    International Nuclear Information System (INIS)

    Wilburn, N.P.; Niccoli, L.G.

    1980-09-01

    Over the past two-and-a-half decades, the application of new techniques has reduced hardware cost for digital computer systems and increased computational speed by several orders of magnitude. A corresponding cost reduction in business and scientific software development has not occurred. The same situation is seen for software developed to model the thermohydraulic behavior of nuclear systems under hypothetical accident situations. In all cases this is particularly notable when costs over the total software life cycle are considered. A solution to this dilemma for reactor safety code systems has been demonstrated by applying the software engineering techniques which have been developed over the course of the last few years in the aerospace and business communities. These techniques have been applied recently with a great deal of success in four major projects at the Hanford Engineering Development Laboratory (HEDL): 1) a rewrite of a major safety code (MELT); 2) development of a new code system (CONACS) for description of the response of LMFBR containment to hypothetical accidents; and 3) development of two new modules for reactor safety analysis

  3. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increasing number of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  4. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung; Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    Software in the PLCs and FPGAs that are used to develop I and C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests an analysis technique for software-affected hazards and states that software hazard analysis should be performed over the stages of the software life cycle, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430 and is a useful technique for using guide phrases; HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 had been used for Korean nuclear power plant software for PLC development; appropriate guide phrases and analysis processes were selected for efficient application, and these studies identified that NUREG/CR-6430 provides applicable methods for software hazard analysis. We perform software hazard analysis of the FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP using general guide words (GW). We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP-with-general-guide-words approach. It is quite applicable for analyzing the software requirements specification of an FPGA.

  5. PC Software for Artificial Intelligence Applications.

    Science.gov (United States)

    Epp, H; Kalin, M; Miller, D

    1988-05-06

    Our review has emphasized that AI tools are programming languages inspired by some problem-solving paradigm. We want to underscore their status as programming languages; even if an AI tool seems to fit a problem perfectly, its proficient use still requires the training and practice associated with any programming language. The programming manuals for PC-Plus, Smalltalk/V, and Nexpert Object are all tutorial in nature, and the corresponding software packages come with sample applications. We find the manuals to be uniformly good introductions that try to anticipate the problems of a user who is new to the technology. All three vendors offer free technical support by telephone to licensed users. AI tools are sometimes oversold as a way to make programming easy or to avoid it altogether. The truth is that AI tools demand programming, but programming that allows you to concentrate on the essentials of the problem. If we had to implement a diagnostic system, we would look first to a product such as PC-Plus rather than BASIC or C, because PC-Plus is designed specifically for such a problem, whereas these conventional languages are not. If we had to implement a system that required graphical interfaces and could benefit from inheritance, we would look first to an object-oriented system such as Smalltalk/V that provides built-in mechanisms for both. If we had to implement an expert system that called for some mix of AI and conventional techniques, we would look first to a product such as Nexpert Object that integrates various problem-solving technologies. Finally, we might use FORTRAN if we were concerned primarily with programming a well-defined numerical algorithm. AI tools are a valuable complement to traditional languages.

  6. Forestry application of the AHP by use of MPC software

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Rodriguez, F.; Rojo-Alboreca, A.

    2012-07-01

    We present an example of the application of the AHP decision-making approach to forest management, by use of MPC 2.0 software. The example considered is that of a forest services company interested in buying a timber harvester. The company had preselected four different machines as possible alternatives, and established 11 different criteria involved in the decision, grouped into four categories (economic, environmental, social and technical). The decision-making process was undertaken using MPC 2.0 software tools, which enable establishment of criteria on two levels, independent pairwise comparison of criteria (first phase) and of alternatives under each criterion (second phase), repetition of the decision-making process by the same or different users, graphical display of the results on the computer screen, and sensitivity analysis. (Author) 28 refs.
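
    MPC 2.0 itself is not shown. The arithmetic behind each AHP phase, a pairwise comparison matrix whose principal eigenvector gives the priority weights, plus the usual consistency check, can be sketched as follows (judgements invented, and three criteria are used instead of the paper's eleven):

```python
import numpy as np

# Pairwise comparisons of three criteria on Saaty's 1-9 scale (invented):
# economic vs environmental = 3, economic vs technical = 5,
# environmental vs technical = 2.
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 0.5, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                   # normalised priority vector

# Consistency ratio; CR < 0.1 is the usual acceptance threshold.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58                             # random index RI = 0.58 for n = 3
print("weights:", weights.round(3), " CR:", round(cr, 3))
```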

  7. Risk reduction using DDP (Defect Detection and Prevention): Software support and software applications

    Science.gov (United States)

    Feather, M. S.

    2001-01-01

    Risk assessment and mitigation is the focus of the Defect Detection and Prevention (DDP) process, which has been applied to spacecraft technology assessments and planning, both hardware and software. DDP's major elements and their relevance to core requirement engineering concerns are summarized. The accompanying research demonstration illustrates DDP's tool support, and further customizations for application to software.

  8. Operational excellence (six sigma) philosophy: Application to software quality assurance

    Energy Technology Data Exchange (ETDEWEB)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence (six sigma) philosophy applied to software quality assurance. The report outlines the following: the goal of six sigma; six sigma tools; manufacturing vs administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.

  9. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Following the massive adoption of digital instrumentation and control (I and C) systems in nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of the available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination for their own software safety plan. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail and signal/noise ratio. However, their disadvantages are completeness and complexity

  10. Application of neural network and pattern recognition software to the automated analysis of continuous nuclear monitoring of on-load reactors

    Energy Technology Data Exchange (ETDEWEB)

    Howell, J.A.; Eccleston, G.W.; Halbig, J.K.; Klosterbuer, S.F. [Los Alamos National Lab., NM (United States); Larson, T.W. [California Polytechnic State Univ., San Luis Obispo, CA (United States)

    1993-08-01

    Automated analysis using pattern recognition and neural network software can help interpret data, call attention to potential anomalies, and improve safeguards effectiveness. Automated software analysis, based on pattern recognition and neural networks, was applied to data collected from a radiation core discharge monitor system located adjacent to an on-load reactor core. Unattended radiation sensors continuously collect data to monitor on-line refueling operations in the reactor. The huge volume of data collected from a number of radiation channels makes it difficult for a safeguards inspector to review it all, check for consistency among the measurement channels, and find anomalies. Pattern recognition and neural network software can analyze large volumes of data from continuous, unattended measurements, thereby improving and automating the detection of anomalies. The authors developed a prototype pattern recognition program that determines the reactor power level and identifies the times when fuel bundles are pushed through the core during on-line refueling. Neural network models were also developed to predict fuel bundle burnup and to identify the region of the on-load reactor face from which fuel bundles were discharged, based on the radiation signals. In the preliminary data set, which was limited and consisted of four distinct burnup regions, the neural network model correctly predicted the burnup region with an accuracy of 92%.

  11. Development of neutron activation analysis software

    International Nuclear Information System (INIS)

    Wang Liyu

    1987-10-01

    The software for quantitative neutron activation analysis was developed to run under the MS/DOS operating system. The IBM/SPAN programmes include: spectra file transfer from and to a Canberra Series 35 multichannel analyzer, spectrum evaluation routines, calibration subprogrammes, and quantitative analysis. The spectrum analysis programmes include a fitting routine for the separation of multiple lines, which reproduces the peak shape with a combination of Gaussian and exponential terms. The programmes were tested on an IBM/AT-compatible computer. The programmes and their sources are available cost-free for IAEA Technical Cooperation projects. 7 refs, 3 figs
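
    As a rough illustration of the peak-shape model described above - a Gaussian with an added exponential term for the low-energy tail - the following Python sketch fits a single peak region; the model, parameter names and data are illustrative assumptions, not the IBM/SPAN code:

        import numpy as np
        from scipy.optimize import curve_fit

        def peak_shape(x, area, centroid, sigma, tail_amp, tail_len):
            # Gaussian plus a low-energy exponential tail (illustrative model)
            gauss = area / (sigma * np.sqrt(2 * np.pi)) \
                * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
            tail = tail_amp * np.exp((x - centroid) / tail_len) * (x < centroid)
            return gauss + tail

        x = np.arange(480, 520, dtype=float)            # channels around one line
        y = peak_shape(x, 5000, 500, 2.5, 40, 5) + 10   # synthetic counts
        p0 = [y.sum(), x[np.argmax(y)], 2.0, 10.0, 5.0]
        popt, pcov = curve_fit(peak_shape, x, y, p0=p0)
        print("centroid = %.2f, area = %.0f" % (popt[1], popt[0]))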

  12. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton, P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  13. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    Science.gov (United States)

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels - from individual dynamics, to dyadic dynamics, up to global group-level dynamics of groups of arbitrary size. The Appendix in the Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
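
    The core of any RQA variant is the recurrence matrix; for MdRQA it is computed over the multidimensional state itself rather than a delay embedding. A minimal Python sketch of that step (the published implementation is in MATLAB; the radius, norm and data here are assumptions):

        import numpy as np

        def mdrqa_recurrence(data, radius):
            # data: (n_samples, n_dimensions); each row is one time step
            diff = data[:, None, :] - data[None, :, :]
            dist = np.sqrt((diff ** 2).sum(axis=-1))   # Euclidean norm
            rec = dist <= radius
            n = len(data)
            # percent recurrence, excluding the trivial main diagonal
            perc_rec = 100.0 * (rec.sum() - n) / (n * n - n)
            return rec, perc_rec

        group_signal = np.random.randn(200, 3)   # e.g. three group members
        rec, perc = mdrqa_recurrence(group_signal, radius=1.0)
        print("%%REC = %.1f" % perc)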

  14. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs run on an IBM PC/AT host computer under either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that provides simple user-interface and output capabilities in the Windows environment.

  15. k0-NAA implementation and application at IPEN neutron activation laboratory by using the k0-IAEA software: application to geological sample analysis

    International Nuclear Information System (INIS)

    Mariano, Davi Brigatto

    2011-01-01

    The Neutron Activation Analysis Laboratory (LAN-IPEN) has for many years analysed geological samples such as rocks, soils and sediments with the comparative INAA method, for geochemical and environmental research. This study presents the results obtained in the implementation of the k0-standardization method at LAN-IPEN for geological sample analysis, using the k0-IAEA program provided by the International Atomic Energy Agency (IAEA). The thermal-to-epithermal flux ratio f and the shape factor α of the epithermal flux distribution of the IPEN IEA-R1 nuclear reactor were determined for the pneumatic irradiation facility and one selected irradiation position, for short and long irradiations, respectively. These factors were obtained with the 'bare triple-monitor' method using 197Au-96Zr-94Zr. In order to validate the methodology, the geological reference materials basalts JB-1 (GSJ) and BE-N (IWG-GIT), andesite AGV-1 (USGS), granite GS-N (ANRT), SOIL-7 (IAEA) and Buffalo River Sediment (NIST BRS-8704), which represent different geological matrices, were analysed. The concentration results agreed with the assigned values, with bias less than 10% except for Zn in AGV-1 (11.4%) and Mg in GS-N (13.4%). Three different scores were used to evaluate the results: z-score, zeta-score and U-score. The z-score showed that the results can be considered satisfactory, with the acceptance limit exceeded only for Mn in BE-N, Mg, Ce and La in GS-N, Mg in JB-1, and Th and Eu in Buffalo River Sediment. The U-score test showed that all results, except Mg in JB-1, were within the 95% confidence interval. These results indicate excellent possibilities for using this parametric method at LAN-IPEN for geological sample analysis in geochemical and environmental studies. (author)
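
    The scores named in the abstract are standard proficiency-testing statistics; a minimal sketch of the z-score and zeta-score calculations, with illustrative numbers (the exact acceptance limits used in the study are not given above):

        import math

        def z_score(measured, assigned, sigma_target):
            # deviation in units of the target standard deviation
            return (measured - assigned) / sigma_target

        def zeta_score(measured, u_measured, assigned, u_assigned):
            # deviation weighted by the combined standard uncertainties
            return (measured - assigned) / math.sqrt(u_measured ** 2 + u_assigned ** 2)

        z = z_score(92.3, 88.0, 4.0)              # hypothetical Zn result, ppm
        zeta = zeta_score(92.3, 2.5, 88.0, 1.8)
        print("z = %.2f, zeta = %.2f" % (z, zeta))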

  16. Software Process Improvement Journey: IBM Australia Application Management Services

    Science.gov (United States)

    2005-03-01

    Client Relationship Management (CRM) processes (see Section 5.1.2) - specifically, Solution Design and Solution Delivery - Worldwide Project Management ... complex systems life-cycle management, rapid solutions development, custom development, package selection and implementation, maintenance, minor... Carnegie Mellon Software Engineering Institute. Software Process Improvement Journey: IBM Australia Application Management Services. Robyn Nichols

  17. 36 CFR 1194.21 - Software applications and operating systems.

    Science.gov (United States)

    2010-07-01

    ... operating systems. 1194.21 Section 1194.21 Parks, Forests, and Public Property ARCHITECTURAL AND... Standards § 1194.21 Software applications and operating systems. (a) When software is designed to run on a... shall not disrupt or disable activated features of any operating system that are identified as...

  18. Robot modelling; Control and applications with software

    Energy Technology Data Exchange (ETDEWEB)

    Ranky, P G; Ho, C Y

    1985-01-01

    This book provides a 'picture' of robotics, covering both the theoretical aspects of modeling and the practical and design aspects of: robot programming; robot tooling and automated hand changing; implementation planning; testing; and software design for robot systems. The authors present an introduction to robotics with a systems approach. They describe not only the tasks relating to a single robot (or arm) but also systems of robots working together on a product or several products.

  19. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file from which assigned lines have been removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
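
    The key automated step - assigning experimental features to catalogued transitions - amounts to a nearest-frequency lookup within a tolerance. A hedged Python sketch (the data structures and tolerance are assumptions; SPECdata's actual database queries are not described in the abstract):

        import numpy as np

        def assign_peaks(peak_freqs_mhz, catalog, tolerance_mhz=0.05):
            # catalog: list of (frequency_mhz, species, transition) tuples,
            # e.g. parsed from Pickett-style .cat files (parsing not shown)
            cat_freqs = np.array([c[0] for c in catalog])
            assignments = []
            for f in peak_freqs_mhz:
                i = int(np.argmin(np.abs(cat_freqs - f)))
                if abs(cat_freqs[i] - f) <= tolerance_mhz:
                    assignments.append((f, catalog[i][1], catalog[i][2]))
                else:
                    assignments.append((f, None, None))  # candidate new line
            return assignments

        catalog = [(9098.332, "HC3N", "J=1-0"), (18196.31, "HC3N", "J=2-1")]
        print(assign_peaks([9098.35, 12345.0], catalog))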

  20. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  1. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a huge choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work automatically locate and fit the peaks. This fit can be made in several different ways - the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak - but some programs go far beyond this and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work, several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba and 152Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; it was also possible to verify that the automatic peak-fitting programs performed as well as - and sometimes even better than - manual peak-fitting software. (author)
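
    The two strategies the abstract contrasts - fitting a Gaussian versus integrating the area under the peak - can be compared directly. A minimal sketch with synthetic data (a flat, known background is assumed for simplicity):

        import numpy as np
        from scipy.optimize import curve_fit

        def gauss(x, area, mu, sigma):
            return area / (sigma * np.sqrt(2 * np.pi)) \
                * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

        x = np.arange(640, 680, dtype=float)     # channels around a 137Cs peak
        bkg = 50.0                               # assumed flat background
        y = gauss(x, 20000, 661.7, 3.0) + bkg
        popt, _ = curve_fit(gauss, x, y - bkg, p0=[y.sum(), 662.0, 3.0])

        area_fit = popt[0]                       # area from the fitted Gaussian
        area_sum = (y - bkg).sum()               # plain background-subtracted sum
        print("fit: %.0f  sum: %.0f" % (area_fit, area_sum))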

  2. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on 15 September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also being exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  3. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on 15 September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project are also being exercised to gain access to the data and the resources, providing a user-friendly interface to the physicists submitting the production and analysis jobs. An overview of the status and results of the CSA06 is presented in this work.

  4. An Assessment of the Library Application Software Packages in ...

    African Journals Online (AJOL)

    Journal Home > Vol 7, No 2 (2007) > ... the study examined the adopted software's security, compatibility/capabilities, ... The study found that most application packages available in the Nigerian automation marketplace are effective since they ...

  5. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  6. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices; TOPICAL

    International Nuclear Information System (INIS)

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document

  7. A new paradigm for the development of analysis software

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  8. Comparison of PV system design software packages for urban applications

    Energy Technology Data Exchange (ETDEWEB)

    Gharakhani Siraki, Arbi; Pillay, Pragasen

    2010-09-15

    A large number of software packages are available for solar resource evaluation and PV system design; however, few of them are suitable for urban applications. In this paper a comparison is made between two specifically designed solar tools, Ecotect 2010 and PVsyst 5.05. Conclusions are drawn on the proper use of these packages based on their specifications and capabilities. Moreover, the calculations were repeated with the HOMER software package (a generic tool) for the same location. The results suggest that a generic solar software tool should not be used for an urban application.

  9. Modularity analysis of automotive control software

    OpenAIRE

    Dajsuren, Y.; van den Brand, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality attribute...

  10. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  11. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools in project management in power systems ...

  12. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients

  13. Control system software, simulation, and robotic applications

    Science.gov (United States)

    Frisch, Harold P.

    1991-01-01

    All essential existing capabilities needed to create a man-machine interaction dynamics and performance (MMIDAP) capability are reviewed. The multibody system dynamics software program Order N DISCOS will be used for machine and musculo-skeletal dynamics modeling. The program JACK will be used for estimating and animating whole body human response to given loading situations and motion constraints. The basic elements of performance (BEP) task decomposition methodologies associated with the Human Performance Institute database will be used for performance assessment. Techniques for resolving the statically indeterminate muscular load sharing problem will be used for a detailed understanding of potential musculotendon or ligamentous fatigue, pain, discomfort, and trauma. The envisioned capability is to be used for mechanical system design, human performance assessment, extrapolation of man/machine interaction test data, biomedical engineering, and soft prototyping within a concurrent engineering (CE) system.

  14. Uses of software in digital image analysis: a forensic report

    Science.gov (United States)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide application in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. In this paper the authors explain these tasks, which fall into three categories: image compression, image enhancement and restoration, and measurement extraction, illustrated with examples such as signature comparison, counterfeit currency comparison and footwear sole impressions using the software Canvas and Corel Draw.

  15. Application Reuse Library for Software, Requirements, and Guidelines

    Science.gov (United States)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  16. Standardization of Software Application Development and Governance

    Science.gov (United States)

    2015-03-01

    of their systems or applications. DOD systems do not have the luxury of replacing systems at the same pace as commercial companies. DOD has to...is not that the commercial market purposefully sells products that are not complete, but having a 100% complete product requires extensive testing...develop applications for Google's Android and Apple's iOS devices. Both these companies have SDKs online as well as a number of resources available

  17. Development of Image Analysis Software of MAXI

    Science.gov (United States)

    Eguchi, S.; Ueda, Y.; Hiroi, K.; Isobe, N.; Sugizaki, M.; Suzuki, M.; Tomida, H.; Maxi Team

    2010-12-01

    Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky monitor, attached to the Japanese experiment module Kibo on the International Space Station. The main scientific goals of the MAXI mission include the discovery of X-ray novae followed by prompt alerts to the community (Negoro et al., in this conference), and production of X-ray all-sky maps and new source catalogs with unprecedented sensitivities. To extract the best capabilities of the MAXI mission, we are working on the development of detailed image analysis tools. We utilize maximum likelihood fitting to a projected sky image, where we take account of the complicated detector responses, such as the background and point spread functions (PSFs). The modeling of PSFs, which strongly depend on the orbit and attitude of MAXI, is a key element in the image analysis. In this paper, we present the status of our software development.
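
    The maximum-likelihood fitting mentioned above reduces, in a toy one-dimensional form, to maximizing a Poisson likelihood for a source flux and background given a known PSF. The sketch below is only a caricature of the MAXI analysis (the real responses depend on orbit and attitude); all numbers are invented:

        import numpy as np
        from scipy.optimize import minimize

        x = np.linspace(-5, 5, 101)
        psf = np.exp(-0.5 * x ** 2)
        psf /= psf.sum()                          # normalized toy PSF
        counts = np.random.default_rng(0).poisson(300 * psf + 2.0)

        def neg_log_like(params):
            flux, bkg = params
            model = np.clip(flux * psf + bkg, 1e-9, None)
            # Poisson -log L, dropping the constant log(k!) term
            return (model - counts * np.log(model)).sum()

        res = minimize(neg_log_like, x0=[100.0, 1.0], method="Nelder-Mead")
        print("flux = %.1f, background = %.2f" % tuple(res.x))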

  18. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  19. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-10-01

    Full Text Available The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products, belonging to the orthogonality intervals, are then built. The results of the analysis are presented in graphical form, and aspects of the functioning of the software product allocated for the orthogonality are detailed.
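
    The abstract does not give its orthogonality formula; a common proxy, assumed here purely for illustration, is one minus the cosine similarity between characteristic vectors of two software components:

        import numpy as np

        def orthogonality(a, b):
            # 1.0 means fully orthogonal components, 0.0 means collinear
            a, b = np.asarray(a, float), np.asarray(b, float)
            return 1.0 - a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b))

        p1 = [3, 0, 2, 5, 1]   # hypothetical feature-usage vectors
        p2 = [0, 4, 0, 1, 6]   # of two similar software products
        print("orthogonality = %.2f" % orthogonality(p1, p2))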

  20. Fuzzy relational calculus theory, applications and software

    CERN Document Server

    Peeva, Ketty

    2004-01-01

    This book examines fuzzy relational calculus theory with applications in various engineering subjects. The scope of the text covers unified and exact methods with algorithms for direct and inverse problem resolution in fuzzy relational calculus. Extensive engineering applications of fuzzy relation compositions and fuzzy linear systems (linear, relational and intuitionistic) are discussed. Some examples of such applications include solutions of equivalence, reduction and minimization problems in fuzzy machines, pattern recognition in fuzzy languages, optimization and inference engines in textile and chemical engineering, etc. A comprehensive overview of the authors' original work in fuzzy relational calculus is also provided in each chapter. The attached CD-Rom contains a toolbox with many functions for fuzzy calculations, together with an original algorithm for inverse problem resolution in MATLAB. This book is also suitable for use as a textbook in related courses at advanced undergraduate and graduate level...

  1. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Science.gov (United States)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
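
    As a small taste of the SMT style of reasoning described above, the following uses Z3's public Python bindings to check a simple integer property by searching for a counterexample (the property itself is an invented example, not one from the talk):

        # pip install z3-solver
        from z3 import Ints, Solver, Not, Implies, And, sat

        x, y = Ints("x y")
        claim = Implies(And(x >= 0, x < y), x + 1 <= y)

        s = Solver()
        s.add(Not(claim))                 # look for a counterexample
        if s.check() == sat:
            print("counterexample:", s.model())
        else:
            print("property holds for all integers")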

  2. Software hazard analysis for nuclear digital protection system by Colored Petri Net

    International Nuclear Information System (INIS)

    Bai, Tao; Chen, Wei-Hua; Liu, Zhen; Gao, Feng

    2017-01-01

    Highlights: •A dynamic hazard analysis method is proposed for safety-critical software. •The mechanism relies on Colored Petri Net. •Complex interactions between software and hardware are captured properly. •Common failure modes in software are identified effectively. -- Abstract: The software safety of a nuclear digital protection system is critical for the safety of nuclear power plants, as any software defect may result in severe damage. In order to ensure the safety and reliability of safety-critical digital system products and their applications, software hazard analysis is required to be performed during the lifecycle of software development. A dynamic software hazard modeling and analysis method based on Colored Petri Net is proposed and applied to the safety-critical control software of a nuclear digital protection system in this paper. The analysis results show that the proposed method can explain the complex interactions between software and hardware and identify the potential common cause failures in software properly and effectively. Moreover, the method can find the dominant software-induced hazards to safety control actions, which aids in increasing software quality.

  3. Development and application of new quality model for software projects.

    Science.gov (United States)

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  4. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodetic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
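
    In essence, a knickpoint is an abrupt steepening along a downstream profile extracted from the DEM. A minimal Python sketch of that idea (Knickpoint Finder itself runs on ArcGIS; the threshold and arrays here are assumptions):

        import numpy as np

        def find_knickpoints(distance_m, elevation_m, slope_jump=0.05):
            # profiles ordered from source to mouth; slope positive downhill
            slope = -np.gradient(elevation_m, distance_m)
            dslope = np.diff(slope)
            return np.where(dslope > slope_jump)[0] + 1

        d = np.array([0, 100, 200, 300, 400, 500], dtype=float)
        z = np.array([500, 495, 490, 470, 450, 445], dtype=float)
        print(find_knickpoints(d, z))     # indices of abrupt steepening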

  5. Modularity analysis of automotive control software

    NARCIS (Netherlands)

    Dajsuren, Y.; van den Brand, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software.

  6. Novel Services and Applications Demand Intelligent Software

    DEFF Research Database (Denmark)

    Olsen, Rasmus Løvenstein

    2011-01-01

    The idea of using context for enhancing application and service behavior toward humans is not new, and has its roots in the human mind and how it works, i.e. many decisions, perceptions and understanding of information we make as humans are based on the context we are in. A concrete example from...

  7. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulties in using popular γ analysis software for low-level measurement are discussed. The ROI report file of the ORTEC operating system was chosen as the interface file for writing γ analysis software for low-level measurement. The authors give the software flowchart and an application example and discuss the remaining problems.

  8. Workshop and conference on Grand Challenges applications and software technology

    Energy Technology Data Exchange (ETDEWEB)

    1993-12-31

    On May 4-7, 1993, nine federal agencies sponsored a four-day meeting on Grand Challenge applications and software technology. The objective was to bring High-Performance Computing and Communications (HPCC) Grand Challenge applications research groups supported under the federal HPCC program together with HPCC software technologists to: discuss multidisciplinary computational science research issues and approaches, identify major technology challenges facing users and providers, and refine software technology requirements for Grand Challenge applications research. The first day and a half focused on applications. Presentations were given by speakers from universities, national laboratories, and government agencies actively involved in Grand Challenge research. Five areas of research were covered: environmental and earth sciences; computational physics; computational biology, chemistry, and materials sciences; computational fluid and plasma dynamics; and applications of artificial intelligence. The next day and a half was spent in working groups in which the applications researchers were joined by software technologists. Nine breakout sessions took place: I/O, Data, and File Systems; Parallel Programming Paradigms; Performance Characterization and Evaluation of Massively Parallel Processing Applications; Program Development Tools; Building Multidisciplinary Applications; Algorithms and Libraries I; Algorithms and Libraries II; Graphics and Visualization; and National HPCC Infrastructure.

  9. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition; the latter further comprised character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
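
    Contrast stretching, one of the enhancement techniques listed above, linearly rescales the intensity range; a minimal numpy sketch (the percentile limits are an assumed choice):

        import numpy as np

        def contrast_stretch(img, low_pct=2, high_pct=98):
            # rescale intensities between two percentiles to the full 0..255 range
            lo, hi = np.percentile(img, (low_pct, high_pct))
            out = (img.astype(float) - lo) / (hi - lo)
            return (np.clip(out, 0, 1) * 255).astype(np.uint8)

        img = np.random.randint(60, 120, (4, 4), dtype=np.uint8)  # dull image
        print(contrast_stretch(img))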

  10. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
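
    The concentration calculation itself is not spelled out in the abstract; for PAA with a comparator standard it typically reduces to a ratio of decay-corrected peak areas, roughly as sketched below (the function, units and numbers are illustrative assumptions):

        import math

        def concentration(peak_sample, peak_std, mass_sample, mass_std,
                          conc_std, t_decay_sample, t_decay_std, half_life):
            # relative method: correct both peak areas to the end of irradiation
            lam = math.log(2) / half_life
            a_sample = peak_sample * math.exp(lam * t_decay_sample)
            a_std = peak_std * math.exp(lam * t_decay_std)
            return conc_std * (a_sample / mass_sample) / (a_std / mass_std)

        # counts, grams, ppm, hours (all invented)
        print("%.1f ppm" % concentration(15200, 14800, 0.50, 0.45,
                                         100.0, 2.0, 1.5, 15.0))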

  11. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  12. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.

  13. Application software, domain-specific languages, and language design assistants

    OpenAIRE

    Heering, Jan

    2000-01-01

    While application software does the real work, domain-specific languages (DSLs) are tools to help produce it efficiently, and language design assistants in turn are meta-tools to help produce DSLs quickly. DSLs are already in wide use (HTML for web pages, Excel macros for spreadsheet applications, VHDL for hardware design, ...), but many more will be needed for both new as well as existing application domains. Language design assistants to help develop them currently exist only in...

  14. A review on software testing approaches for cloud applications

    Directory of Open Access Journals (Sweden)

    Tamanna Siddiqui

    2016-09-01

    Full Text Available Cloud computing has emerged as the latest computing paradigm and touches several distinct research areas, including software testing. Testing cloud applications has its own unique characteristics that call for newer testing techniques. Cloud-based testing reduces the need for dedicated hardware and software and provides an adaptable and valuable platform. Testing within the cloud platform is easily manageable based on new test models and criteria. A prioritization approach can be made responsive to build better relationships between test cases, with test cases clustered by priority level, and resources can be used properly by applying a load-balancing algorithm. The cloud promises maximum usage of existing resources; however, security remains a primary concern in the cloud. At present, organizations are increasingly interested in deploying and making use of ready-prepared business applications, in particular because of the short time to market. The lack of capital budgets for software planning and on-premise deployments, along with the swift progression of the cloud, explains the interest in on-demand, SaaS-based business applications. In this paper, different approaches that can help to extend the cloud testing environment are discussed, together with a study of several well-known software testing approaches.

  15. Firing Room Remote Application Software Development & Swamp Works Laboratory Robot Software Development

    Science.gov (United States)

    Garcia, Janette

    2016-01-01

    The National Aeronautics and Space Administration (NASA) is creating a way to send humans beyond low Earth orbit, and later to Mars. Kennedy Space Center (KSC) is working to make this possible by developing a Spaceport Command and Control System (SCCS) which will allow the launch of the Space Launch System (SLS). This paper focuses on the work performed by the author during the first and second parts of her internship as a remote application software developer. During the first part of the internship, the author worked on the SCCS software application layer, assisting multiple ground subsystems teams, including Launch Accessories (LACC) and Environmental Control System (ECS), on the design, development, integration, and testing of remote control software applications. During the second part, the author worked on the development of robot software at the Swamp Works Laboratory, a research and technology development group focused on inventing new technology to help future In-Situ Resource Utilization (ISRU) missions.

  16. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface. Acknowledgments. Authors. INTRODUCTION: An Overview of Knowledge Maps. Introduction: Key Concepts - Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects). The Motivation. The Problem. The Objectives. Overview of Software Stability Concepts. Overview of Knowledge Maps. Pattern Languages versus Knowledge Maps: A Brief Comparison. The Solution. Knowledge Maps Methodology or Concurrent Software Development Model. Why Knowledge Maps? Research Methodology Undertaken. Research Verification and Validation. The Stratification of This Book. Summary

  17. Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming

    Science.gov (United States)

    Fisher, Ward

    2014-05-01

    Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming

  18. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.
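
    At the heart of any PRA quantification is combining minimal cut sets into a top-event probability. SAPHIRE's solving algorithms are far more elaborate, but the two textbook approximations look roughly like this (event names and probabilities are invented):

        def cut_set_prob(cut_set, basic_events):
            p = 1.0
            for e in cut_set:
                p *= basic_events[e]
            return p

        def top_event_probability(cut_sets, basic_events):
            probs = [cut_set_prob(cs, basic_events) for cs in cut_sets]
            rare_event = sum(probs)              # rare-event approximation
            prod = 1.0
            for p in probs:
                prod *= (1.0 - p)
            return rare_event, 1.0 - prod        # min-cut upper bound

        events = {"PUMP-A": 1e-3, "PUMP-B": 1e-3, "VALVE-C": 5e-4}
        cut_sets = [("PUMP-A", "PUMP-B"), ("VALVE-C",)]
        print(top_event_probability(cut_sets, events))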

  19. Change impact analysis for software product lines

    Directory of Open Access Journals (Sweden)

    Jihen Maâzoun

    2016-10-01

    Full Text Available A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the appearance of new features, while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces its impact on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as the original models. The method and its tool are illustrated through an example of an SPL in the Text Editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the impact change management.

  20. Dispersion analysis of biotoxins using HPAC software

    International Nuclear Information System (INIS)

    Wu, A.; Nurthen, N.; Horstman, A.; Watson, R.; Phillips, M.

    2009-01-01

    Biotoxins are emerging threat agents produced by living organisms: bacteria, plants, or animals. Biotoxins are generally classified as cyanotoxins, hemotoxins, necrotoxins, neurotoxins, and cytotoxins. The application of classical biotoxins as weapons of terror has been realized because of their extreme potency and lethality; the ease of their production, transport, and misuse; and the need for prolonged intensive care among affected persons. Recently, emerging biotoxins, such as ricin and T-2 mycotoxin, have been clandestinely used by either terrorist groups or military combat operations. It is thus highly desirable to have a modeling system to simulate dispersion of biotoxins in a terrorist attack scenario in order to provide prompt technical support and casualty estimation to first responders and military rescuers. The Hazard Prediction and Assessment Capability (HPAC) automated software system provides the means to accurately predict the effects of hazardous material released into the atmosphere and its impact on civilian and military populations. The system uses integrated source terms, high-resolution weather forecasts and atmospheric transport and dispersion analyses to model hazard areas produced by military or terrorist incidents and industrial accidents. We have successfully incorporated physical, chemical, epidemiological and biological characteristics of a variety of biotoxins into the HPAC system and have conducted numerous analyses for our emergency responders. The health effects caused by these hazards are closely reflected in the HPAC output results. (author)
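
    HPAC's transport and dispersion models are far more sophisticated, but the underlying idea can be caricatured by the textbook Gaussian plume: the ground-level centerline concentration from a continuous point release. All parameter values below are invented, and the dispersion coefficients would normally come from the downwind distance and stability class:

        import math

        def gaussian_plume(q_g_per_s, u_m_per_s, sigma_y, sigma_z, h_m):
            # ground-level centerline concentration (g/m^3), ground reflection included
            return (q_g_per_s / (math.pi * u_m_per_s * sigma_y * sigma_z)
                    * math.exp(-h_m ** 2 / (2 * sigma_z ** 2)))

        # 10 g/s release, 3 m/s wind, coefficients for roughly 1 km downwind
        print("%.2e g/m^3" % gaussian_plume(10, 3, sigma_y=70, sigma_z=35, h_m=2))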

  1. The SAVI Vulnerability Analysis Software Package

    International Nuclear Information System (INIS)

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results
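    The most-vulnerable-path computation at the heart of ASD-style analysis can be sketched as a shortest-path search: give each path segment a detection probability and find the route that minimises the adversary's cumulative chance of detection. This is a generic illustration, not SAVI's actual algorithm, and the facility graph below is invented.

```python
# Most-vulnerable-path sketch: minimise cumulative detection probability by
# running Dijkstra on edge weights -log(1 - p_detect).
import heapq
from math import exp, log

def most_vulnerable_path(edges, start, goal):
    """edges: {node: [(neighbour, p_detect), ...]}. Returns (p_undetected, path)."""
    queue = [(0.0, start, [start])]          # (sum of -log(1-p), node, path)
    best = {start: 0.0}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return exp(-cost), path          # convert back to a probability
        for nxt, p in edges.get(node, []):
            c = cost - log(1.0 - p)
            if c < best.get(nxt, float("inf")):
                best[nxt] = c
                heapq.heappush(queue, (c, nxt, path + [nxt]))
    return 0.0, []

facility = {                                  # invented example facility
    "offsite":  [("fence", 0.1), ("gate", 0.5)],
    "fence":    [("building", 0.4)],
    "gate":     [("building", 0.2)],
    "building": [("target", 0.3)],
}
print(most_vulnerable_path(facility, "offsite", "target"))
```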

  2. Application of automated reasoning software: procedure generation system verifier

    International Nuclear Information System (INIS)

    Smith, D.E.; Seeman, S.E.

    1984-09-01

    An on-line, automated reasoning software system for verifying the actions of other software or human control systems has been developed. It was demonstrated by verifying the actions of an automated procedure generation system. The verifier uses an interactive theorem prover as its inference engine with the rules included as logic axioms. Operation of the verifier is generally transparent except when the verifier disagrees with the actions of the monitored software. Testing with an automated procedure generation system demonstrates the successful application of automated reasoning software for verification of logical actions in a diverse, redundant manner. A higher degree of confidence may be placed in the verified actions gathered by the combined system

  3. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    Science.gov (United States)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  4. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes we need to understand how these organisations communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  5. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Full Text Available Undertaking adequate risk management by understanding project requirements, and ensuring that viable estimates are made on software projects, requires the extensive application of sophisticated techniques of analysis and interpretation. Informative...

  6. Outsourcing the development of specific application software using the ESA software engineering standards the SPS software Interlock System

    CERN Document Server

    Denis, B

    1995-01-01

    CERN is considering outsourcing as a solution to the reduction of staff. The need to re-engineer the SPS Software Interlock System provided an opportunity to explore the applicability of outsourcing to our specific controls environment, and the ESA PSS-05 standards were selected for the requirements specification, the development, the control and monitoring, and the project management. The software produced by the contractor is now fully operational. After outlining the scope and the complexity of the project, a discussion of the ESA PSS-05 standards will be presented: the choice, the way these standards improve the outsourcing process, the quality they induce, but also the need to adapt them and their limitations in defining the customer-supplier relationship. The success factors and the difficulties of development under contract will also be discussed. The maintenance aspect and the impact on in-house developments will finally be addressed.

  7. Effective Results Analysis for the Similar Software Products’ Orthogonality

    OpenAIRE

    Ion Ivan; Daniel Milodin

    2009-01-01

    The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products, belonging to the orthogonality intervals, are then built. The results of the analysis are presented in graphical form. Detailed aspects are given of the functioning o...
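    The abstract does not give its orthogonality formula; as a rough illustration of how such a measure can be computed, a common proxy represents each component as a feature-count vector and takes orthogonality as one minus the cosine similarity.

```python
# Illustrative orthogonality proxy for two software components described by
# feature-count vectors (invented data; not the paper's actual metric).
import numpy as np

def orthogonality(a: np.ndarray, b: np.ndarray) -> float:
    cos = float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return 1.0 - cos            # 1.0 = fully orthogonal, 0.0 = identical direction

c1 = np.array([4, 0, 2, 1], dtype=float)
c2 = np.array([3, 1, 2, 0], dtype=float)
print(f"orthogonality = {orthogonality(c1, c2):.3f}")
```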

  8. Open-source software platform for medical image segmentation applications

    Science.gov (United States)

    Namías, R.; D'Amato, J. P.; del Fresno, M.

    2017-11-01

    Segmenting 2D and 3D images is a crucial and challenging problem in medical image analysis. Although several image segmentation algorithms have been proposed for different applications, no universal method currently exists. Moreover, their use is usually limited when detection of complex and multiple adjacent objects of interest is needed. In addition, the continually increasing volumes of medical imaging scans require more efficient segmentation software design and highly usable applications. In this context, we present an extension of our previous segmentation framework which allows the combination of existing explicit deformable models in an efficient and transparent way, handling simultaneously different segmentation strategies and interacting with a graphical user interface (GUI). We present the object-oriented design and the general architecture, which consists of two layers: the GUI at the top layer, and the processing core filters at the bottom layer. We apply the framework to segmenting different real-case medical image scenarios on publicly available datasets, including bladder and prostate segmentation from 2D MRI, and heart segmentation in 3D CT. Our experiments on these concrete problems show that this framework facilitates complex and multi-object segmentation goals while providing a fast prototyping open-source segmentation tool.
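    The two-layer design the authors describe, a GUI on top driving composable processing filters below, resembles the classic pipeline pattern; a minimal sketch follows (class and method names are hypothetical, not the framework's real API).

```python
# Two-layer idea in miniature: a GUI-agnostic core of composable filters that
# any front end can drive. Class and method names are invented.
from abc import ABC, abstractmethod

class Filter(ABC):
    @abstractmethod
    def apply(self, image):
        ...

class Threshold(Filter):
    def __init__(self, level):
        self.level = level
    def apply(self, image):
        return [[1 if v > self.level else 0 for v in row] for row in image]

class Pipeline(Filter):
    """Chains filters so different segmentation strategies can be combined."""
    def __init__(self, *stages):
        self.stages = stages
    def apply(self, image):
        for stage in self.stages:
            image = stage.apply(image)
        return image

seg = Pipeline(Threshold(0.5))
print(seg.apply([[0.2, 0.7], [0.9, 0.1]]))
```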

  9. Application of PIMS Software in Monthly Planning of Refinery Production

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This article describes the application of the PIMS software in formulating monthly refinery production plans. Application of the PIMS software can help to solve a series of problems related to the monthly planning of refinery production, such as the optimized selection of crudes and feedstocks, the optimized selection of production scale and processing scheme, the identification of bottlenecks and their mitigation, the optimized selection of turnaround time, and the optimized selection of operating regime, which have increased the economic benefits of refining enterprises. With the further development and improvement of models, the PIMS software will play an increasingly important role in formulating monthly plans of refining operations and in production management at refineries. This article also explores the problems existing in refinery monthly planning, and makes recommendations on developing and improving models and the reporting system, enhancing basic data acquisition, and supporting model maintenance personnel and staff training.
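    Planning tools of the PIMS class formulate such choices as linear programs. A toy crude-selection LP, with invented margins, capacities and yields and solved with SciPy rather than PIMS, shows the shape of the problem.

```python
# Toy crude-selection LP: maximise margin subject to a throughput limit and a
# minimum gasoline-yield requirement. All numbers are invented.
from scipy.optimize import linprog

margin = [-12.0, -9.0]            # $/bbl for crudes A and B (negated: linprog minimises)
A_ub = [[1.0, 1.0],               # crude A + crude B <= 100 000 bbl/d capacity
        [-0.45, -0.30]]           # 0.45*A + 0.30*B >= 38 000 bbl/d gasoline (negated)
b_ub = [100_000.0, -38_000.0]

res = linprog(margin, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("crude slate [bbl/d]:", res.x, " daily margin [$]:", -res.fun)
```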

  10. Analysis for Parallel Execution without Performing Hardware/Software Co-simulation

    OpenAIRE

    Muhammad Rashid

    2014-01-01

    Hardware/software co-simulation improves the performance of embedded applications by executing the applications on a virtual platform before the actual hardware is available in silicon. However, the virtual platform of the target architecture is often not available during early stages of the embedded design flow. Consequently, analysis for parallel execution without performing hardware/software co-simulation is required. This article presents an analysis methodology for parallel execution of ...

  11. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis named IMMAN (an acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction and feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
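    The core computation the abstract names, equal-interval discretisation followed by an information-gain-style ranking criterion, can be sketched generically (these are the standard formulas, not IMMAN's code).

```python
# Equal-interval discretisation + information gain IG(Y; X) = H(Y) - H(Y|X).
import numpy as np

def discretise(x: np.ndarray, bins: int = 10) -> np.ndarray:
    """Equal-interval discretisation of a continuous molecular descriptor."""
    edges = np.linspace(x.min(), x.max(), bins + 1)
    return np.digitize(x, edges[1:-1])          # bin indices 0 .. bins-1

def entropy(labels: np.ndarray) -> float:
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def information_gain(feature: np.ndarray, y: np.ndarray, bins: int = 10) -> float:
    x = discretise(feature, bins)
    h_cond = sum((x == v).mean() * entropy(y[x == v]) for v in np.unique(x))
    return entropy(y) - h_cond

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)                     # binary class labels
informative = y + rng.normal(0, 0.3, 200)       # descriptor correlated with y
noise = rng.normal(0, 1, 200)                   # uninformative descriptor
print(information_gain(informative, y), information_gain(noise, y))
```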

  12. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making the ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base to develop such a common toolset. In addition to being in the public domain and being set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.
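    The list-mode processing the plugin performs, replaying event-by-event records through an energy gate and accumulating a map, reduces to a few lines of array code; the event-record layout below is invented for illustration.

```python
# Event-by-event energy gating and 2-D map accumulation, sketched with NumPy.
import numpy as np

def gated_map(events: np.ndarray, e_lo: float, e_hi: float, size: int = 256):
    """events: rows of (x, y, energy) from a list-mode file (invented layout)."""
    x, y, e = events[:, 0], events[:, 1], events[:, 2]
    keep = (e >= e_lo) & (e <= e_hi)            # the energy gate/sort
    counts, _, _ = np.histogram2d(x[keep], y[keep],
                                  bins=size, range=[[0, size], [0, size]])
    return counts

rng = np.random.default_rng(1)
events = np.column_stack([rng.uniform(0, 256, 10_000),
                          rng.uniform(0, 256, 10_000),
                          rng.normal(1500.0, 60.0, 10_000)])  # PIXE-like energies
print(int(gated_map(events, 1400.0, 1600.0).sum()), "events inside the gate")
```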

  13. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF, …) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making the ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base to develop such a common toolset. In addition to being in the public domain and being set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ 'ion beam' plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.

  14. Application of Plagiarism Screening Software in the Chemical Engineering Curriculum

    Science.gov (United States)

    Cooper, Matthew E.; Bullard, Lisa G.

    2014-01-01

    Plagiarism is an area of increasing concern for written ChE assignments, such as laboratory and design reports, due to ease of access to text and other materials via the internet. This study examines the application of plagiarism screening software to four courses in a university chemical engineering curriculum. The effectiveness of plagiarism…

  15. Metabolic interrelationships software application: Interactive learning tool for intermediary metabolism

    NARCIS (Netherlands)

    A.J.M. Verhoeven (Adrie); M. Doets (Mathijs); J.M.J. Lamers (Jos); J.F. Koster (Johan)

    2005-01-01

    We developed and implemented the software application titled Metabolic Interrelationships as a self-learning and -teaching tool for intermediary metabolism. It is used by undergraduate medical students in an integrated organ systems-based and disease-oriented core curriculum, which

  16. Application software, domain-specific languages, and language design assistants

    NARCIS (Netherlands)

    J. Heering (Jan)

    2000-01-01

    While application software does the real work, domain-specific languages (DSLs) are tools to help produce it efficiently, and language design assistants in turn are meta-tools to help produce DSLs quickly. DSLs are already in wide use (HTML for web pages, Excel macros for spreadsheet

  17. QFD Application to a Software - Intensive System Development Project

    Science.gov (United States)

    Tran, T. L.

    1996-01-01

    This paper describes the use of Quality Function Deployment (QFD), adapted to requirements engineering for a software-intensive system development project, and synthesizes the lessons learned from the application of QFD to the Network Control System (NCS) pre-project of the Deep Space Network.

  18. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    Full Text Available This paper presents a critical analysis of the signal processing flow carried out in GPS receiver software, which served as a basis for a critical comparison of different signal processing architectures within the GPS receiver. Increased flexibility and reduced commercial costs of GPS devices, including mobile devices, can be achieved by using software defined radio (SDR) technology. The SDR approach can be realized when certain hardware components in a GPS receiver are replaced. Signal processing in the SDR is implemented using a programmable DSP (Digital Signal Processing) or FPGA (Field Programmable Gate Array) circuit, which allows a simple change of digital signal processing algorithms and a simple change of the receiver parameters. The starting point of the research is the signal generated on the satellite, whose structure is shown in the paper. Based on the GPS signal structure, a receiver is realized with the task of extracting the appropriate signal from the spectrum and detecting it. Based on the collected navigation data, the receiver calculates the position of the end user. The signal coming from the satellite may be on the L1 or L2 carrier frequency. Since the SPS is used in the civilian service, all the tests shown in this work were performed on the L1 signal. The signal arriving at the receiver is generated with spread spectrum technology and lies below the noise level. Such signals often suffer interference from signals in the environment, which makes proper detection and signal processing difficult for the receiver. Therefore, signal processing technology is continually being improved, aiming at more accurate and faster signal processing. All tests were carried out on a signal acquired from the satellite using the SE4110 input circuit used for filtering, amplification and signal selection. The samples of the received signal were forwarded to a computer for data post processing, i. e
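    The acquisition stage described here, detecting a spread-spectrum signal buried in noise, is commonly done in SDR receivers by FFT-based parallel code-phase search. A self-contained sketch follows; a random ±1 sequence stands in for the satellite's real C/A Gold code, and all rates and numbers are illustrative.

```python
# FFT-based parallel code-phase acquisition: wipe off a trial carrier, then
# circularly correlate against the local code for every code phase at once.
import numpy as np

fs, n = 2_046_000.0, 2046                      # 1 ms of baseband samples
rng = np.random.default_rng(2)
code = rng.choice([-1.0, 1.0], 1023).repeat(2) # stand-in for a C/A Gold code

# Simulated reception: circularly delayed code with 1.2 kHz Doppler, in noise.
delay, doppler = 350, 1200.0
t = np.arange(n) / fs
rx = (np.roll(code, delay) * np.exp(2j * np.pi * doppler * t)
      + 3.0 * (rng.normal(size=n) + 1j * rng.normal(size=n)))

code_fft = np.conj(np.fft.fft(code))
best = (0.0, 0.0, 0)
for f_d in np.arange(-5000.0, 5001.0, 500.0):  # Doppler search bins
    wiped = rx * np.exp(-2j * np.pi * f_d * t)                # carrier wipe-off
    corr = np.abs(np.fft.ifft(np.fft.fft(wiped) * code_fft))  # all code phases
    if corr.max() > best[0]:
        best = (corr.max(), f_d, int(corr.argmax()))
print(f"Doppler bin {best[1]:.0f} Hz, code phase {best[2]} samples")
```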

  19. Software representation methodology for agile application development: An architectural approach

    Directory of Open Access Journals (Sweden)

    Alejandro Paolo Daza Corredor

    2016-06-01

    Full Text Available The generation of Web applications involves the execution of repetitive tasks: determining information structures, generating different types of components and, finally, deploying and tuning the applications. In many applications of this type, the same components are generated from application to application. Current trends in software engineering, such as MDE, MDA and MDD, aim to automate the generation of applications by structuring a model and applying transformations to obtain the application. This document proposes an architectural foundation that facilitates the generation of these applications, relying on model-driven architecture without ignoring the existence and relevance of the architectural trends mentioned in this summary.

  20. Accident Damage Analysis Module (ADAM) – Technical Guidance, Software tool for Consequence Analysis calculations

    OpenAIRE

    FABBRI LUCIANO; BINDA MASSIMO; BRUINEN DE BRUIN YURI

    2017-01-01

    This report provides a technical description of the modelling and assumptions of the Accident Damage Analysis Module (ADAM) software application, recently developed by the Joint Research Centre (JRC) of the European Commission (EC) to assess the physical effects of an industrial accident resulting from an unintended release of a dangerous substance.

  1. Software application for quality control protocol of mammography systems

    International Nuclear Information System (INIS)

    Kjosevski, Vladimir; Gershan, Vesna; Ginovska, Margarita; Spasevska, Hristina

    2010-01-01

    Considering that the quality control of a mammographic system involves testing a large number of parameters, there is a clear need for information technology for gathering, processing and storing all the parameters that result from this process. The main goal of this software application is to facilitate and automate the gathering, processing, storing and presentation of the data related to the qualification of the physical and technical parameters during the quality control of the mammographic system. The software application, along with its user interface and database, has been built with Microsoft Access 2003, which is part of the Microsoft Office 2003 software package and was chosen as the development platform because it is the most commonly used office application among computer users in the country. This is important because it provides the end users with a familiar working environment, without the need for additional training or upgrading of their computer skills. Most importantly, the software application is easy to use, fast in calculating the needed parameters, and an excellent way to store and display the results. There is a possibility of scaling this software solution up so that it can be used by many different users at the same time over the Internet. It is highly recommended that this system be implemented as soon as possible in the quality control process of mammographic systems, due to its many advantages. (Author)

  2. COMPARATIVE ANALYSIS OF SOFTWARE DEVELOPMENT MODELS

    OpenAIRE

    Sandeep Kaur*

    2017-01-01

    No geek is unfamiliar with the concept of the software development life cycle (SDLC). This research deals with the various SDLC models, covering the waterfall, spiral, iterative, agile, V-shaped and prototype models. In the modern era, all software systems are fallible, as none can stand with certainty. This research therefore compares all aspects of the various models, with their pros and cons, so that it is easy to choose a particular model at the time of need.

  3. CAX a software for automated spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported into any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which produces a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as the possibility of transcription errors. (author)

  4. CAX a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported into any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV software which produces a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses, as well as the possibility of transcription errors. (author)
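    The DAT2CSV step, producing a Brazilian-convention CSV with comma decimal indicators and semicolon field separators, is simple to reproduce; a sketch with invented column names and rows follows.

```python
# Write results in the Brazilian CSV convention: ';' separates fields and ','
# marks decimals, so locally configured spreadsheets import them directly.
import csv

results = [("Cs-137", 661.66, 1234.5), ("Co-60", 1332.50, 87.3)]  # invented rows

with open("results.csv", "w", newline="", encoding="utf-8") as fh:
    writer = csv.writer(fh, delimiter=";")
    writer.writerow(["nuclide", "energy_keV", "activity_Bq"])
    for nuclide, energy, activity in results:
        writer.writerow([nuclide,
                         f"{energy:.2f}".replace(".", ","),
                         f"{activity:.1f}".replace(".", ",")])
```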

  5. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    International Nuclear Information System (INIS)

    Mohd Idris Taib; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2013-01-01

    This software is being developed in LabVIEW for use with the StellarNet Spectrometer system. The StellarNet Spectrometer is supplied with the SpectraWiz operating software, which can measure spectral data for real-time spectroscopy. The LabVIEW software accesses real-time data through the SpectraWiz dynamic link library as the hardware interface. It acquires the amplitude of every electromagnetic wavelength at periodic intervals. In addition to hardware interfacing, the user interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance modes. This software can surely be used for research and development in the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet radiation. The off-line capabilities of this software are almost unlimited due to the availability of mathematical and signal processing functions in the LabVIEW add-on library. (author)

  6. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system and will also incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that exist now or will appear in the future. On the other hand, it should be reliable and possess production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  7. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health-related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact with existing or shrinking budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with the existing system components of a facility looking to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current systems, while providing a solid path for development into the future. (author)

  8. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  9. Core analysis: new features and applications

    International Nuclear Information System (INIS)

    Edenius, M.; Kurcyusz, E.; Molina, D.; Wiksell, G.

    1995-01-01

    Today, core analysis may be performed with sophisticated software capable of both steady-state and transient analysis, using a common methodology for BWRs and PWRs. General trends in core analysis software development are improved accuracy, automated engineering functions, three-dimensional transient capability, and graphical user interfaces. As a demonstration of such software, new features of the Studsvik CMS (Core Management System) and examples of applications are discussed in this article. 2 figs., 8 refs

  10. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  11. Identifying trace evidence in data wiping application software

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2012-06-01

    Full Text Available One area of particular concern for computer forensics examiners involves situations in which someone utilized software applications to destroy evidence. There are products available in the marketplace that are relatively inexpensive and advertised as being able to destroy targeted portions of data stored within a computer system. This study was undertaken to identify these tools and analyze them to determine the extent to which each of the evaluated data wiping applications performs its tasks and to identify trace evidence, if any, left behind on disk media after executing these applications. We evaluated five Windows 7-compatible software products whose advertised features include the ability for users to wipe targeted files, folders, or evidence of selected activities. We conducted a series of experiments that involved executing each application on systems with identical data, and we then analyzed the results and compared the before and after images for each application. We identified information for each application that is beneficial to forensics examiners when faced with similar situations. This paper describes our application selection process, our application evaluation methodology, and our findings. Following this, we describe limitations of this study and suggest areas of additional research that will benefit the study of digital forensics.

  12. Final Report. Center for Scalable Application Development Software

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-26

    The Center for Scalable Application Development Software (CScADS) was established as a partnership between Rice University, Argonne National Laboratory, University of California Berkeley, University of Tennessee – Knoxville, and University of Wisconsin – Madison. CScADS pursued an integrated set of activities with the aim of increasing the productivity of DOE computational scientists by catalyzing the development of systems software, libraries, compilers, and tools for leadership computing platforms. Principal Center activities were workshops to engage the research community in the challenges of leadership computing, research and development of open-source software, and work with computational scientists to help them develop codes for leadership computing platforms. This final report summarizes CScADS activities at Rice University in these areas.

  13. The application of image processing software: Photoshop in environmental design

    Science.gov (United States)

    Dong, Baohua; Zhang, Chunmi; Zhuo, Chen

    2011-02-01

    In the process of environmental design and creation, the design sketch holds a very important position in that it not only illuminates the design's idea and concept but also shows the design's visual effects to the client. In the field of environmental design, computer-aided design has brought significant improvement. Many types of specialized design software for rendering environmental drawings and for artistic post-processing have been implemented. Additionally, with the use of this software, working efficiency has greatly increased and drawings have become more specific and more specialized. By analyzing the application of the Photoshop image processing software in environmental design, and by comparing and contrasting traditional hand drawing with drawing using modern technology, this essay will further explore the way for computer technology to play a bigger role in environmental design.

  14. Case Analysis on Application of SPSS Software in Forestry Production and Scientific Research

    Institute of Scientific and Technical Information of China (English)

    琚松苗

    2012-01-01

    SPSS (Statistical Package for the Social Sciences), a statistical analysis tool, provides functions for data management, statistical analysis, trend studies, tabulation and drawing, word processing and so on. In this paper the application of the statistical software SPSS in forestry production and scientific research is introduced with typical cases, from the perspective of forestry science and technology promotion.

  15. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  16. Correct software in web applications and web services

    CERN Document Server

    Thalheim, Bernhard; Prinz, Andreas; Buchberger, Bruno

    2015-01-01

    The papers in this volume aim at obtaining a common understanding of the challenging research questions in web applications comprising web information systems, web services, and web interoperability; obtaining a common understanding of verification needs in web applications; achieving a common understanding of the available rigorous approaches to system development, and the cases in which they have succeeded; identifying how rigorous software engineering methods can be exploited to develop suitable web applications; and at developing a European-scale research agenda combining theory, methods a

  17. Development of tools for safety analysis of control software in advanced reactors

    Energy Technology Data Exchange (ETDEWEB)

    Guarro, S.; Yau, M.; Motamed, M. [Advanced Systems Concepts Associates, El Segundo, CA (United States)

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  18. Development of tools for safety analysis of control software in advanced reactors

    International Nuclear Information System (INIS)

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described

  19. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  20. Design and validation of Segment - freely available software for cardiovascular image analysis

    International Nuclear Information System (INIS)

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-01

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page (http://segment.heiberg.se). Segment

  1. Bifurcation software in Matlab with applications in neuronal modeling.

    Science.gov (United States)

    Govaerts, Willy; Sautois, Bart

    2005-02-01

    Many biological phenomena, notably in neuroscience, can be modeled by dynamical systems. We describe a recent improvement of a Matlab software package for dynamical systems with applications to modeling single neurons and all-to-all connected networks of neurons. The new software features consist of an object-oriented approach to bifurcation computations and the partial inclusion of C-code to speed up the computation. As an application, we study the origin of the spiking behaviour of neurons when the equilibrium state is destabilized by an incoming current. We show that Class II behaviour, i.e. firing with a finite frequency, is possible even if the destabilization occurs through a saddle-node bifurcation. Furthermore, we show that synchronization of an all-to-all connected network of such neurons with only excitatory connections is also possible in this case.
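    The destabilisation mechanism described, an equilibrium losing stability as the injected current grows, can be reproduced with any planar neuron model by tracking the Jacobian eigenvalues at the equilibrium. A sketch with the FitzHugh-Nagumo model follows; this model and its parameters are stand-ins, not the paper's software or examples.

```python
# Track the equilibrium of a FitzHugh-Nagumo neuron as the input current I
# rises, and classify its stability from the Jacobian eigenvalues.
import numpy as np
from scipy.optimize import fsolve

a, b, eps = 0.7, 0.8, 0.08                     # standard FHN parameters

def rhs(state, current):
    v, w = state
    return [v - v**3 / 3 - w + current, eps * (v + a - b * w)]

def jacobian(v):
    return np.array([[1 - v**2, -1.0], [eps, -eps * b]])

for current in (0.0, 0.3, 0.6, 0.9):
    v, w = fsolve(rhs, [-1.0, -0.5], args=(current,))
    eig = np.linalg.eigvals(jacobian(v))
    state = "stable rest" if np.all(eig.real < 0) else "unstable -> firing"
    print(f"I={current:.1f}  v*={v:+.3f}  max Re(eig)={eig.real.max():+.4f}  {state}")
```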

  2. How can operations get the applications software that they want?

    CERN Document Server

    Bailey, R

    1995-01-01

    In SL division at CERN, any major development of application software to be used in the control room entails interaction and a division of responsibilities between people from operations, controls, accelerator physics and equipment groups. Providing the suite of applications for the operators to use for efficient exploitation of the machines is no exception to this. Indeed since routine operation accounts for the largest fraction of machine time, it should be regarded as the most important activity. We have found over 10 years that teams led by operations people, making pragmatic use of formal development methods, prove to be very effective in producing the software required for running the machines. This paper examines the modus operandi of some operations-driven projects, and contrasts this with some less successful ventures.

  3. Reusable Software Usability Specifications for mHealth Applications.

    Science.gov (United States)

    Cruz Zapata, Belén; Fernández-Alemán, José Luis; Toval, Ambrosio; Idri, Ali

    2018-01-25

    One of the key factors for the adoption of mobile technologies, and in particular of mobile health applications, is usability. A usable application will be easier to use and understand by users, and will improve user's interaction with it. This paper proposes a software requirements catalog for usable mobile health applications, which can be used for the development of new applications, or the evaluation of existing ones. The catalog is based on the main identified sources in literature on usability and mobile health applications. Our catalog was organized according to the ISO/IEC/IEEE 29148:2011 standard and follows the SIREN methodology to create reusable catalogs. The applicability of the catalog was verified by the creation of an audit method, which was used to perform the evaluation of a real app, S Health, application created by Samsung Electronics Co. The usability requirements catalog, along with the audit method, identified several usability flaws on the evaluated app, which scored 83%. Some flaws were detected in the app related to the navigation pattern. Some more issues related to the startup experience, empty screens or writing style were also found. The way a user navigates through an application improves or deteriorates user's experience with the application. We proposed a reusable usability catalog and an audit method. This proposal was used to evaluate a mobile health application. An audit report was created with the usability issues identified on the evaluated application.

  4. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing food nutrition component statistical analysis software can automate nutrition calculation, improve the nutrition professional's working efficiency and achieve the informatization of nutrition propaganda and education. In the software development process, software engineering methods and database technology are used to calculate the human daily nutritional intake, and an intelligent system is used to evaluate the user's hea...

  5. Selection of bioprocess simulation software for industrial applications.

    Science.gov (United States)

    Shanklin, T; Roper, K; Yegneswaran, P K; Marten, M R

    2001-02-20

    Two commercially available process-simulation software packages (Aspen Batch Plus v1.2, Aspen Technology, Inc., Cambridge, Massachusetts, and Intelligen SuperPro v3.0, Intelligen, Inc., Scotch Plains, New Jersey) are evaluated for use in modeling industrial biotechnology processes. The software is quantitatively evaluated by Kepner-Tregoe Decision Analysis (Kepner and Tregoe, 1981). This evaluation shows that Aspen Batch Plus v1.2 (ABP) and Intelligen SuperPro v3.0 (ISP) can successfully perform specific simulation tasks but do not provide a complete model of all phenomena occurring within a biotechnology process. The software is best suited to providing a format for process management, using material and energy balances to answer scheduling questions, explore equipment change-outs, and calculate cost data. The ability of simulation software to accurately predict unit operation scale-up and optimize bioprocesses is limited. To realistically evaluate the software, a vaccine manufacturing process under development at Merck & Company is simulated. Case studies from the vaccine process are presented as examples of how ABP and ISP can be used to shed light on real-world processing issues. Copyright 2001 John Wiley & Sons, Inc.
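    Kepner-Tregoe Decision Analysis is, at its core, weighted scoring: weight each criterion, score each alternative, and compare weighted totals. A sketch with invented weights and scores (not the paper's data) follows.

```python
# Kepner-Tregoe style weighted scoring of two alternatives. All weights and
# scores below are invented placeholders.
criteria = {   # criterion: (weight 1-10, {alternative: score 1-10})
    "material/energy balances": (10, {"ABP": 8, "ISP": 8}),
    "scheduling support":       (8,  {"ABP": 9, "ISP": 7}),
    "cost analysis":            (6,  {"ABP": 7, "ISP": 8}),
    "ease of use":              (5,  {"ABP": 6, "ISP": 8}),
}

totals = {alt: 0 for alt in ("ABP", "ISP")}
for weight, scores in criteria.values():
    for alt, score in scores.items():
        totals[alt] += weight * score
print(totals)          # the higher weighted total is preferred
```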

  6. Cross-instrument Analysis Correlation Software

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-28

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEM, microscopes, micro X-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed for easy entry of the positions of fiducials and locations of interest, such that in a future session on the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform the points into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based extensible markup language (XML) files.
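    The re-finding step reduces to fitting a coordinate transform from fiducials measured in both sessions and applying it to stored points of interest. A least-squares 2-D affine fit is one straightforward way to do this; the sketch below is illustrative, not necessarily the program's exact formulation.

```python
# Fit a 2-D affine transform from reference-session fiducials to the same
# fiducials re-measured in the current session, then map a stored point.
import numpy as np

def fit_affine(ref: np.ndarray, cur: np.ndarray) -> np.ndarray:
    """ref, cur: (N, 2) matched fiducial coordinates, N >= 3."""
    A = np.hstack([ref, np.ones((len(ref), 1))])   # rows [x, y, 1]
    T, *_ = np.linalg.lstsq(A, cur, rcond=None)    # (3, 2) transform matrix
    return T

def to_current(points: np.ndarray, T: np.ndarray) -> np.ndarray:
    return np.hstack([points, np.ones((len(points), 1))]) @ T

ref_fids = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # invented data
cur_fids = np.array([[2.0, 1.0], [12.1, 1.2], [1.8, 11.0]])   # shifted stage
T = fit_affine(ref_fids, cur_fids)
print(to_current(np.array([[5.0, 5.0]]), T))     # re-found point of interest
```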

  7. Nuclear Fuel Depletion Analysis Using Matlab Software

    Science.gov (United States)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order IVPs are frequently used in many parts of engineering and the sciences. In this article, we present a code comprising three computer programs which work jointly with the Matlab software to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Reactor (PWR) is computed by the present code.
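    The kind of coupled first-order IVP involved can be shown with a simplified 238U → 239Np → 239Pu depletion chain under constant flux, solved here with SciPy rather than the article's Matlab code. The short-lived 239U step is lumped into the capture term, and the one-group constants below are illustrative, not the article's data.

```python
# Simplified Pu-239 production chain as a coupled first-order IVP.
import numpy as np
from scipy.integrate import solve_ivp

phi = 3e13                               # neutron flux [n/cm^2/s] (illustrative)
sig_u8, sig_pu9 = 2.7e-24, 1.0e-21       # one-group cross-sections [cm^2]
lam_np = np.log(2) / (2.36 * 86400)      # 239Np decay constant [1/s]

def chain(t, n):
    u8, np9, pu9 = n
    return [-sig_u8 * phi * u8,                        # 238U capture
            sig_u8 * phi * u8 - lam_np * np9,          # 239Np build-up and decay
            lam_np * np9 - sig_pu9 * phi * pu9]        # 239Pu build-up and burn-up

n0 = [1.0e22, 0.0, 0.0]                  # initial number densities [1/cm^3]
sol = solve_ivp(chain, (0.0, 3.15e7), n0, method="LSODA", rtol=1e-8)
print(f"239Pu after one year: {sol.y[2, -1]:.3e} atoms/cm^3")
```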

  8. Application of knowledge based software to industrial automation in Japan

    International Nuclear Information System (INIS)

    Matsumoto, Yoshihiro

    1985-01-01

    In Japan, large industrial undertakings such as electric utilities or steel works are making first steps towards knowledge engineering, testing the applicability of knowledge based software to industrial automation. The goal is to achieve more intelligent, computer-aided assistance for the personnel and thus to enhance safety, reliability, and maintenance efficiency in large industrial plants. The article presents various examples showing advantages and draw-backs of such systems, and potential applications among others in nuclear or fossil fueled power plants or in electricity supply control systems. (orig./HP) [de

  9. Software architecture for time-constrained machine vision applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
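    The messaging layer's dynamic publish/subscribe with topic-based filtering can be captured in a few lines; the real architecture adds threading, priorities and memory management on top of this core idea, and the class names below are invented.

```python
# Minimal topic-based publish/subscribe bus.
from collections import defaultdict
from typing import Any, Callable

class MessageBus:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subs[topic].append(handler)      # dynamic subscription

    def publish(self, topic: str, message: Any) -> None:
        for handler in self._subs[topic]:      # topic-based filtering
            handler(message)

bus = MessageBus()
bus.subscribe("frames", lambda m: print("visualizer got", m))
bus.subscribe("frames", lambda m: print("jam detector got", m))
bus.publish("frames", {"id": 1, "ts": 0.033})
```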

  10. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, Failure Modes and Effects Analysis (FMEA) for the system was performed. The feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazard Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A checklist for management control was produced via a walk-through technique. Based on the evaluation of the checklist, activities to be performed in the requirement phase were determined. In the design phase, hazard analysis was performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA were checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm was selected as a sample for applying the FTA method to software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system was enhanced during all software life cycle phases.
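
    For readers unfamiliar with the quantitative side of FTA, a minimal sketch of how gate probabilities combine for independent basic events follows; the tree and numbers below are illustrative and are not those of the pressurizer low-pressure trip algorithm:

    ```python
    import math

    # Minimal fault-tree evaluation with independent basic events:
    # OR gates via complements, AND gates via products.
    def gate_or(*p):   # P(at least one input event occurs)
        return 1.0 - math.prod(1.0 - pi for pi in p)

    def gate_and(*p):  # P(all input events occur)
        return math.prod(p)

    # Hypothetical basic-event probabilities for a trip-algorithm tree:
    p_sensor = 1e-3   # pressure input fails low
    p_logic  = 1e-4   # comparison logic defect
    p_output = 5e-4   # trip output not asserted
    p_top = gate_or(gate_and(p_sensor, p_logic), p_output)
    print(f"top event probability = {p_top:.2e}")
    ```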

  11. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single-dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed using the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
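
    The one-dimensional isothermal case mentioned above has the simple closed-form solution I_nu = B_nu(T_ex)(1 - e^(-tau)). A sketch of that core relation, assuming unit beam filling factor and neglecting any background term (this is an illustration, not myXCLASS code):

    ```python
    import numpy as np

    H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # SI constants

    def planck(nu_hz, t_kelvin):
        """Planck brightness B_nu(T) in W m^-2 Hz^-1 sr^-1."""
        x = H * nu_hz / (K * t_kelvin)
        return 2 * H * nu_hz**3 / C**2 / np.expm1(x)

    def intensity(nu_hz, t_ex, tau):
        """Emergent intensity of an isothermal slab of optical depth tau."""
        return planck(nu_hz, t_ex) * (1.0 - np.exp(-tau))

    print(intensity(230.538e9, 50.0, 0.3))  # e.g. CO(2-1) at 50 K, tau = 0.3
    ```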

  12. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  13. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed in-house by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system that manages, easily and reliably, the large volumes of data these packages require. (author) 3 refs., 7 figs., 2 tabs.

  14. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model, together with testability analysis output from the TEAMS Designer software, to create a series of six reports for various system engineering needs. The ETA Tool also allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
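
    The six report formats are specific to the ETA Tool, but the sensor-loss sensitivity study described above can be pictured as recomputing fault coverage on a detectability matrix; a minimal sketch with a hypothetical three-fault, three-test D-matrix:

    ```python
    import numpy as np

    # Rows = faults, columns = tests; D[i, j] = True if test j detects fault i
    # (hypothetical detectability matrix in the spirit of a TEAMS D-matrix).
    D = np.array([[1, 0, 1],
                  [0, 1, 0],
                  [0, 0, 1]], dtype=bool)

    def coverage(d_matrix, lost_tests=()):
        """Fraction of faults still detected after the given tests are lost."""
        keep = [j for j in range(d_matrix.shape[1]) if j not in set(lost_tests)]
        return d_matrix[:, keep].any(axis=1).mean()

    print(coverage(D))                  # baseline detection coverage: 1.0
    print(coverage(D, lost_tests=[2]))  # sensitivity to losing test 2: ~0.67
    ```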

  15. Usability study of clinical exome analysis software: top lessons learned and recommendations.

    Science.gov (United States)

    Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W

    2014-10-01

    …dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to the identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future development of genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Analysis and design of software ecosystem architectures – Towards the 4S telemedicine ecosystem

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Hansen, Klaus Marius; Kyng, Morten

    2014-01-01

    Background: …, and application stove-pipes that inhibit the adoption of telemedical solutions. To which extent can a software ecosystem approach to telemedicine alleviate this? Objective: In this article, we define the concept of software ecosystem architecture as the structure(s) of a software ecosystem comprising elements… Methods: For i), we performed a descriptive, revelatory case study of the Danish telemedicine ecosystem, and for ii), we experimentally designed, implemented, and evaluated the architecture of 4S. Results: We contribute in three areas. First, we define the software ecosystem architecture concept that captures organization… experience in creating and evolving the 4S telemedicine ecosystem. Conclusion: The concept of software ecosystem architecture can be used analytically and constructively in, respectively, the analysis and design of software ecosystems.

  17. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.

  18. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Background: Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving the power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical analysis software packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results: We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error-trapping facilities and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions: The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation, and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
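
    The per-variant fixed-effect, inverse-variance calculation at the heart of such meta-analysis software can be sketched as follows (illustrative numbers; this is not GWAMA source code):

    ```python
    import numpy as np

    def fixed_effect_meta(betas, ses):
        """Inverse-variance-weighted fixed-effect meta-analysis of per-study
        effect sizes (betas) and standard errors (ses) for one variant."""
        betas, ses = np.asarray(betas, float), np.asarray(ses, float)
        w = 1.0 / ses**2                       # inverse-variance weights
        beta = np.sum(w * betas) / np.sum(w)   # pooled effect
        se = np.sqrt(1.0 / np.sum(w))          # pooled standard error
        z = beta / se                          # test statistic
        return beta, se, z

    # Three hypothetical studies of the same SNP:
    print(fixed_effect_meta([0.12, 0.08, 0.15], [0.05, 0.04, 0.07]))
    ```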

  19. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    The issues of quality, the cost of poor quality, and the factors affecting quality are crucial to maintaining competitiveness in business. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their joint impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced. The first is retrospective: the cost of poor quality and the production process are calculated from historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation and thus provides a prospective view of the cost of poor quality. The simulation output, in the form of tornado and sensitivity charts, complements the risk analysis.
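
    A minimal sketch of the prospective, simulation-based approach: sample the input variables from assumed distributions, propagate them to a cost of poor quality, and rank the inputs as a tornado/sensitivity chart would (all distributions below are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000  # simulation trials

    # Hypothetical input distributions for one production period:
    defect_rate = rng.triangular(0.01, 0.02, 0.05, N)  # fraction defective
    units       = rng.normal(10_000, 500, N)           # units produced
    unit_cost   = rng.uniform(8.0, 12.0, N)            # rework cost per defect

    cost_of_poor_quality = defect_rate * units * unit_cost
    print(f"mean = {cost_of_poor_quality.mean():,.0f}")
    print(f"P95  = {np.percentile(cost_of_poor_quality, 95):,.0f}")

    # Rank correlations against inputs are what drive a tornado chart:
    rank = lambda v: np.argsort(np.argsort(v))
    for name, x in [("defect_rate", defect_rate), ("units", units),
                    ("unit_cost", unit_cost)]:
        r = np.corrcoef(rank(x), rank(cost_of_poor_quality))[0, 1]
        print(name, round(r, 2))
    ```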

  20. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns…, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate…, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  1. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on the CERN virtual machine (CernVM). Further, an HTTP-based file system, the CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud computing on private and public clouds, for both legacy and contemporary experiments.

  2. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    The application of NQA-1 quality assurance standards to computer software is a recent development at the Oak Ridge National Laboratory (ORNL). One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was established to formulate and utilize efficient and comprehensive methods for determining the sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the ORNL Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 quality assurance plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software.
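
    GRESS obtains derivatives by enhancing the program code itself; purely as an illustration of the quantity it produces, not of its method, sensitivities of a toy model result to all inputs can be estimated by central differences:

    ```python
    import numpy as np

    def sensitivities(model, params, rel_step=1e-6):
        """Central-difference estimates of d(result)/d(param_i), the kind
        of derivative information a gradient-enhanced code generates."""
        params = np.asarray(params, dtype=float)
        base = model(params)
        grads = np.empty_like(params)
        for i, p in enumerate(params):
            h = rel_step * max(abs(p), 1.0)
            up, dn = params.copy(), params.copy()
            up[i] += h
            dn[i] -= h
            grads[i] = (model(up) - model(dn)) / (2.0 * h)
        return base, grads

    # Toy transport-like model: result depends nonlinearly on 3 inputs.
    model = lambda p: p[0] * np.exp(-p[1]) + p[2] ** 2
    print(sensitivities(model, [2.0, 0.5, 3.0]))
    ```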

  3. Experimental analysis of specification language impact on NPP software diversity

    International Nuclear Information System (INIS)

    Yoo, Chang Sik; Seong, Poong Hyun

    1998-01-01

    When redundancy and diversity are applied in NPP digital computer systems, diversification of the system software may be a critical point for the dependability of the entire system. As a means of enhancing software diversity, specification language diversity is suggested in this study. We set up a simple hypothesis for the impact of specification language on common errors, and an experiment based on an NPP protection system application was performed. The experimental results showed that this hypothesis could be justified and that specification language diversity is effective in overcoming the software common-mode failure problem.

  4. Quality control in diagnostic radiology: software (Visual Basic 6) and database applications

    International Nuclear Information System (INIS)

    Md Saion Salikin; Muhammad Farid Abdul Khalid

    2002-01-01

    A quality assurance programme in diagnostic radiology is being implemented by the Ministry of Health (MoH) in Malaysia. Under this programme, the performance of an x-ray machine used for diagnostic purposes is tested using the approved procedure, commonly known as quality control in diagnostic radiology. The quality control or performance tests are carried out by a class H licence holder licensed under the Atomic Energy Licensing Act 1984. A few computer applications (software) available in the market can be used for this purpose. A computer application (software) using Visual Basic 6 and Microsoft Access is being developed to expedite data handling, analysis, and storage, as well as report writing, for the quality control tests. In this paper, important features of the software for quality control tests are explained in brief. A simple database, linked to the software, is being established for this purpose. Problems encountered in the preparation of the database are discussed in this paper. A few examples of practical usage of the software and database applications are presented in brief. (Author)

  5. The software safety analysis based on SFTA for reactor power regulating system in nuclear power plant

    International Nuclear Information System (INIS)

    Liu Zhaohui; Yang Xiaohua; Liao Longtao; Wu Zhiqiang

    2015-01-01

    The digitalized instrumentation and control (I and C) systems of nuclear power plants can provide many advantages. However, digital control systems introduce new failure modes that differ from those of analog control systems. While the cost effectiveness and flexibility of software are widely recognized, it is very difficult to achieve and prove high levels of dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. Software safety analysis (SSA) is one way to improve software safety by identifying the system hazards caused by software failures. This paper describes the application of software fault tree analysis (SFTA) in the software design phase. First, we evaluated all the software modules of the reactor power regulating system in a nuclear power plant and identified various hazards. The SFTA was then applied to some critical modules selected in the previous step. Finally, we obtained some new hazards, not identified in the prior document evaluation processes, which were helpful for our design. (author)

  6. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as the “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal-hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
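
    The platform's FM step uses weight functions; as a simplified illustration of the final comparison only, the textbook closed form K_I = Y·σ·√(πa) for a shallow surface crack can be checked against an assumed K_Ic (all numbers below are hypothetical):

    ```python
    import math

    def stress_intensity(sigma_mpa, crack_depth_m, geometry_factor=1.12):
        """Mode-I stress intensity factor K_I = Y * sigma * sqrt(pi * a);
        Y ~ 1.12 is a textbook value for a shallow edge crack."""
        return geometry_factor * sigma_mpa * math.sqrt(math.pi * crack_depth_m)

    # Hypothetical PTS-like state: 300 MPa wall stress, 5 mm crack depth.
    k_i = stress_intensity(300.0, 0.005)   # MPa*sqrt(m)
    k_ic = 60.0                            # assumed toughness at wall temperature
    print(f"K_I = {k_i:.1f} MPa*sqrt(m) -> {'OK' if k_i < k_ic else 'critical'}")
    ```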

  7. Surfing the Edge of Chaos: Applications to Software Engineering

    National Research Council Canada - National Science Library

    Nogueira, Juan C; Jones, Carl

    2000-01-01

    .... We discuss the feasibility of using this theory in software engineering as an alternative to bureaucratic software development processes. We also present some recommendations that could help organizations acquire competitive advantage in software development and hence achieve information superiority.

  8. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems for the control of nuclear installations demands a high degree of operational safety, both in the operation of the installation and in the protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) will have at its disposal tools that permit checks to be made throughout the life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It is used on VAX computers and can easily be ported to other computers.

  9. Software requirements definition Shipping Cask Analysis System (SCANS)

    International Nuclear Information System (INIS)

    Johnson, G.L.; Serbin, R.

    1985-01-01

    The US Nuclear Regulatory Commission (NRC) staff reviews the technical adequacy of applications for certification of designs of shipping casks for spent nuclear fuel. In order to confirm an acceptable design, the NRC staff may perform independent calculations. The current NRC procedure for confirming cask design analyses is laborious and tedious. Most of the work is currently done by hand or through the use of a remote computer network. The time required to certify a cask can be long. The review process may vary somewhat with the engineer doing the reviewing. Similarly, the documentation on the results of the review can also vary with the reviewer. To increase the efficiency of this certification process, LLNL was requested to design and write an integrated set of user-oriented, interactive computer programs for a personal microcomputer. The system is known as the NRC Shipping Cask Analysis System (SCANS). The computer codes and the software system supporting these codes are being developed and maintained for the NRC by LLNL. The objective of this system is generally to lessen the time and effort needed to review an application. Additionally, an objective of the system is to assure standardized methods and documentation of the confirmatory analyses used in the review of these cask designs. A software system should be designed based on NRC-defined requirements contained in a requirements document. The requirements document is a statement of a project's wants and needs as the users and implementers jointly understand them. The requirements document states the desired end products (i.e. WHAT's) of the project, not HOW the project provides them. This document describes the wants and needs for the SCANS system. 1 fig., 3 tabs

  10. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…

  11. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good-time selection, imaging and source finding, and spectrum and light-curve extraction. These levels consist of individual executables, and the running of the complete analysis…

  12. Software for 3D diagnostic image reconstruction and analysis

    International Nuclear Information System (INIS)

    Taton, G.; Rokita, E.; Sierzega, M.; Klek, S.; Kulig, J.; Urbanik, A.

    2005-01-01

    Recent advances in computer technologies have opened new frontiers in medical diagnostics. Interesting possibilities are the use of three-dimensional (3D) imaging and the combination of images from different modalities. Software prepared in our laboratories devoted to 3D image reconstruction and analysis from computed tomography and ultrasonography is presented. In developing our software it was assumed that it should be applicable in standard medical practice, i.e. it should work effectively with a PC. An additional feature is the possibility of combining 3D images from different modalities. The reconstruction and data processing can be conducted using a standard PC, so low investment costs result in the introduction of advanced and useful diagnostic possibilities. The program was tested on a PC using DICOM data from computed tomography and TIFF files obtained from a 3D ultrasound system. The results of the anthropomorphic phantom and patient data were taken into consideration. A new approach was used to achieve spatial correlation of two independently obtained 3D images. The method relies on the use of four pairs of markers within the regions under consideration. The user selects the markers manually and the computer calculates the transformations necessary for coupling the images. The main software feature is the possibility of 3D image reconstruction from a series of two-dimensional (2D) images. The reconstructed 3D image can be: (1) viewed with the most popular methods of 3D image viewing, (2) filtered and processed to improve image quality, (3) analyzed quantitatively (geometrical measurements), and (4) coupled with another, independently acquired 3D image. The reconstructed and processed 3D image can be stored at every stage of image processing. The overall software performance was good considering the relatively low costs of the hardware used and the huge data sets processed. The program can be freely used and tested (source code and program available at
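
    The four-marker coupling step corresponds to a classic rigid registration problem. A sketch using the Kabsch algorithm on hypothetical CT and ultrasound marker coordinates (this stands in for, and is not, the program's own solver):

    ```python
    import numpy as np

    def rigid_transform(src, dst):
        """Best-fit rotation R and translation t mapping marker set src onto
        dst (Kabsch algorithm), a standard way to couple two 3D images from
        user-selected marker pairs."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        cs, cd = src.mean(axis=0), dst.mean(axis=0)
        H = (src - cs).T @ (dst - cd)           # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cd - R @ cs
        return R, t

    # Four marker pairs picked in the CT and ultrasound volumes (made up):
    ct = [(0, 0, 0), (50, 0, 0), (0, 50, 0), (0, 0, 50)]
    us = [(10, 5, 2), (60, 5, 2), (10, 55, 2), (10, 5, 52)]
    R, t = rigid_transform(ct, us)
    print(R @ np.array([25, 25, 25]) + t)   # CT point expressed in US frame
    ```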

  13. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs, mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  14. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    OpenAIRE

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in software engineering. To reach these goals HELIOS, a distributed software engineering environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is made up of several collaborating, well-encapsulated software components. This paper presents the architecture retained to allow communication between the different components and focuses on the implementation details of the Software ...

  15. Quad channel software defined receiver for passive radar application

    Directory of Open Access Journals (Sweden)

    Pető Tamás

    2017-03-01

    In recent times, the growing utilization of the electromagnetic environment has brought passive radar research more and more to the fore. Exploiting the wide range of illuminators of opportunity requires the application of wideband radio receivers. At the same time, a multichannel receiver structure is of critical importance for target direction finding and interference suppression. This paper presents the development of a multichannel software-defined receiver specifically for passive radar applications. One of the relevant features of the developed receiver platform is its up-to-date SoC (System on Chip) based structure, which greatly enhances the integration and signal processing capacity of the system, all while keeping costs low. The software-defined operation of the discussed receiver system is demonstrated using a DVB-T (Digital Video Broadcast – Terrestrial) signal as illuminator of opportunity. During this demonstration, the multichannel capabilities of the realized system are also tested with real data using direction finding and beamforming algorithms.
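
    As an illustration of the direction-finding side (not the receiver firmware itself), a narrowband delay-and-sum beamformer scanned over angle on simulated four-channel data:

    ```python
    import numpy as np

    C = 3e8  # propagation speed, m/s

    def delay_and_sum(snapshots, positions_m, freq_hz, angle_deg):
        """Narrowband delay-and-sum beamformer output power for a uniform
        linear array; scanning angle_deg gives a direction-finding spectrum."""
        theta = np.deg2rad(angle_deg)
        tau = positions_m * np.sin(theta) / C          # relative delays
        a = np.exp(2j * np.pi * freq_hz * tau)         # steering vector
        y = (a.conj() @ snapshots) / len(positions_m)  # coherent sum
        return np.mean(np.abs(y) ** 2)

    # Four-channel array, half-wavelength spacing at 600 MHz (DVB-T band):
    f0 = 600e6
    pos = np.arange(4) * (C / f0 / 2)
    rng = np.random.default_rng(0)
    # Simulated plane wave arriving from +20 degrees, plus receiver noise:
    steer = np.exp(2j * np.pi * f0 * pos * np.sin(np.deg2rad(20)) / C)
    x = np.outer(steer, rng.standard_normal(256)) + 0.1 * (
        rng.standard_normal((4, 256)) + 1j * rng.standard_normal((4, 256)))
    powers = [delay_and_sum(x, pos, f0, a) for a in range(-90, 91)]
    print(np.argmax(powers) - 90)   # should peak near +20 degrees
    ```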

  16. Standards Interoperability: Application of Contemporary Software Safety Assurance Standards to the Evolution of Legacy Software

    National Research Council Canada - National Science Library

    Meacham, Desmond J

    2006-01-01

    .... The proposed formal model is then applied to the requirements for RTCA DO-178B and MIL-STD-498 as representative examples of contemporary and legacy software standards. The results provide guidance on how to achieve airworthiness certification for modified legacy software, whilst maximizing the use of software products from the previous development.

  17. Development of Emittance Analysis Software for Ion Beam Characterization

    International Nuclear Information System (INIS)

    Padilla, M.J.; Liu, Yuan

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high-quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction and to evaluate the root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS values. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.

  18. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.
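
    Both records above describe the same computation: the RMS emittance and Twiss parameters follow from the second moments of the measured (x, x') distribution, via eps_rms = sqrt(<x^2><x'^2> - <xx'>^2). A minimal sketch on synthetic, correlated beam data:

    ```python
    import numpy as np

    def rms_emittance(x_mm, xp_mrad):
        """Statistical (RMS) transverse emittance and Twiss parameters from
        sampled position x and divergence x' data."""
        x = x_mm - np.mean(x_mm)
        xp = xp_mrad - np.mean(xp_mrad)
        s_xx, s_pp, s_xp = np.mean(x * x), np.mean(xp * xp), np.mean(x * xp)
        eps = np.sqrt(s_xx * s_pp - s_xp**2)   # RMS emittance, mm*mrad
        beta, gamma, alpha = s_xx / eps, s_pp / eps, -s_xp / eps  # Twiss
        return eps, alpha, beta, gamma

    rng = np.random.default_rng(1)
    x = rng.normal(0.0, 1.5, 10_000)              # mm
    xp = 0.3 * x + rng.normal(0.0, 0.5, 10_000)   # mrad, correlated with x
    print(rms_emittance(x, xp))
    ```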

  19. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). The software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase-contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase-contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after the phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase-contrast images, and it enables subjective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.

  20. An investigation of modelling and design for software service applications

    Science.gov (United States)

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the ‘design model’. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model. PMID:28489905

  1. An investigation of modelling and design for software service applications.

    Science.gov (United States)

    Anjum, Maria; Budgen, David

    2017-01-01

    Software services offer the opportunity to use a component-based approach for the design of applications. However, this needs a deeper understanding of how to develop service-based applications in a systematic manner, and of the set of properties that need to be included in the 'design model'. We have used a realistic application to explore systematically how service-based designs can be created and described. We first identified the key properties of an SOA (service oriented architecture) and then undertook a single-case case study to explore its use in the development of a design for a large-scale application in energy engineering, modelling this with existing notations wherever possible. We evaluated the resulting design model using two walkthroughs with both domain and application experts. We were able to successfully develop a design model around the ten properties identified, and to describe it by adapting existing design notations. A component-based approach to designing such systems does appear to be feasible. However, it needs the assistance of a more integrated set of notations for describing the resulting design model.

  2. Recent Survey and Application of the simSUNDT Software

    Science.gov (United States)

    Persson, G.; Wirdelius, H.

    2010-02-01

    The simSUNDT software is based on a previously developed program (SUNDT). The latest version has been customized to generate realistic synthetic data (including a grain noise model) compatible with a number of off-line analysis software packages. The software consists of a Windows®-based preprocessor and postprocessor together with a mathematical kernel (UTDefect) dealing with the actual mathematical modeling. The model employs various integral transforms and integral equations and enables simulation of the entire ultrasonic testing situation. The model is completely three-dimensional, though the simulated component is two-dimensional, bounded by the scanning surface and, as an option, a planar back surface. It is of great importance that the inspection methods applied are properly validated and that their capability to detect cracks and defects is quantified. To achieve this, statistical methods such as Probability of Detection (POD) are often applied, with the ambition of estimating detectability as a function of defect size. Besides being very expensive, the conventional procedure based on test pieces also tends to introduce a number of possible misalignments between the actual NDT situation to be performed and the experimental simulation of it. The presentation describes the developed model, which enables simulation of a phased-array NDT inspection, and the ambition to use this simulation software to generate POD information. The paper also includes the most recent developments of the model, including some initial experimental validation of the phased-array probe model.
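
    POD information of the kind mentioned is conventionally estimated by fitting a logistic curve to hit/miss outcomes versus defect size (in the spirit of MIL-HDBK-1823); a self-contained sketch on synthetic inspection data:

    ```python
    import numpy as np

    def fit_pod(sizes_mm, hits, iters=5000, lr=0.05):
        """Hit/miss POD curve: logistic regression on log defect size,
        fitted here with plain gradient ascent on the log-likelihood."""
        x = np.log(np.asarray(sizes_mm, float))
        y = np.asarray(hits, float)
        b0 = b1 = 0.0
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
            b0 += lr * np.mean(y - p)
            b1 += lr * np.mean((y - p) * x)
        return b0, b1

    def pod(size_mm, b0, b1):
        return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(size_mm))))

    # Synthetic inspection outcomes (1 = defect detected):
    sizes = [0.5, 0.8, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0]
    hits  = [0,   0,   1,   0,   1,   1,   1,   1]
    b0, b1 = fit_pod(sizes, hits)
    print(f"POD at 2 mm ~ {pod(2.0, b0, b1):.2f}")
    ```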

  3. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences of numerous natural, man-made, and technological threats. The software tool is intended for use by various decision makers and analysts to obtain such estimates quickly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. The tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and toward developing interest in further research into complex but rapid-turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced-form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of the economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
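
    The reduced-form step can be pictured as follows: generate synthetic "CGE runs" from an assumed loss function, then estimate a single regression on them. All variables and coefficients below are invented stand-ins, not E-CAT's actual equations:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200  # number of synthetic CGE simulations for one threat type

    # Invented explanatory variables (threat characteristics, resilience):
    duration   = rng.uniform(1, 52, n)       # weeks
    severity   = rng.uniform(0.1, 1.0, n)    # fraction of capacity lost
    resilience = rng.uniform(0.0, 0.9, n)    # mitigation of direct losses
    gdp_loss = 2.0 * duration * severity * (1 - resilience) \
        + rng.normal(0, 1, n)                # stand-in for CGE output

    # Reduced-form model: one linear regression estimated on simulated data.
    X = np.column_stack([np.ones(n), duration, severity, resilience,
                         duration * severity])
    coef, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)
    print(np.round(coef, 2))   # rapid-turnaround approximation of the CGE
    ```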

  4. Application of the Life Cycle Analysis and the Building Information Modelling Software in the Architectural Climate Change-Oriented Design Process

    Science.gov (United States)

    Gradziński, Piotr

    2017-10-01

    While the world's climate is changing (inter alia under the influence of building activity), the author attempts to reorient design practice, primarily in the direction of using, and adapting to, climatic conditions. Architectural design that employs, in the early stages of the design process, Life Cycle Analysis (LCA) and digital analytical BIM (Building Information Modelling) tools defines the overriding requirements that the designer/architect should meet. In the first part, the text characterizes the influences of architectural activity (consumption, pollution, waste, etc.) and of the use of building materials (embodied energy, embodied carbon, Global Warming Potential, etc.), understood as direct negative environmental impacts. In the second part, the paper reviews the methods and analytical techniques that counter these negative influences. First, the study of the building by means of Life Cycle Analysis covers both the structure (e.g., materials) and the functioning (e.g., energy consumption) of the architectural object (stages: before use, use, after use). Second, digital analytical tools are used to determine the benefits of running multi-faceted simulations of environmental factors (exposure to light, shade, wind) that directly affect the shaping of the building's form. In conclusion, the author's results highlight that designing buildings with the above-mentioned elements (LCA, BIM) allows early design decisions in the architectural design process to be corrected, minimizing the impact on nature and the environment. The work refers directly to the architectural-environmental dimensions, orienting the design process of buildings with respect to broadly understood climatic change.

  5. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety-critical software component will not lead to a fault in a more safety-critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences, based on the QuantUM approach and tool that we have previously developed....

  6. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, with the aims of identifying the state of the art in this field and giving the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Modes and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  7. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety-critical software, with the aims of identifying the state of the art in this field and giving the software safety engineer a trail map between the codes-and-standards layer and the design-methodology-and-documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Modes and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated but the insights into the risk importance of software features. To defend against potential common-mode failures (CMFs), high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D and D) analysis guidelines.

  8. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    Science.gov (United States)

    Wallace, Robert

    1986-01-01

    A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for Space Station.

  9. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to present informative results. This research develops user interface software for the output of VSOP and MCNPX: the VSOP program output is used to support neutronic analysis, and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for the revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. The burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). Values are visualized in graphical form to support the analysis.

  10. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge and complex text files. In the analysis process, users need additional processing, such as Microsoft Excel, to present informative results. This research develops user interface software for the output of VSOP and MCNPX: the VSOP program output is used to support neutronic analysis, and the MCNPX program output is used to support burn-up analysis. Software development used iterative development methods, which allow for the revision and addition of features according to user needs. Processing time with this software is 500 times faster than with conventional methods using Microsoft Excel. Python is used as the programming language because it is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac, and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up, and the masses of Pu-239 and Pu-241. The burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium, and uranium). Values are visualized in graphical form to support the analysis.
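
    The records do not show the VSOP or MCNPX output formats. Purely as a hypothetical sketch of this kind of Python post-processing, k-eff values could be pulled from a text output with a regular expression (the line format below is invented; real outputs differ):

    ```python
    import re

    # Hypothetical pattern; adapt to the actual output file layout.
    KEFF_RE = re.compile(r"k-eff\s*=\s*([0-9.]+)")

    def extract_keff(path):
        """Collect all k-eff values found in a large text output file."""
        values = []
        with open(path) as fh:
            for line in fh:
                m = KEFF_RE.search(line)
                if m:
                    values.append(float(m.group(1)))
        return values

    # values = extract_keff("vsop_output.txt")
    # import matplotlib.pyplot as plt; plt.plot(values); plt.show()
    ```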

  11. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    OpenAIRE

    Schmidt, Ralph; Bostelmann, Jonas; Cornet, Yves; Heipke, Christian; Philippe, Christian; Poncelet, Nadia; de Rosa, Diego; Vandeloise, Yannick

    2012-01-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk assoc...

  12. Laboratory and software applications for clinical trials: the global laboratory environment.

    Science.gov (United States)

    Briscoe, Chad

    2011-11-01

    The Applied Pharmaceutical Software Meeting is held annually. It is sponsored by The Boston Society, a not-for-profit organization that coordinates a series of meetings within the global pharmaceutical industry. The meeting generally focuses on laboratory applications, but in recent years has expanded to include some software applications for clinical trials. The 2011 meeting emphasized the global laboratory environment. Global clinical trials generate massive amounts of data in many locations that must be centralized and processed for efficient analysis. Thus, the meeting had a strong focus on establishing networks and systems for dealing with the computer infrastructure to support such environments. In addition to the globally installed laboratory information management system, electronic laboratory notebook and other traditional laboratory applications, cloud computing is quickly becoming the answer to provide efficient, inexpensive options for managing the large volumes of data and computing power, and thus it served as a central theme for the meeting.

  13. How to do Meta-Analysis using HLM software

    OpenAIRE

    Petscher, Yaacov

    2013-01-01

    This is a step-by-step presentation of how to run a meta-analysis using the HLM software. Because it is a variance-known model, it is not run through the GUI but in batch mode. These slides show how to prepare the data and run the analysis.

  14. A relational approach to support software architecture analysis

    NARCIS (Netherlands)

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  15. Radio-science performance analysis software

    Science.gov (United States)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
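
    The Allan deviation that STBLTY reports is defined by sigma_y^2(tau) = (1/2)<(ybar_{k+1} - ybar_k)^2> for fractional-frequency averages ybar over intervals of length tau. A minimal non-overlapping estimator on simulated data (not the STBLTY implementation):

    ```python
    import numpy as np

    def allan_deviation(freq_fractional, m):
        """Non-overlapping Allan deviation at averaging time m*tau0, from
        fractional-frequency samples taken every tau0 seconds."""
        y = np.asarray(freq_fractional, float)
        n = len(y) // m
        ybar = y[: n * m].reshape(n, m).mean(axis=1)  # averages over tau
        avar = 0.5 * np.mean(np.diff(ybar) ** 2)      # Allan variance
        return np.sqrt(avar)

    rng = np.random.default_rng(3)
    y = 1e-13 * rng.standard_normal(100_000)  # white-FM-like USO noise
    for m in (1, 10, 100, 1000):
        print(m, allan_deviation(y, m))       # falls roughly as 1/sqrt(tau)
    ```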

  16. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.
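
    For orientation, the single-factor measurement model underlying all three CFA variants can be sketched as (generic notation, assumed rather than quoted from the paper):

        x_i = \tau_i + \lambda_i \xi + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \theta_i)

    where x_i is the i-th item, \xi the latent construct, and \lambda_i its loading. The three approaches differ only in estimation: maximum likelihood for traditional CFA, and posterior estimation under a flat or a content-expert-informed prior on the parameters for the two Bayesian variants.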

  17. FORECAST: Regulatory effects cost analysis software annual

    International Nuclear Information System (INIS)

    Lopez, B.; Sciacca, F.W.

    1991-11-01

    Over the past several years the NRC has developed a generic cost methodology for the quantification of cost/economic impacts associated with a wide range of new or revised regulatory requirements. This methodology has been developed to aid the NRC in preparing Regulatory Impact Analyses (RIAs). These generic costing methods can be useful in quantifying impacts both to industry and to the NRC. The FORECAST program was developed to facilitate the use of the generic costing methodology. This PC program integrates the major cost considerations that may be required because of a regulatory change. FORECAST automates much of the calculation typically needed in an RIA and thus reduces the time and labor required to perform these analyses. More importantly, its integrated and consistent treatment of the different cost elements should help assure comprehensiveness, uniformity, and accuracy in the preparation of needed cost estimates

  18. Equipment Obsolescence Analysis and Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Redmond, J.; Carret, L.; Shaon, S.; Schultz, C.

    2015-07-01

    The procurement engineering resources at Nuclear Power Plants (NPPs) are experiencing increasing backlog for procurement items, primarily due to the inability to order the original replacement parts. The level of effort and time required to prepare procurement packages is increasing since the number of obsolete parts is increasing exponentially. Procurement packages for obsolete components and parts are much more complex and take more time to prepare because of the need to perform equivalency evaluations, testing requirements and test acceptance criteria development, commercial grade dedication or equipment qualification, and increasing efforts to verify that no fraudulent or counterfeit parts are procured. This problem will be further compounded when NPPs pursue license renewal and approval for plant-life extension. Advanced planning and advanced knowledge of equipment obsolescence are required to allow sufficient time to properly procure replacement parts for obsolete items. The uncertain supply chain capability due to obsolescence is a real problem and can pose a risk to reliable plant operations due to the potential for a lack of available spare parts and replacement components to support outages and unplanned component failures. Advanced notification of obsolescence is increasingly important to ensure that adequate time and planning are scheduled to procure the proper replacement parts. A thorough analysis of Original Equipment Manufacturer (OEM) availability and inventory, as well as an analysis of failure rates and usage rates, is required to predict critical part needs and allow early identification of obsolescence issues so that a planned and controlled strategy to qualify replacement equipment can be implemented. (Author)

  19. Safety Characteristics in System Application Software for Human Rated Exploration

    Science.gov (United States)

    Mango, E. J.

    2016-01-01

    NASA and its industry and international partners are embarking on a bold and inspiring development effort to design and build an exploration class space system. The space system is made up of the Orion system, the Space Launch System (SLS) and the Ground Systems Development and Operations (GSDO) system. All are highly coupled together and dependent on each other for the combined safety of the space system. A key area of system safety focus needs to be in the ground and flight application software system (GFAS). In the development, certification and operations of GFAS, there are a series of safety characteristics that define the approach to ensure mission success. This paper will explore and examine the safety characteristics of the GFAS development.

  20. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program, entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help by OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil

  1. Software Delivery Risk Management: Application of Bayesian Networks in Agile Software Development

    Directory of Open Access Journals (Sweden)

    Ancveire Ieva

    2015-12-01

    The information technology industry cannot be imagined without large- or small-scale projects. They are implemented to develop systems enabling key business processes and improving performance and enterprise resource management. However, projects often experience various difficulties during their execution. These problems are usually related to the three objectives of the project – costs, quality and deadline. One way to address these challenges is project risk management; however, the main problems and their influencing factors cannot always be easily identified, and usually a more profound analysis of the problem situation is needed. In this paper, we propose the use of the Bayesian Network concept for quantitative risk management in agile projects. The Bayesian Network is explored using a case study focusing on a project that faces difficulties during the software delivery process. We explain why an agile risk analysis is needed and assess the potential risk factors which may occur during the project. Thereafter, we design the Bayesian Network to capture the actual problem situation and make suggestions on how to improve the delivery process, based on the measures to be taken to reduce the occurrence of project risks.
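
    As a sketch of the modelling step, a toy Bayesian Network for delivery risk built with the pgmpy library; the risk factors, structure, and probabilities below are hypothetical, not taken from the case study:

      # Toy delivery-risk network: two hypothetical risk factors feeding one outcome.
      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      model = BayesianNetwork([("UnclearRequirements", "DeliveryDelay"),
                               ("TeamTurnover", "DeliveryDelay")])

      cpd_req = TabularCPD("UnclearRequirements", 2, [[0.7], [0.3]])  # P(absent), P(present)
      cpd_turn = TabularCPD("TeamTurnover", 2, [[0.8], [0.2]])
      cpd_delay = TabularCPD(                       # P(DeliveryDelay | parents)
          "DeliveryDelay", 2,
          [[0.95, 0.60, 0.70, 0.20],                # no delay
           [0.05, 0.40, 0.30, 0.80]],               # delay
          evidence=["UnclearRequirements", "TeamTurnover"],
          evidence_card=[2, 2])

      model.add_cpds(cpd_req, cpd_turn, cpd_delay)
      assert model.check_model()

      # Updated probability of a delay once requirements turn out to be unclear.
      print(VariableElimination(model).query(["DeliveryDelay"],
                                             evidence={"UnclearRequirements": 1}))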

  2. Software application for a total management of a radioactive facility

    International Nuclear Information System (INIS)

    Mirpuri, E.; Escudero, R.; Macias, M.T.; Perez, J.; Sanchez, A.; Usera, F.

    2008-01-01

    The use of radiological material and/or equipment that generates ionizing radiation is widespread in biological research. In every laboratory there is a large variety of methods, operations, techniques, equipment, radioisotopes and users involved in work with ionizing radiation. In order to control the radioactive material, the users and the whole facility, the manager of the radioactive facility must create a large number of documents, databases and other records. This information is characterized by constant and persistent manipulation and includes items of great importance, such as the general management of radioactive material and waste, surveillance of exposed workers, access to controlled areas, laboratory and equipment reservations, radiological inspections, etc. These activities are often complicated by the fact that the main manager of the radioactive facility is also in charge of biosafety and occupational risk prevention issues, so the documents to generate and manipulate and the procedures to develop are multiplied. A procedure to access and manage all these files is highly recommended in order to optimize the general management of the facility, avoiding loss of information, automating activities and making the data necessary for control easily accessible. In this work we present a software application for the total management of the facility. This software has been developed through the collaboration of six of the most important research centers in Spain, in coordination with the company 'Appize soluciones'. It is a flexible and versatile application that adapts to the specific needs of each research center, providing the appropriate reports and checklists that speed up general management and ease the writing of official documents, including the Operations Book. (author)

  3. Development of interactive software for fuel management analysis

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency

  4. Procedure for Application of Software Reliability Growth Models to NPP PSA

    International Nuclear Information System (INIS)

    Son, Han Seong; Kang, Hyun Gook; Chang, Seung Cheol

    2009-01-01

    As the use of software increases at nuclear power plants (NPPs), the necessity for including software reliability and/or safety into the NPP Probabilistic Safety Assessment (PSA) rises. This work proposes an application procedure of software reliability growth models (RGMs), which are most widely used to quantify software reliability, to NPP PSA. Through the proposed procedure, it can be determined if a software reliability growth model can be applied to the NPP PSA before its real application. The procedure proposed in this work is expected to be very helpful for incorporating software into NPP PSA
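
    As an illustration of the kind of model meant here, a minimal sketch fitting the classic Goel-Okumoto reliability growth model m(t) = a(1 - exp(-bt)) to cumulative failure counts; the data and starting values are invented:

      # Fit the Goel-Okumoto NHPP model to cumulative software-failure data.
      import numpy as np
      from scipy.optimize import curve_fit

      def goel_okumoto(t, a, b):
          """Expected cumulative failures by time t: a total faults, b detection rate."""
          return a * (1.0 - np.exp(-b * t))

      t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)          # test weeks
      failures = np.array([4, 9, 12, 15, 16, 18, 18, 19], float)   # cumulative counts

      (a, b), _ = curve_fit(goel_okumoto, t, failures, p0=(20.0, 0.3))
      print(f"expected total faults a = {a:.1f}, detection rate b = {b:.3f}")
      # Expected faults still latent after week 8 -- one input a PSA could use.
      print(f"expected remaining faults: {a - goel_okumoto(8.0, a, b):.2f}")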

  5. Software EpiData - Applications for Needs of Biotechnological Processes

    Directory of Open Access Journals (Sweden)

    Ljakova K.

    2007-12-01

    EpiData (free software for entering and documenting data) is presented. Some aspects of this software are shown for the needs of database systems (DB) and information systems (IS) that can be used in bioprocess systems.

  6. Application of database management software to probabilistic risk assessment calculations

    International Nuclear Information System (INIS)

    Wyss, G.D.

    1993-01-01

    Probabilistic risk assessment (PRA) calculations require the management and processing of large amounts of information. For example, a commercial nuclear power plant PRA study makes use of plant blueprints and system schematics, formal plant safety analysis reports, incident reports, letters, memos, handwritten notes from plant visits, and even the analyst's 'engineering judgment'. This information must be documented and cross-referenced in order to properly execute and substantiate the models used in a PRA study. The data themselves normally fall into two general categories. The first category is composed of raw data accumulated from equipment testing and operational experience. These data describe the equipment, its service or testing conditions, its failure mode, and its performance history. The second category is composed of statistical distributions. These distributions can represent probabilities, frequencies, or values of important parameters that are not time-related. Probability and frequency distributions are often obtained by fitting raw data to an appropriate statistical distribution. Database management software is used to store both types of data so that they can be readily queried, manipulated, and archived. This paper provides an overview of the information models used for storing PRA data and illustrates the implementation of these models using examples from current PRA software packages
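
    As a sketch of the fitting step for the second category, estimating a lognormal distribution from raw failure times might look like this; the distribution choice and the data are illustrative only:

      # Fit component failure times to a lognormal and derive a PRA uncertainty interval.
      import numpy as np
      from scipy import stats

      failure_hours = np.array([1200., 3400., 5100., 800., 2300., 6100., 950.])
      shape, loc, scale = stats.lognorm.fit(failure_hours, floc=0)  # scale = median
      print(f"median time to failure: {scale:.0f} h")
      print("5th/95th percentiles:",
            stats.lognorm.ppf([0.05, 0.95], shape, loc=loc, scale=scale))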

  7. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software is under development in LabVIEW for use with StellarNet spectrometer systems that communicate with a computer over USB. LabVIEW has capabilities in hardware interfacing, graphical user interfaces and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time and then processes them for analysis. Several measurements and analyses of solar radiation have been carried out. Solar radiation consists mainly of infrared, visible light and ultraviolet. From solar radiation spectrum data, information on the weather and on plant suitability can be gathered and analyzed. Furthermore, optimization of the utilization of solar radiation, and the related safety precautions, can be planned. Using this software, more research and development in the utilization and safety of solar radiation can be explored. (author)

  8. A pioneering application of NQA-1 quality assurance standards in the development of software

    International Nuclear Information System (INIS)

    Weisbin, A.N.

    1988-01-01

    One reason for systematically applying quality assurance to computer software is the extensive use of results from computer programs to characterize potential sites for nuclear waste repositories, leading ultimately to important policy-making decisions. Because data from these programs characterize the likely radioactivity profile for many hundreds of years, experimental validation is not feasible. The Sensitivity and Uncertainty Analysis Methods Development Project (SUAMDP) was developed to formulate and utilize efficient and comprehensive methods for determining sensitivities of calculated results with respect to changes in all input parameters. The computerized methodology was embodied in the Gradient Enhanced Software System (GRESS). Because GRESS was to be used in site characterization for waste storage, stringent NQA-1 requirements were imposed by the sponsor. A working relationship between the Oak Ridge National Laboratory (ORNL) Quality Department and the research scientists developing GRESS was essential in achieving understanding and acceptance of the quality assurance requirements as applied to the SUAMDP. The relationship resulted in the SUAMDP becoming the first software project at ORNL to develop a comprehensive NQA-1 Quality Assurance Plan; this plan now serves as a model for software quality assurance at ORNL. This paper describes the evolution of this plan and its impact on the application of quality assurance procedures to software. 2 refs

  9. Innovation Goals in Software Development for Business Applications

    Directory of Open Access Journals (Sweden)

    Robert Arnold

    2014-11-01

    Having been in touch with the technical side of many companies in various sectors, we know that industry is facing a period of unprecedented change. We have compiled a list of common technological challenges that companies in the sector face in adapting to this change. The purpose of this contribution is to discuss and communicate the areas where technological challenges in the software development sector lie. We see this kind of inventory as beneficial to the academic community, as it provides an account of industry challenges. The analysis is necessary to assist players in the sector in being better prepared to formalize and document their learning and know-how development. Besides the knowledge management benefits, the analysis also helps in taking advantage of research and development funding and in attracting investors; both academic and industrial organizations can take advantage of this aspect. We call this type of analysis “capabilities analysis”. A single company might not solve the world’s technical problems, but just being aware of them and measuring the steps taken to advance, even slightly, in the direction of solving the problems presented in this analysis would make the company stand out. When data are formulated this way, strong evidence is created that a project goes beyond standard engineering, by distinguishing risk that can be eliminated through experiment from standard engineering risk.

  10. NASGRO®: Fracture Mechanics and Fatigue Crack Growth Analysis Software

    Science.gov (United States)

    Forman, Royce; Shivakumar, V.; Mettu, Sambi; Beek, Joachim; Williams, Leonard; Yeh, Feng; McClung, Craig; Cardinal, Joe

    2004-01-01

    This viewgraph presentation describes NASGRO, which is a fracture mechanics and fatigue crack growth analysis software package that is used to reduce risk of fracture in Space Shuttles. The contents include: 1) Consequences of Fracture; 2) NASA Fracture Control Requirements; 3) NASGRO Reduces Risk; 4) NASGRO Use Inside NASA; 5) NASGRO Components: Crack Growth Module; 6) NASGRO Components:Material Property Module; 7) Typical NASGRO analysis: Crack growth or component life calculation; and 8) NASGRO Sample Application: Orbiter feedline flowliner crack analysis.

  11. Building Models of Installations to Recommend Applications in IoT Software Ecosystems

    DEFF Research Database (Denmark)

    Tomlein, Matus; Grønbæk, Kaj

    2016-01-01

    Internet of Things devices are driven by applications. To stimulate innovation on their platforms, manufacturers of embedded products are creating software ecosystems that enable third parties to develop applications for their devices. Since such applications are developed externally, their seamless...

  12. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer-aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring...... and analysis system. Software to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods & tools) and a model library (consisting of the process operational models), have been extended...... rigorously and integrated with the user interface, which made the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example....

  13. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars, cell phones, and even more critical activities like aeronautics and the health sciences. In this context software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area in software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived for developing tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed to prioritise the possibility of adding new specification languages and analysis tools and to enable a synergic relation of the techniques under a graphical interface satisfying several well-known usability-enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  14. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial...

  15. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  16. WinDAM C earthen embankment internal erosion analysis software

    Science.gov (United States)

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing earthen embankment dams and proposed earthen embankment dams, Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting f...

  17. ANALYSIS OF CONTEMPORARY SOFTWARE BEING USED FOR FORWARDING SERVICES

    Directory of Open Access Journals (Sweden)

    Naumov, V.

    2013-01-01

    The role of information technologies in forwarding services is specified. The typical structure of the logistics sites that provide searching of freight owners' and carriers' requests is described. An analysis of the software used by transportation companies was conducted. Promising directions for improving the forwarding services process are identified.

  18. Evaluation of a Game to Teach Requirements Collection and Analysis in Software Engineering at Tertiary Education Level

    Science.gov (United States)

    Hainey, Thomas; Connolly, Thomas M.; Stansfield, Mark; Boyle, Elizabeth A.

    2011-01-01

    A highly important part of software engineering education is requirements collection and analysis which is one of the initial stages of the Database Application Lifecycle and arguably the most important stage of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall…

  19. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  1. HDX Workbench: Software for the Analysis of H/D Exchange MS Data

    Science.gov (United States)

    Pascal, Bruce D.; Willis, Scooter; Lauer, Janelle L.; Landgraf, Rachelle R.; West, Graham M.; Marciano, David; Novick, Scott; Goswami, Devrishi; Chalmers, Michael J.; Griffin, Patrick R.

    2012-09-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an established method for the interrogation of protein conformation and dynamics. While the data analysis challenge of HDX-MS has been addressed by a number of software packages, new computational tools are needed to keep pace with the improved methods and throughput of this technique. To address these needs, we report an integrated desktop program titled HDX Workbench, which facilitates automation, management, visualization, and statistical cross-comparison of large HDX data sets. Using the software, validated data analysis can be achieved at the rate of data generation. The application is available at the project home page http://hdx.florida.scripps.edu.

  2. Frame of reference of software architecture for web and mobile applications

    Directory of Open Access Journals (Sweden)

    Carlos Alberto Maliza Martinez

    2016-08-01

    Given the need for a guide to implementing informatics applications, and thus to automating tasks and improving user response times, we designed a frame of reference for the software architecture of web and mobile applications built with free and open-source technology. The technology used is object-oriented programming (OOP) with the JAVA programming language, a client/server architecture and a multi-tier architectural style, which allow us to create scalable, robust and stable systems, together with the Java Platform Enterprise Edition (JEE), which helps us implement business applications thanks to the JPA and EJB APIs. On the server side, for handling transactions, security, scalability and concurrency, we have the Wildfly application server. On the client side, for creating graphical interfaces, we use the ExtJS and Sencha Touch frameworks, which are lightweight, high-performance libraries based on HTML5, JavaScript and CSS3. The report generator is JasperReports, because of its ability to deliver rich content to display and printer. The database engine is MySQL, because its connectivity, speed and security make it a very appropriate server for access from the web. Finally, as the editor for web and mobile applications, we have the open-source Eclipse IDE. In this paper we make a critical analysis of such applications and formulate the frame of reference of software architecture for the development and implementation of web and mobile applications, which was implemented at the ECU911 Babahoyo and at the Instituto Tecnologico Superior Babahoyo, proving through its application its effectiveness and efficiency in the implementation of integrated systems

  3. MathBrowser: Web-Enabled Mathematical Software with Application to the Chemistry Curriculum, v 1.0

    Science.gov (United States)

    Goldsmith, Jack G.

    1997-10-01

    MathSoft: Cambridge, MA, 1996; free via ftp from www.mathsoft.com. The movement to provide computer-based applications in chemistry has come to focus on three main areas: software aimed at specific applications (drawing, simulation, data analysis, etc.), multimedia applications designed to assist in the presentation of conceptual information, and packages to be used in conjunction with a particular textbook at a specific point in the chemistry curriculum. The result is a situation where no single software package devoted to problem solving can be used across a large segment of the curriculum. Adoption of World Wide Web (WWW) technology by a manufacturer of mathematical software, however, has produced software that provides an attractive means of providing a problem-solving resource to students in courses from freshman through senior level.

  4. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to accommodate the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we have only conducted individual evaluations of SPEAKER’s modules, the example of use indicates the feasibility of the proposed environment. The environment as a whole will therefore be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  5. Coordination and organization of security software process for power information application environment

    Science.gov (United States)

    Wang, Qiang

    2017-09-01

    As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of a security software process are discussed, as are the necessity and present significance of using such a process. In coordination with the functional software, the security software process and its testing are discussed in depth. The process includes requirement analysis, design, coding, debugging and testing, and submission and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper introduces the above process into a power information platform.

  6. Synchronized analysis of testbeam data with the Judith software

    CERN Document Server

    McGoldrick, Garrin; Gorišek, Andrej

    2014-01-01

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test.

  7. A theoretical basis for the analysis of multiversion software subject to coincident errors

    Science.gov (United States)

    Eckhardt, D. E., Jr.; Lee, L. D.

    1985-01-01

    Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
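
    The central quantity can be sketched compactly (a restatement of the framework; notation may differ from the paper's): let \theta(x) be the probability that an independently developed version fails on input x, and let X be an input drawn from the usage distribution. Then for two independently developed versions,

        P(\text{both fail}) = E[\theta(X)^2] = \big(E[\theta(X)]\big)^2 + \operatorname{Var}\big(\theta(X)\big) \;\ge\; \big(E[\theta(X)]\big)^2

    so versions fail independently only when \theta is constant over the input space; any variation of \theta(X) produces exactly the coincident-error excess the paper analyzes.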

  8. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a Mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), Cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - Hipe for Herschel (2009) - EXIA for EXOSAT (1983) Following goals have guided the architecture: - Support for all operations, post-operations and archive/legacy phases. - Support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon - AWS). - Support for expert users, requiring full capabilities. - Provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt gathered in this project.

  9. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting increasing attention in orthodontics. The aim of this study was to compare two software packages for three-dimensional cephalometric analysis of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained using the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Before- and after-treatment data were analyzed using the t-test. Reliability testing using the intraclass correlation coefficient was stronger for InVivoDental5.0 (0.83-0.98) than for 3DCeph™ (0.51-0.90). Paired t-test comparison of the two packages shows no statistically significant difference in the measurements made with either. InVivoDental5.0 measurements are more reproducible and user friendly when compared with 3DCeph™. There is no statistical difference between the two packages in linear or angular measurements. 3DCeph™ is more time-consuming in performing three-dimensional analysis compared with InVivoDental5.0.

  10. Automated pre-processing and multivariate vibrational spectra analysis software for rapid results in clinical settings

    Science.gov (United States)

    Bhattacharjee, T.; Kumar, P.; Fillipe, L.

    2018-02-01

    Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. Its potential for detecting varied pathological conditions is regularly reported. However, to prove its applicability in clinics, large multi-center, multi-national studies need to be undertaken, and these will result in enormous amounts of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to the required multivariate analysis, is warranted in order to obtain results in real time. This study reports a MATLAB-based script that can automatically import data, preprocess spectra (interpolation, derivatives, normalization) and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs, all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to the Minitab 16 software. The software can be used to import a variety of file extensions, .asc, .txt, .xls, and many others. Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data, but also in obtaining rapid results when these tools are translated into clinics.
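
    A minimal sketch of the described pipeline (Python/scikit-learn stand-ins for the MATLAB script; file names, window settings, and label layout are assumed):

      # Derivative preprocessing, vector normalization, PCA, then LDA on 10 PCs.
      import numpy as np
      from scipy.signal import savgol_filter
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      spectra = np.loadtxt("spectra.txt")           # rows = spectra, cols = wavenumbers
      labels = np.loadtxt("labels.txt", dtype=int)  # pathology class per spectrum

      d1 = savgol_filter(spectra, window_length=9, polyorder=3, deriv=1, axis=1)
      d1 /= np.linalg.norm(d1, axis=1, keepdims=True)    # unit-vector normalization

      scores = PCA(n_components=10).fit_transform(d1)    # first 10 principal components
      lda = LinearDiscriminantAnalysis().fit(scores, labels)
      print("LDA training accuracy:", lda.score(scores, labels))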

  11. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    Science.gov (United States)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags are at risk of misplacement or disappearance due to mischief or severe windstorms/thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS)-based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible for wider usability on different-sized devices: smartphones, tablets, desktops and laptops.

  12. A software-radio front-end for microwave applications

    Directory of Open Access Journals (Sweden)

    M. Streifinger

    2003-01-01

    In modern communication, sensor and signal processing systems, digitisation methods are gaining importance. They allow for building software-configurable systems and provide better stability and reproducibility. Moreover, digital front-ends cover a wider range of applications and have better performance compared with analog ones. The quest for new architectures in radio-frequency front-ends is a clear consequence of the ever-increasing number of different standards and the resulting task of providing a platform which covers as many standards as possible. At microwave frequencies, in particular at frequencies beyond 10 GHz, no direct-sampling receivers are available yet. A look at the roadmap of the development of commercial analog-to-digital converters (ADCs) shows clearly that they cannot be expected in the near future either. We present a novel architecture which is capable of direct sampling of band-limited signals at frequencies beyond 10 GHz by means of an over-sampling technique. The well-known Nyquist criterion states that wide-band digitisation of an RF signal with a maximum frequency f_max requires a minimum sampling rate of 2·f_max. But for a band-limited signal of bandwidth B the demands on the minimum sampling rate of the ADC relax to the value 2·B. Employing a noise-shaping sigma-delta ADC architecture, even with a 1-bit ADC a signal-to-noise ratio sufficient for many applications can be achieved. The key component of this architecture is the sample-and-hold switch. The required bandwidth of this switch must be well above 2·f_max. We designed, fabricated and characterized a preliminary demonstrator for the ISM band at 2.4 GHz, employing silicon Schottky diodes as a switch and SiGe-based MMICs as impedance transformers and comparators. Simulated and measured results will be presented.
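
    In symbols, the relaxation being exploited is standard sampling theory, restated here: wideband digitisation requires f_s >= 2 f_max, whereas a bandpass signal occupying [f_L, f_H] with bandwidth B = f_H - f_L can be sampled uniformly at any rate satisfying

        \frac{2 f_H}{n} \le f_s \le \frac{2 f_L}{n-1} \qquad \text{for some integer } n \ge 1,

    which permits rates as low as f_s = 2B; the oversampled 1-bit sigma-delta stage then trades sampling rate for amplitude resolution within that band.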

  13. Veterinary software application for comparison of thermograms for pathology evaluation

    Science.gov (United States)

    Pant, Gita; Umbaugh, Scott E.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-09-01

    The bilateral symmetry property in mammals allows for the detection of pathology by comparison of opposing sides. For any pathological disorder, thermal patterns differ compared with the normal body part. A software application for veterinary clinics has been under development that takes as input two thermograms of body parts from both sides, one normal and the other unknown, compares them based on extracted features and appropriate similarity and difference measures, and outputs the likelihood of pathology. Here thermographic image data from 19 °C to 40 °C were linearly remapped to create images with 256 gray-level values. Features were extracted from these images, including histogram, texture and spectral features. The comparison metrics used are the vector inner product, Tanimoto, Euclidean, city block, Minkowski and maximum value metrics. Previous research with the anterior cruciate ligament (ACL) pathology in dogs suggested that any thermogram variation below a threshold of 40% of Euclidean distance is normal and above 40% is abnormal. Here the 40% threshold was applied to a new ACL image set and achieved a sensitivity of 75%, an improvement from the 55% sensitivity of the previous work. With the new data set it was determined that using a threshold of 20% provided a much improved 92% sensitivity. However, further research is required to determine the corresponding specificity. Additionally, it was found that the anterior view provided better results than the lateral view. It was also determined that better results were obtained with all three feature sets than with just the histogram and texture sets. Further experiments are ongoing with larger image datasets and pathologies, new features, and comparison metric evaluation to determine more accurate threshold values for separating normal and abnormal images.
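
    A sketch of the comparison step as described (the histogram-only feature vector and the normalization of the Euclidean distance are simplifying assumptions):

      # Remap 19-40 C thermograms to 8-bit gray, extract features, compare sides.
      import numpy as np

      def to_gray(thermogram_c):
          """Linearly remap temperatures in [19, 40] C to gray levels 0..255."""
          t = np.clip(thermogram_c, 19.0, 40.0)
          return ((t - 19.0) / (40.0 - 19.0) * 255.0).astype(np.uint8)

      def features(gray):
          hist, _ = np.histogram(gray, bins=256, range=(0, 255), density=True)
          return hist

      def relative_euclidean(f_normal, f_unknown):
          """Euclidean distance as a fraction of the normal side's feature norm."""
          return np.linalg.norm(f_unknown - f_normal) / max(np.linalg.norm(f_normal), 1e-12)

      # pathology_suspected = relative_euclidean(features(to_gray(normal_side)),
      #                                          features(to_gray(unknown_side))) > 0.20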

  14. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods having an established position in traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry, reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method have led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised at the software level, which has aroused the urge to apply the FMEA methodology also to software-based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software-based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  15. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis. The additional responses have been manually verified to be true positives, which indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected, in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common to both sets. This not only validates the software utility as giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
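
    Purely as a hypothetical illustration of an adaptive, statistics-based threshold versus a fixed one (the paper does not state its exact statistic; the mean + k*SD rule below is an assumption):

      import numpy as np

      def responded(pre, post, k=2.0):
          """Flag a response when the post-pre shift exceeds k baseline SDs."""
          pre, post = np.asarray(pre, float), np.asarray(post, float)
          return abs(post.mean() - pre.mean()) > k * pre.std(ddof=1)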

  16. Formal synthesis of application and platform behaviors of embedded software systems

    DEFF Research Database (Denmark)

    Kim, Jin Hyun; Kang, Inhye; Choi, Jin-Young

    2015-01-01

    Two main embedded software components, application software and platform software, i.e., the real-time operating system (RTOS), interact with each other in order to achieve the functionality of the system. However, they are so different in behaviors that one behavior modeling language is not sufficient to model both styles of behaviors and to reason about the characteristics of their individual behaviors as well as their parallel behavior and interaction properties. In this paper, we present a formal approach to the synthesis of the application software and the RTOS behavior models...

  17. Bonaparte: Application of new software for missing persons program

    NARCIS (Netherlands)

    van Dongen, C.J.J.; Slooten, K.; Slagter, M.; Burgers, W.; Wiegerinck, W.

    2011-01-01

    The Netherlands Forensic Institute (NFI), together with SNN at Radboud University Nijmegen, has developed new software for pedigree matching which can handle autosomal, Y-chromosomal and mitochondrial DNA profiles. Initially this software, called Bonaparte, was developed for DNA DVI. Bonaparte

  18. The Enterprise Derivative Application: Flexible Software for Optimizing Manufacturing Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Richard C [ORNL; Allgood, Glenn O [ORNL; Knox, John R [ORNL

    2008-11-01

    The Enterprise Derivative Application (EDA) implements the enterprise-derivative analysis for optimization of an industrial process (Allgood and Manges, 2001). It is a tool to help industry planners choose the most productive way of manufacturing their products while minimizing their cost. Developed in MS Access, the application allows users to input initial data ranging from raw material to variable costs and enables the tracking of specific information as material is passed from one process to another. Energy-derivative analysis is based on calculation of sensitivity parameters. For the specific application to a steel production process these include: the cost to product sensitivity, the product to energy sensitivity, the energy to efficiency sensitivity, and the efficiency to cost sensitivity. Using the EDA, for all processes the user can display a particular sensitivity or all sensitivities can be compared for all processes. Although energy-derivative analysis was originally designed for use by the steel industry, it is flexible enough to be applied to many other industrial processes. Examples of processes where energy-derivative analysis would prove useful are wireless monitoring of processes in the petroleum cracking industry and wireless monitoring of motor failure for determining the optimum time to replace motor parts. One advantage of the MS Access-based application is its flexibility in defining the process flow and establishing the relationships between parent and child process and products resulting from a process. Due to the general design of the program, a process can be anything that occurs over time with resulting output (products). So the application can be easily modified to many different industrial and organizational environments. Another advantage is the flexibility of defining sensitivity parameters. Sensitivities can be determined between all possible variables in the process flow as a function of time. Thus the dynamic development of the
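
    One plausible formalization of these chained sensitivities, written as elasticities (our reading; the abstract does not define the quantities precisely):

        S_{c \to p} = \frac{\partial p}{\partial c}\,\frac{c}{p}, \qquad S_{p \to e} = \frac{\partial e}{\partial p}\,\frac{p}{e}, \qquad \text{etc.}

    so that the product of the four sensitivities traces the closed loop from cost through product, energy, and efficiency back to cost, indicating where a small process change propagates most strongly.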

  19. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule based expert system is used, in which the deductive (goal driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst becomes aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved

  20. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop novel swallowing kinematic analysis software, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. This software was constructed to acquire, process, and analyze the data of swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. This software is expected to be useful for researchers who are interested in swallowing motion analysis.
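
    STAMPS itself is implemented in MATLAB; as a language-neutral illustration of the core spatiotemporal computation, the sketch below derives displacement and velocity of a tracked structure (e.g., the hyoid bone) from frame-by-frame coordinates. The coordinates and frame rate are invented for the example.

      # Hedged sketch (not STAMPS code): displacement and velocity of a
      # tracked structure from 2D frame coordinates. Sample data invented.
      import numpy as np

      fps = 30.0                                   # assumed video frame rate
      xy = np.array([[10.0, 5.0], [10.5, 5.8],     # tracked positions (px)
                     [11.4, 7.0], [12.0, 8.5]])

      t = np.arange(len(xy)) / fps                 # time stamps (s)
      displacement = np.linalg.norm(xy - xy[0], axis=1)  # distance from start
      velocity = np.gradient(displacement, t)      # central differences (px/s)
      print(displacement, velocity)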

  1. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Full Text Available Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 markers’ positions) were manually tracked to determine the markers’ center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, the proportion of manual interventions was 10.4 percentage points lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the developed automatic tracking software can be used as a valid and useful tool for underwater motion analysis.
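
    A minimal sketch of the underlying Kanade-Lucas-Tomasi step, using OpenCV's pyramidal implementation; the video file name, the initial marker position, and the use of a 4-pixel jump as the trigger for manual correction are assumptions of this sketch rather than details of DVP.

      # Hedged sketch of KLT marker tracking with OpenCV; DVP's actual
      # implementation differs. File name and seed point are placeholders.
      import cv2
      import numpy as np

      cap = cv2.VideoCapture("underwater_exercise.avi")  # hypothetical video
      ok, frame = cap.read()
      prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      pts = np.array([[[320.0, 240.0]]], dtype=np.float32)  # initial marker

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                                          pts, None)
          # Flag the frame for manual correction when the track is lost or
          # the marker jumps implausibly far between frames (4 px threshold).
          if status[0, 0] == 0 or np.linalg.norm(new_pts - pts) > 4.0:
              print("manual intervention needed")
          prev_gray, pts = gray, new_pts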

  2. Hardware and software constructs for a vibration analysis network

    International Nuclear Information System (INIS)

    Cook, S.A.; Crowe, R.D.; Toffer, H.

    1985-01-01

    Vibration level monitoring and analysis has been initiated at N Reactor, the dual purpose reactor operated at Hanford, Washington by UNC Nuclear Industries (UNC) for the Department of Energy (DOE). The machinery to be monitored was located in several buildings scattered over the plant site, necessitating an approach using satellite stations to collect, monitor and temporarily store data. The satellite stations are, in turn, linked to a centralized processing computer for further analysis. The advantages of a networked data analysis system are discussed in this paper along with the hardware and software required to implement such a system

  3. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00372086; The ATLAS collaboration

    2016-01-01

    The calibration of the ATLAS Pixel detector at the LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings, and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework that controls all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new Front End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software is presented, together with some preliminary results.

  4. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  5. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, in which the quality requirements considerations of the document are numerically represented. We can thus objectively identify whether the quality requirements considered in a requirements document carry over to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.

  6. Quality assurance of EDP software in practical application

    International Nuclear Information System (INIS)

    Winkler, H.

    1982-01-01

    Alongside the specific properties of software, it is mainly aspects outside the traditional testing field that apply to its quality assurance. Measures for quality assurance must, in particular, start during development. This presupposes a development process for software that is oriented towards partial results. Due to the high qualitative demands, tools for testing and inspection are of great importance. The problems in software quality assurance are typical of a young technical field whose necessity is undisputed, but which, due to an insufficient scientific foundation, still has to operate on an empirical-pragmatic level. (orig.) [de

  7. Software for a measuring facility for activation analysis

    International Nuclear Information System (INIS)

    De Keyser, A.; De Roost, E.

    1985-01-01

    A software package has been developed for an APPLE PC. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, to execute it under computer control, to accumulate and store 2K spectra using a built-in ADC, and to output the results as listings, plots or evaluated reports

  8. Phenomenology and Qualitative Data Analysis Software (QDAS): A Careful Reconciliation

    OpenAIRE

    Brian Kelleher Sohn

    2017-01-01

    An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform p...

  9. Strategic Business-IT alignment of application software packages: Bridging the Information Technology gap

    Directory of Open Access Journals (Sweden)

    Wandi Kruger

    2012-09-01

    Full Text Available An application software package implementation is a complex endeavour, and as such it requires a proper understanding, evaluation and redefinition of the current business processes to ensure that the implementation delivers on the objectives set at the start of the project. Numerous factors exist that may contribute to the unsuccessful implementation of application software packages. However, the most significant contributor to the failure of an application software package implementation lies in the misalignment of the organisation’s business processes with the functionality of the application software package. Misalignment is attributed to a gap between the business processes of an organisation and the functionality the application software package offers to translate those processes into digital form when implementing and configuring the package. This gap is commonly referred to as the information technology (IT) gap. This study proposes to define and discuss the IT gap. Furthermore, this study makes recommendations for aligning the business processes with the functionality of the application software package (addressing the IT gap). The end result of adopting these recommendations will be more successful application software package implementations.

  10. An interactive end-user software application for a deep-sea photographic database

    Digital Repository Service at National Institute of Oceanography (India)

    Jaisankar, S.; Sharma, R.

    The software is the first of its kind in deep-sea applications and it also attempts to educate the user about deep-sea photography. The application software was developed by modifying established routines and by creating new routines to save the retrieved...

  11. A virtualized software based on the NVIDIA cuFFT library for image denoising: performance analysis

    DEFF Research Database (Denmark)

    Galletti, Ardelio; Marcellino, Livia; Montella, Raffaele

    2017-01-01

    Abstract Generic Virtualization Service (GVirtuS) is a new solution for enabling GPGPU on virtual machines or low-powered devices. This paper focuses on the performance analysis that can be obtained using GPGPU virtualized software. Recently, GVirtuS has been extended in order to support CUDA ancillary libraries with good results. Here, our aim is to analyze the applicability of this powerful tool to a real problem, which uses the NVIDIA cuFFT library. As a case study we consider a simple denoising algorithm, implementing a virtualized GPU-parallel software based on the convolution theorem...
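
    The convolution-theorem step at the heart of such a denoiser can be sketched in a few lines; the version below uses NumPy's CPU FFT in place of the virtualized cuFFT calls, and the Gaussian kernel width is an arbitrary choice for the illustration.

      # Hedged sketch: frequency-domain smoothing via the convolution
      # theorem (NumPy FFT standing in for the virtualized cuFFT).
      import numpy as np

      def fft_denoise(image, kernel):
          """Convolve image with a same-sized, centered kernel via FFT."""
          K = np.fft.fft2(np.fft.ifftshift(kernel))
          return np.real(np.fft.ifft2(np.fft.fft2(image) * K))

      n, sigma = 64, 2.0                       # image size, kernel width
      y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
      kernel = np.exp(-(x**2 + y**2) / (2 * sigma**2))
      kernel /= kernel.sum()                   # preserve mean intensity

      noisy = np.random.rand(n, n)             # stand-in for a noisy image
      clean = fft_denoise(noisy, kernel)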

  12. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Full Text Available Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.

  13. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    1991-01-01

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one in the family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and similar structure. The intention was to provide the user with maximum flexibility and, at the same time, a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases. Those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme

  14. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
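
    A minimal usage sketch of the radial distribution function named above, assuming the freud 2.x Python API (freud.box.Box, freud.density.RDF); the random points stand in for real simulation output.

      # Hedged sketch of a radial distribution function with freud 2.x.
      import numpy as np
      import freud

      box = freud.box.Box.cube(10.0)                       # periodic cube
      points = np.random.uniform(-5.0, 5.0, (1000, 3)).astype(np.float32)

      rdf = freud.density.RDF(bins=100, r_max=4.0)
      rdf.compute(system=(box, points))
      print(rdf.bin_centers, rdf.rdf)                      # g(r) samples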

  15. Suma-alpha software description. Study of its applications to detection problems and environmental radioactivity measurements

    International Nuclear Information System (INIS)

    Gasco, C.; Perez, C.

    2010-01-01

    Software named suma-espectros has been developed by TECNASA/CIEMAT for automatically adding counts from alpha spectra, energy by energy, with the purpose of evaluating the real background of alpha spectrometers, studying its temporal variations, increasing the possibilities of detecting isotopes where detection had been impossible given the elapsed measurement time, and implementing other applications. The programme is written in Visual Basic and can export data to Excel spreadsheets for later treatment. The software establishes by default a channel range for adding the counts energy by energy, but it can be adapted to the analysis of different isotopes and backgrounds simply by changing a text file that is incorporated into the programme. The management of the programme is described so that anyone can realise its applications immediately. This software has the advantage of emitting a sum spectrum in cnf format that can be used by Alpha Analyst (Genie 2K) for deconvoluting spectra or doing calculations. (Author) 3 refs.

  16. Integrating open-source software applications to build molecular dynamics systems.

    Science.gov (United States)

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and msi2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-A and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate the density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.

  17. Application of NX Siemens PLM software in educational process in preparing students of engineering branch

    Science.gov (United States)

    Sadchikova, G. M.

    2017-01-01

    This article discusses the results of introducing the NX computer-aided design system by Siemens PLM Software into the classes of a higher education institution. The necessity of applying modern information technologies in teaching engineering students and of selecting a suitable software product is substantiated. The author describes the stages of studying the software module in relation to some specific courses and considers the features of the NX software that require the creation of standard and unified product databases. The article also gives examples of research carried out by students with the various software modules.

  18. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  19. PuMA: the Porous Microstructure Analysis software

    Science.gov (United States)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed in order to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random walk method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
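
    As a toy illustration of the finite-difference Laplace approach mentioned above, the sketch below relaxes a 2D temperature field to steady state and reads off an effective conductivity; PuMA's actual solvers work on 3D heterogeneous voxel grids with per-material properties, so everything here is a simplified stand-in.

      # Hedged 2D toy (not PuMA code): Jacobi relaxation of Laplace's
      # equation with fixed hot/cold faces, periodic top/bottom for brevity,
      # then the effective conductivity from the flux at the hot face.
      # A homogeneous medium gives k_eff ~ 1.
      import numpy as np

      nx, ny = 40, 40
      T = np.zeros((nx, ny))
      T[:, 0], T[:, -1] = 1.0, 0.0             # Dirichlet boundary values

      for _ in range(5000):                    # Jacobi sweeps
          T[:, 1:-1] = 0.25 * (T[:, 2:] + T[:, :-2]
                               + np.roll(T, 1, axis=0)[:, 1:-1]
                               + np.roll(T, -1, axis=0)[:, 1:-1])

      flux = (T[:, 0] - T[:, 1]).mean()        # heat flux at hot face, dx = 1
      print("k_eff ~", flux * (ny - 1))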

  20. A Systematic Analysis of Functional Safety Certification Practices in Industrial Robot Software Development

    Directory of Open Access Journals (Sweden)

    Tong Xie

    2017-01-01

    Full Text Available For decades, industrial robotics has delivered on the promise of speed, efficiency and productivity. The last several years have seen a sharp resurgence in orders of industrial robots in China, and the areas addressed within industrial robotics have extended into safety-critical domains. However, safety standards have not yet been implemented widely in academia and engineering applications, particularly in robot software development. This paper presents a systematic analysis of functional safety certification practices in the development of safety-critical software for industrial robots, to identify the safety certification practices used for the development of industrial robots in China and how these practices comply with the safety standard requirements. Our review of Chinese academic papers shows that safety standards are barely used in industrial robot software development. The majority of the papers propose various solutions to achieve safety, but only about two thirds of the papers refer to non-standardized approaches that mainly address the system level rather than the software development level. In addition, our research shows that, with the development of artificial intelligence, this emerging field is still on the quest for standardized and suitable approaches to develop safety-critical software.

  1. A software platform for the analysis of dermatology images

    Science.gov (United States)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform, developed in the Python programming environment, that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image and supports image formats such as Windows bitmaps, JPEG, JPEG 2000, portable network graphics, and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts such as the breast or lung, after proper re-training of the classification algorithms.
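
    The automated ROI step described (smoothing followed by thresholding) can be sketched with scikit-image; the platform's own code is not reproduced here, so the exact filters below are only one plausible realisation of the same two steps.

      # Hedged sketch of automatic ROI selection: Gaussian smoothing
      # followed by Otsu thresholding (scikit-image).
      import numpy as np
      from skimage import filters

      image = np.random.rand(256, 256)                # stand-in image
      smoothed = filters.gaussian(image, sigma=2.0)   # suppress noise
      threshold = filters.threshold_otsu(smoothed)    # global threshold
      roi_mask = smoothed > threshold                 # boolean ROI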

  2. New analysis software for Viking Lander meteorological data

    Directory of Open Access Journals (Sweden)

    O. Kemppinen

    2013-02-01

    Full Text Available We have developed a set of tools that enable us to process Viking Lander meteorological data beyond what has previously been publicly available. Besides providing data for new periods of time, the tools augment the existing data periods by significantly enhancing the data resolution. This was accomplished by first transferring the original Prime computer version of the data analysis software to a standard Linux platform, then by modifying the software to be able to process the data despite irregularities in the original raw data, and by reverse engineering various parameter files. In addition, the processing pipeline has been streamlined, making processing the data faster and easier. As a case example of new data, freshly processed Viking Lander 1 and 2 temperature records are described and briefly analyzed in ways that have not previously been possible due to the lack of data.

  3. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    Science.gov (United States)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  4. Verification of operating software for cooperative monitoring applications

    International Nuclear Information System (INIS)

    Tolk, K.M.; Rembold, R.K.

    1997-01-01

    Monitoring agencies often use computer-based equipment to control instruments and to collect data at sites that are being monitored under international safeguards or other cooperative monitoring agreements. In order for this data to be used as an independent verification of data supplied by the host at the facility, the software used must be trusted by the monitoring agency. The monitoring party must be sure that the software has not been altered to give results that could lead to erroneous conclusions about nuclear materials inventories or other operating conditions at the site. The host might also want to verify that the software being used is the software that has been previously inspected, in order to be assured that only data allowed under the agreement is being collected. A method to provide this verification using keyed hash functions is described, along with how the proposed method overcomes possible vulnerabilities in methods currently in use, such as loading the software from trusted disks. The use of public key data authentication for this purpose is also discussed
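
    A minimal sketch of the keyed-hash idea, using Python's standard hmac module with SHA-256 standing in for whatever primitive the original 1997 method specified; the path and key are placeholders.

      # Hedged sketch: verify an installed software image with a keyed hash.
      import hmac
      import hashlib

      def verify_software(image_path, key, expected_mac):
          """Return True if the image's HMAC matches the expected value."""
          with open(image_path, "rb") as f:
              mac = hmac.new(key, f.read(), hashlib.sha256).hexdigest()
          # constant-time comparison avoids timing side channels
          return hmac.compare_digest(mac, expected_mac)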

  5. Research and practice on NPP safety DCS application software V and V defect classification system

    International Nuclear Information System (INIS)

    Zhang Dongwei; Li Yunjian; Li Xiangjian

    2012-01-01

    One of the most significant aims of Verification and Validation (V and V) is to find software errors and risks, especially for DCS application software designed for a nuclear power plant (NPP). By classifying and analyzing errors, the data obtained can be utilized to estimate the current status and potential risks of software development and to improve the quality of the project. A method of error classification is proposed, which is applied to the whole V and V life cycle, using a MW pressurized reactor project as an example. The purpose is to analyze errors discovered by V and V activities, resulting in the improvement of safety-critical DCS application software. (authors)

  6. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough sets, support vector machines), and computational intelligence (neural networks, genetic algorithms, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural networks, genetic algorithms, Hyper Envelop, support vector machines, and visualization. The principles and knowledge representation of some function models of data mining are described. The data mining tool is implemented in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied to the prediction of regularities in the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.
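
    The tool itself is written in Visual C++; purely to illustrate the kind of step one of its nine function models performs, the sketch below exercises a decision tree on invented toy data with scikit-learn.

      # Hedged stand-in for the decision-tree function model (scikit-learn).
      from sklearn.tree import DecisionTreeClassifier

      X = [[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]]  # toy features
      y = [0, 1, 0, 1]                                      # toy labels
      clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
      print(clf.predict([[0.15, 0.15]]))                    # -> [0]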

  7. Research and practice of application software verification and validation for nuclear safety digital I and C system

    International Nuclear Information System (INIS)

    Dong Yaxin; Xu Xianzhu; Bai Xiangji

    2014-01-01

    Application software V and V activities determine whether the output results are consistent with the task requirements throughout each stage of the application software development process, and confirm whether the ultimately generated application software conforms to its intended use and other related requirements. A set of reasonable and feasible workflows for application software V and V activities is proposed in this paper based on the characteristics of the application software, and is demonstrated with the application software V and V activities in the transformation project of the in-core instrumentation system (RIC) in a nuclear power plant. (authors)

  8. BIM Software Capability and Interoperability Analysis : An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software (Autodesk products and Rhinoceros) are presented through modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  9. An Analysis of Security and Privacy Issues in Smart Grid Software Architectures on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Kumbhare, Alok; Cao, Baohua; Prasanna, Viktor K.

    2011-07-09

    Power utilities globally are increasingly upgrading to Smart Grids that use bi-directional communication with the consumer to enable an information-driven approach to distributed energy management. Clouds offer features well suited for Smart Grid software platforms and applications, such as elastic resources and shared services. However, the security and privacy concerns inherent in an information rich Smart Grid environment are further exacerbated by their deployment on Clouds. Here, we present an analysis of security and privacy issues in a Smart Grids software architecture operating on different Cloud environments, in the form of a taxonomy. We use the Los Angeles Smart Grid Project that is underway in the largest U.S. municipal utility to drive this analysis that will benefit both Cloud practitioners targeting Smart Grid applications, and Cloud researchers investigating security and privacy.

  10. Application of software technology to a future spacecraft computer design

    Science.gov (United States)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  11. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    Curiel, M.; Palomo, M. J.; Baraza, A.; Vaquer, J.

    2010-10-01

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor software has many advanced graphic tools for simplicity of work, as well as a friendly environment that is easy to use and has many configuration possibilities. Plant variables are classified in the same way as they are in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first ones analyse the evolution of up to twenty plant variables in a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, non-valid data visualization ... Moreover, a particular analysis configuration can be saved as a pre-selection, giving the possibility of loading the pre-selection directly and developing quick monitoring of a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable value against another variable in a defined time. As an option, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor software, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  12. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor software has many advanced graphic tools for simplicity of work, as well as a friendly environment that is easy to use and has many configuration possibilities. Plant variables are classified in the same way as they are in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first ones analyse the evolution of up to twenty plant variables in a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, non-valid data visualization ... Moreover, a particular analysis configuration can be saved as a pre-selection, giving the possibility of loading the pre-selection directly and developing quick monitoring of a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable value against another variable in a defined time. As an option, users can filter previous data depending on a certain range of a variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor software, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network framework. (Author)

  13. APPLICATION OF APM WINMACHINE SOFTWARE FOR DESIGN AND CALCULATIONS IN MECHANICAL ENGINEERING

    Directory of Open Access Journals (Sweden)

    L. O. Neduzha

    2016-04-01

    Full Text Available Purpose. To conduct research at all stages of design, development, operation and residual-life determination (namely, preliminary study, choice of the principle of action, design of draft and technical projects, their optimization, preparation of design documentation and control information for automated production, and comprehensive engineering analysis), it is necessary to use the latest computer technologies. Their use not only presents data and information in a certain way, but also gives the opportunity to interact effectively and directly with the information object that is created or demonstrated. Methodology. To perform engineering calculations associated with the analysis of the strength of machines, mechanisms and constructions, both analytical and numerical methods are used in practice. The most common method for analysing the stress-strain state of object models and obtaining their dynamic and stability characteristics at constant and variable modes of external load is the finite element method, which is implemented in many well-known and widespread software products providing strength calculation of models of machines, mechanisms and structures. Findings. The use of modern software for designing machine parts and various types of their joints, and for the strength analysis of structures, is justified. Colour charts of the distribution of stresses, displacements, internal efforts, safety factors and other quantities allow accurate and quick identification of the most dangerous places in the structure. The program also provides an opportunity to «look» inside the elements and see the resulting distribution of internal force factors. Originality. The paper considered aspects, unexplored at present, associated with the current state and prospects of development of industrial production and the use of a software package for design and calculations in the mechanical industry. The result of the work is the justification of software application for solving problems that

  14. Cluster analysis for applications

    CERN Document Server

    Anderberg, Michael R

    1973-01-01

    Cluster Analysis for Applications deals with methods and various applications of cluster analysis. Topics covered range from variables and scales to measures of association among variables and among data units. Conceptual problems in cluster analysis are discussed, along with hierarchical and non-hierarchical clustering methods. The necessary elements of data analysis, statistics, cluster analysis, and computer implementation are integrated vertically to cover the complete path from raw data to a finished analysis.Comprised of 10 chapters, this book begins with an introduction to the subject o

  15. The Software Ontology (SWO): a resource for reproducibility in biomedical data analysis, curation and digital preservation.

    Science.gov (United States)

    Malone, James; Brown, Andy; Lister, Allyson L; Ison, Jon; Hull, Duncan; Parkinson, Helen; Stevens, Robert

    2014-01-01

    Biomedical ontologists to date have concentrated on ontological descriptions of biomedical entities such as gene products and their attributes, phenotypes and so on. Recently, effort has diversified to descriptions of the laboratory investigations by which these entities were produced. However, much biological insight is gained from the analysis of the data produced from these investigations, and there is a lack of adequate descriptions of the wide range of software that are central to bioinformatics. We need to describe how data are analyzed for discovery, audit trails, provenance and reproducibility. The Software Ontology (SWO) is a description of software used to store, manage and analyze data. Input to the SWO has come from beyond the life sciences, but its main focus is the life sciences. We used agile techniques to gather input for the SWO and keep engagement with our users. The result is an ontology that meets the needs of a broad range of users by describing software, its information processing tasks, data inputs and outputs, data formats, versions and so on. Recently, the SWO has incorporated EDAM, a vocabulary for describing data and related concepts in bioinformatics. The SWO is currently being used to describe software used in multiple biomedical applications. The SWO is another element of the biomedical ontology landscape that is necessary for the description of biomedical entities and how they were discovered. An ontology of software used to analyze data produced by investigations in the life sciences can be made in such a way that it covers the important features requested and prioritized by its users. The SWO thus fits into the landscape of biomedical ontologies and is produced using techniques designed to keep it in line with users' needs. The Software Ontology is available under an Apache 2.0 license at http://theswo.sourceforge.net/; the Software Ontology blog can be read at http://softwareontology.wordpress.com.

  16. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    Full Text Available With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing a metric or measure of service quality that addresses qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (measurement of the quality of software products), the Likert scale, the GQM method (Goal-Question-Metric, an indicator of software quality) and Boehm's project risk analysis model were used to assess the quality of services and to support decision-making, according to demand and requests for software development. With the aim of improving the quality of service provision, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in comparison with the proposed schedule during requirements elicitation.
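
    One building block of such a system is a fuzzy membership function mapping a Likert-scale rating onto risk grades; the breakpoints in the sketch below are invented for the illustration and would in practice come from the GQM-derived indicators.

      # Hedged sketch: triangular fuzzy membership over a Likert rating.
      def triangular(x, a, b, c):
          """Membership grade peaking at b over the support [a, c]."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      rating = 3.5                                     # Likert answer in [1, 5]
      high_risk = triangular(rating, -2.0, 1.0, 4.0)   # low ratings -> risky
      low_risk = triangular(rating, 2.0, 5.0, 8.0)     # high ratings -> safe
      print(high_risk, low_risk)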

  17. International Atomic Energy Agency intercomparison of ion beam analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Barradas, N.P. [Instituto Tecnologico e Nuclear, Estrada Nacional No. 10, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Avenida do Professor Gama Pinto 2, 1649-003 Lisboa (Portugal)], E-mail: nunoni@itn.pt; Arstila, K. [K.U. Leuven, Instituut voor Kern-en Stralingsfysica, Celestijnenlaan 200D, B-3001 Leuven (Belgium); Battistig, G. [MFA Research Institute for Technical Physics and Materials Science, P.O. Box 49, H-1525 Budapest (Hungary); Bianconi, M. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Dytlewski, N. [International Atomic Energy Agency, Wagramer Strasse 5, P.O. Box 100, A-1400 Vienna (Austria); Jeynes, C. [Surrey Ion Beam Centre, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Kotai, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Lulli, G. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Mayer, M. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Rauhala, E. [Accelerator Laboratory, Department of Physics, University of Helsinki, P.O. Box 43, FIN-00014 Helsinki (Finland); Szilagyi, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Thompson, M. [Department of MS and E/Bard Hall 328, Cornell University, Ithaca, NY 14853 (United States)

    2007-09-15

    Ion beam analysis (IBA) includes a group of techniques for the determination of elemental concentration depth profiles of thin film materials. Often the final results rely on simulations, fits and calculations, made by dedicated codes written for specific techniques. Here we evaluate numerical codes dedicated to the analysis of Rutherford backscattering spectrometry, non-Rutherford elastic backscattering spectrometry, elastic recoil detection analysis and non-resonant nuclear reaction analysis data. Several software packages have been presented and made available to the community. New codes regularly appear, and old codes continue to be used and occasionally updated and expanded. However, those codes have to date not been validated, or even compared to each other. Consequently, IBA practitioners use codes whose validity, correctness and accuracy have never been validated beyond the authors' efforts. In this work, we present the results of an IBA software intercomparison exercise, where seven different packages participated. These were DEPTH, GISA, DataFurnace (NDF), RBX, RUMP, SIMNRA (all analytical codes) and MCERD (a Monte Carlo code). In a first step, a series of simulations were defined, testing different capabilities of the codes, for fixed conditions. In a second step, a set of real experimental data were analysed. The main conclusion is that the codes perform well within the limits of their design, and that the largest differences in the results obtained are due to differences in the fundamental databases used (stopping power and scattering cross section). In particular, spectra can be calculated including Rutherford cross sections with screening, energy resolution convolutions including energy straggling, and pileup effects, with agreement between the codes available at the 0.1% level. This same agreement is also available for the non-RBS techniques. This agreement is not limited to calculation of spectra from particular structures with predetermined

  18. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present-day studies in the area of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of 3 basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for the development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis

  19. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, the B/S-structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a

  20. Revenue Management and Demand Fulfillment: Matching Applications, Models, and Software

    NARCIS (Netherlands)

    R. Quante (Rainer); H. Meyr (Herbert); M. Fleischmann (Moritz)

    2007-01-01

    Recent years have seen great successes of revenue management, notably in the airline, hotel, and car rental business. Currently, an increasing number of industries, including manufacturers and retailers, are exploring ways to adopt similar concepts. Software companies are taking an

  1. Status of REBUS fuel management software development for RERTR applications

    International Nuclear Information System (INIS)

    Olson, Arne P.

    2000-01-01

    The REBUS-5 burnup code has evolved substantially in order to meet the needs of the ANL RERTR Program. This paper presents a summary of the past changes and improvements in the capabilities of this software, and also identifies future plans. (author)

  2. A Parallel Software Pipeline for DMET Microarray Genotyping Data Analysis

    Directory of Open Access Journals (Sweden)

    Giuseppe Agapito

    2018-06-01

    Full Text Available Personalized medicine is an aspect of P4 medicine (predictive, preventive, personalized and participatory) based precisely on the customization of all medical characteristics of each subject. In personalized medicine, the development of medical treatments and drugs is tailored to the individual characteristics and needs of each subject, according to the study of diseases at different scales, from the genotype to the phenotype scale. To make the goal of personalized medicine concrete, it is necessary to employ high-throughput methodologies such as Next Generation Sequencing (NGS), Genome-Wide Association Studies (GWAS), mass spectrometry or microarrays, which are able to investigate a single disease from a broader perspective. A side effect of high-throughput methodologies is the massive amount of data produced for each single experiment, which poses several challenges (e.g., high execution time and memory requirements) to bioinformatics software. Thus a main requirement of modern bioinformatics software is the use of good software engineering methods and efficient programming techniques able to face those challenges, including the use of parallel programming and of efficient and compact data structures. This paper presents the design and the experimentation of a comprehensive software pipeline, named microPipe, for the preprocessing, annotation and analysis of microarray-based Single Nucleotide Polymorphism (SNP) genotyping data. A use case in pharmacogenomics is presented. The main advantages of using microPipe are: the reduction of errors that may happen when trying to make data compatible among different tools; the possibility to analyze huge datasets in parallel; and the easy annotation and integration of data. microPipe is available under a Creative Commons license, and is freely downloadable for academic and not-for-profit institutions.
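
    The parallel fan-out pattern behind such a pipeline can be sketched with a process pool; the file names and the preprocessing body below are placeholders, not microPipe's real functions.

      # Hedged sketch of parallel per-sample preprocessing (not microPipe).
      from multiprocessing import Pool

      def preprocess(path):
          # stand-in for preprocessing + annotation of one genotyping file
          return path + ".processed"

      if __name__ == "__main__":
          samples = ["sample_%03d.cel" % i for i in range(8)]  # placeholders
          with Pool(processes=4) as pool:
              results = pool.map(preprocess, samples)          # parallel map
          print(results)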

  3. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by Lunar Co. for the assessment of body composition analysis by DXA.

  4. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification and perhaps high-level design is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
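
    A minimal sketch of such a systems-analysis object, whose time-behavior is given by a set of states plus state-transition rules (the object, states and events below are invented for illustration):

      # An analysis object modelled as a small state machine: its behaviour
      # over time is fully described by (state, event) -> next-state rules.
      class AnalysisObject:
          def __init__(self, name, transitions, initial):
              self.name = name
              self.state = initial
              self.transitions = transitions  # {(state, event): next_state}

          def handle(self, event):
              self.state = self.transitions.get((self.state, event), self.state)
              return self.state

      valve = AnalysisObject(
          "inlet_valve",
          transitions={("closed", "open_cmd"): "opening",
                       ("opening", "limit_hit"): "open",
                       ("open", "close_cmd"): "closed"},
          initial="closed",
      )
      print(valve.handle("open_cmd"))   # -> opening
      print(valve.handle("limit_hit"))  # -> open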

  5. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows for accurate estimation of communication overhead between nodes mapped to different processors. In particular, we demonstrate how various transformations of control structures can lead to a more accurate communication analysis and more efficient implementations. The purpose of the transformations is to obtain...

  6. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  7. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  8. Gamma camera image processing and graphical analysis mutual software system

    International Nuclear Information System (INIS)

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    GCCS, a gamma camera image processing and graphical analysis system, is a specialized interactive software system. It is mainly used to analyse various patient data acquired from a gamma camera. The system runs on an IBM PC, PC/XT or PC/AT. It consists of several parts: system management, data management, device management, a program package and user programs. The system provides two kinds of user interfaces: command menus and command characters. It is easy to modify and extend this system because it is well modularized. The user programs include almost all the clinical protocols in use today.

  9. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    African Journals Online (AJOL)

    The paper investigates the effect of software cost drivers on project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect software project time schedules in our environment. Such studies are lacking in the recent ...

  10. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
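
    The n-factor combinatorial idea can be sketched for n = 2 (pairwise coverage): keep drawing random parameter vectors until every value pair of every parameter pair has appeared at least once. The parameter names and levels below are invented, and real covering-array generators are far more economical than this greedy loop:

      import itertools
      import random

      params = {"mass": [0.8, 1.0, 1.2], "thrust": [90, 100, 110], "wind": [0, 5]}
      names = list(params)

      # every (parameter pair, value pair) combination that must be covered
      needed = {(a, b, va, vb)
                for a, b in itertools.combinations(names, 2)
                for va in params[a] for vb in params[b]}

      cases = []
      while needed:
          case = {n: random.choice(params[n]) for n in names}
          covered = {(a, b, case[a], case[b])
                     for a, b in itertools.combinations(names, 2)}
          if covered & needed:        # keep the case only if it adds coverage
              cases.append(case)
              needed -= covered
      print(len(cases), "random cases cover all parameter pairs")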

  11. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, a software to automate the post-counting tasks in comparative INAA has been developed that aims to become more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully-automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later on if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
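
    Two of the statistical checks mentioned, the unweighted and the inverse-variance weighted average, are simple to state; a minimal sketch with invented replicate concentration results:

      import math

      values = [12.1, 11.8, 12.4]   # e.g. mg/kg from replicate measurements
      sigmas = [0.3, 0.2, 0.5]      # one-sigma uncertainties

      unweighted = sum(values) / len(values)

      weights = [1.0 / s**2 for s in sigmas]
      weighted = sum(w * v for w, v in zip(weights, values)) / sum(weights)
      weighted_sigma = math.sqrt(1.0 / sum(weights))

      print(f"unweighted mean: {unweighted:.2f}")
      print(f"weighted mean:   {weighted:.2f} +/- {weighted_sigma:.2f}")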

  12. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  13. Conceptual study of the application software manager using the Xlet model in the nuclear fields

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joon Koo; Park, Heui Youn; Koo, In Soo [KAERI, Taejon (Korea, Republic of); Park, Hee Seok; Kim, Jung Seon; Sohn, Chang Ho [Samchang Enterprise Co., Ltd., Taejon (Korea, Republic of)

    2003-10-01

    In order to reduce the cost of software maintenance, including software modification, we suggest an object-oriented program that checks the version of the application program, written in the Java language, together with a technique for executing the downloaded application program over the network using an application manager. To change the traditional scheduler into an application manager, we have adopted the Xlet concept, over the network, in the nuclear field. Usually, an Xlet means a Java application that runs on a digital television receiver. The Java TV Application Program Interface (API) defines an application model called the Xlet application lifecycle. Java applications that use this lifecycle model are called Xlets. The Xlet application lifecycle is compatible with the existing application environment and virtual machine technology. The Xlet application lifecycle model defines the dialog (protocol) between an Xlet and its environment.
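
    The lifecycle dialog can be made concrete with a schematic rendering. The real interface is Java's javax.tv.xlet.Xlet, with initXlet/startXlet/pauseXlet/destroyXlet methods; the Python below is only a sketch of the manager/Xlet protocol, not the actual API:

      # Xlet lifecycle: loaded -> paused -> active -> (paused) -> destroyed
      class XletLike:
          def __init__(self):
              self.state = "loaded"

          def init(self, context):      # cf. initXlet
              self.context, self.state = context, "paused"

          def start(self):              # cf. startXlet
              self.state = "active"

          def pause(self):              # cf. pauseXlet
              self.state = "paused"

          def destroy(self):            # cf. destroyXlet
              self.state = "destroyed"

      class ApplicationManager:
          """Drives the lifecycle dialog in place of a traditional scheduler."""
          def run(self, app):
              app.init(context={"version": "1.0"})  # a version check could go here
              app.start()
              print("running:", app.state)
              app.destroy()

      ApplicationManager().run(XletLike())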

  14. Conceptual study of the application software manager using the Xlet model in the nuclear fields

    International Nuclear Information System (INIS)

    Lee, Joon Koo; Park, Heui Youn; Koo, In Soo; Park, Hee Seok; Kim, Jung Seon; Sohn, Chang Ho

    2003-01-01

    In order to reduce the cost of software maintenance, including software modification, we suggest an object-oriented program that checks the version of the application program, written in the Java language, together with a technique for executing the downloaded application program over the network using an application manager. To change the traditional scheduler into an application manager, we have adopted the Xlet concept, over the network, in the nuclear field. Usually, an Xlet means a Java application that runs on a digital television receiver. The Java TV Application Program Interface (API) defines an application model called the Xlet application lifecycle. Java applications that use this lifecycle model are called Xlets. The Xlet application lifecycle is compatible with the existing application environment and virtual machine technology. The Xlet application lifecycle model defines the dialog (protocol) between an Xlet and its environment.

  15. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and "big data" compatible solutions for the future. However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practises. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  16. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet challenges of rapid demand of computing and storage resources. There has been a significant success in building cloud-enabled applications for many disciplines, ranging from... In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software... technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud...

  17. Software and Database Usage on Metabolomic Studies: Using XCMS on LC-MS Data Analysis

    Directory of Open Access Journals (Sweden)

    Mustafa Celebier

    2014-04-01

    The metabolome is the complete set of small-molecule metabolites to be found in a cell or a single organism, and metabolomics is the scientific study that determines and identifies the chemicals in the metabolome with advanced analytical techniques. Nowadays, the elucidation of the molecular mechanism of any disease by genome analysis and proteome analysis alone is not sufficient; instead, a holistic assessment including metabolomic studies provides rational and accurate results. Metabolite levels in an organism are associated with cellular functions, so determination of metabolite amounts identifies the phenotype of a cell or tissue related to genetic and other variations. Even though the analysis of metabolites for medical diagnosis and therapy has been performed for a long time, studies to improve the analysis methods for metabolite profiling have increased only recently. The applications of metabolomics include the identification of biomarkers, enzyme-substrate interactions, drug-activity studies, metabolic pathway analysis and other studies related to systems biology. The preprocessing and computing of the data obtained from LC-MS, GC-MS, CE-MS and NMR for metabolite profiling help avoid time-consuming manual data analysis and possible random errors during profiling; in addition, such preprocessing allows the identification of low-abundance metabolites which cannot be analyzed by manual processing. Therefore, the use of software and databases for this purpose cannot be ignored. In this study, the software and databases used in metabolomics are briefly presented and the capability of this software for metabolite profiling is evaluated. In particular, the performance of one of the most popular software packages, XCMS, in the evaluation of LC-MS results for metabolomics is overviewed. In the near future, metabolomics with software and database support is estimated to be a routine...

  18. Software Agents Applications Using Real-Time CORBA

    Science.gov (United States)

    Fowell, S.; Ward, R.; Nielsen, M.

    This paper describes current projects being performed by SciSys in the area of the use of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper will address the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say taking account of its real-time and resource-constrained nature.

  19. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in medicine was performed in 1993 with the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adjustment to the most diverse interests. The objective was to develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software application sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple agreement and kappa statistics. The agreements observed between the software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions; agreement for lesions 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
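
    The kappa agreement used here is straightforward to compute; a minimal sketch with an invented 2x2 table of two observers' gradings:

      # Cohen's kappa: chance-corrected agreement between two observers.
      def cohens_kappa(table):
          n = sum(sum(row) for row in table)
          p_observed = sum(table[i][i] for i in range(len(table))) / n
          p_expected = sum(
              (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
              for i in range(len(table))
          )
          return (p_observed - p_expected) / (1 - p_expected)

      table = [[40, 5],    # rows: observer A (lesion / no lesion)
               [3, 52]]    # cols: observer B (lesion / no lesion)
      print(f"kappa = {cohens_kappa(table):.3f}")  # ~0.84, almost perfect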

  20. Application of software quality assurance to a specific scientific code development task

    International Nuclear Information System (INIS)

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development

  1. OpenFLUX: efficient modelling software for 13C-based metabolic flux analysis

    Directory of Open Access Journals (Sweden)

    Nielsen Lars K

    2009-05-01

    Background: The quantitative analysis of metabolic fluxes, i.e., in vivo activities of intracellular enzymes and pathways, provides key information on biological systems in systems biology and metabolic engineering. It is based on a comprehensive approach combining (i) tracer cultivation on 13C substrates, (ii) 13C labelling analysis by mass spectrometry and (iii) mathematical modelling for experimental design, data processing, flux calculation and statistics. Whereas the cultivation and the analytical parts are fairly advanced, a lack of appropriate modelling software solutions for all modelling aspects in flux studies is limiting the application of metabolic flux analysis. Results: We have developed OpenFLUX as a user-friendly, yet flexible software application for small and large scale 13C metabolic flux analysis. The application is based on the new Elementary Metabolite Unit (EMU) framework, significantly enhancing computation speed for flux calculation. From simple notation of metabolic reaction networks defined in a spreadsheet, the OpenFLUX parser automatically generates MATLAB-readable metabolite and isotopomer balances, thus strongly facilitating model creation. The model can be used to perform experimental design, parameter estimation and sensitivity analysis either using the built-in gradient-based search or Monte Carlo algorithms or in user-defined algorithms. Exemplified for a microbial flux study with 71 reactions, 8 free flux parameters and mass isotopomer distributions of 10 metabolites, OpenFLUX allowed the EMU-based model to be compiled automatically from an Excel file containing metabolic reactions and carbon transfer mechanisms, showing its user-friendliness. It reliably reproduced the published data, and optimum flux distributions for the network under study were found quickly. Conclusion: We have developed a fast, accurate application to perform steady-state 13C metabolic flux analysis. OpenFLUX will strongly facilitate and...
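
    The fitting step at the centre of such tools can be sketched (this is not OpenFLUX itself): treat the measured mass-isotopomer fractions as a function of the free flux parameters and minimise the squared residuals. The two-pathway labelling model and the "measured" numbers below are invented for illustration:

      import numpy as np
      from scipy.optimize import least_squares

      def predicted_mid(split):
          # a drastically simplified labelling model: a fraction `split` of the
          # flux passes through a pathway producing M+1 label, the rest M+0
          return np.array([1.0 - split, split])

      measured = np.array([0.62, 0.38])        # toy M+0 / M+1 fractions

      def residuals(p):
          return predicted_mid(p[0]) - measured

      fit = least_squares(residuals, x0=[0.5], bounds=(0.0, 1.0))
      print("estimated flux split:", fit.x[0])  # ~0.38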

  2. Functional analysis and applications

    CERN Document Server

    Siddiqi, Abul Hasan

    2018-01-01

    This self-contained textbook discusses all major topics in functional analysis. Combining classical materials with new methods, it supplies numerous relevant solved examples and problems and discusses the applications of functional analysis in diverse fields. The book is unique in its scope, and a variety of applications of functional analysis and operator-theoretic methods are devoted to each area of application. Each chapter includes a set of problems, some of which are routine and elementary, and some of which are more advanced. The book is primarily intended as a textbook for graduate and advanced undergraduate students in applied mathematics and engineering. It offers several attractive features making it ideally suited for courses on functional analysis intended to provide a basic introduction to the subject and the impact of functional analysis on applied and computational mathematics, nonlinear functional analysis and optimization. It introduces emerging topics like wavelets, Gabor system, inverse pro...

  3. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Directory of Open Access Journals (Sweden)

    Michelle Salmona

    2016-07-01

    In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS). Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data were gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM) was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training, together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  4. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee maximum speed. The primary factor driving the development of SEDA is to guarantee research reproducibility, which is a growing movement among scientists and is highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to producing accurate and fast outputs. Less care has been taken over the graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of models on data, the simulation of catalogs, the identification of sequences and forecast calculation. The peculiarities of the routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
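
    For readers unfamiliar with ETAS, the conditional intensity that such tools estimate and simulate has the standard form lambda(t) = mu + sum over t_i < t of K·exp(alpha·(m_i − m0)) / (t − t_i + c)^p; a minimal sketch, with invented parameter values and a toy catalog:

      import math

      def etas_intensity(t, events, mu=0.2, K=0.05, alpha=1.0,
                         c=0.01, p=1.1, m0=3.0):
          """Occurrence rate at time t given past (time, magnitude) events."""
          rate = mu
          for t_i, m_i in events:
              if t_i < t:
                  rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
          return rate

      events = [(0.0, 5.1), (0.3, 3.8), (1.2, 4.2)]  # (days, magnitude)
      print(f"rate at t = 2.0: {etas_intensity(2.0, events):.3f} events/day")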

  5. Software Application for Supporting the Education of Database Systems

    Science.gov (United States)

    Vágner, Anikó

    2015-01-01

    The article introduces an application which supports the education of database systems, particularly the teaching of SQL and PL/SQL in Oracle Database Management System environment. The application has two parts, one is the database schema and its content, and the other is a C# application. The schema is to administrate and store the tasks and the…

  6. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    Science.gov (United States)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is
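
    As a toy illustration of the spec-to-code idea, three invented spec rows are rendered below as IEC 61131-3 structured text rather than ladder diagrams; the keywords, tag names and output template are hypothetical, and the LCS DSL is certainly richer than this:

      # Each spec row (keyword, end item, value) becomes one generated statement.
      SPEC = [
          ("verify",  "LOX_TANK_PRESS",   "< 35.0"),
          ("command", "VENT_VALVE_1",     "OPEN"),
          ("verify",  "VENT_VALVE_1_POS", "= OPEN"),
      ]

      def generate_st(spec):
          lines = []
          for step, (keyword, item, value) in enumerate(spec, 1):
              if keyword == "verify":
                  lines.append(
                      f"IF NOT ({item} {value}) THEN Fault(step := {step}); END_IF;")
              elif keyword == "command":
                  lines.append(f"{item} := {value};")
          return "\n".join(lines)

      print(generate_st(SPEC))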

  7. System and software safety analysis for the ERA control computer

    International Nuclear Information System (INIS)

    Beerthuizen, P.G.; Kruidhof, W.

    2001-01-01

    The European Robotic Arm (ERA) is a seven-degrees-of-freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA are described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to flow down the safety aspects of the ERA system towards the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and tests, on both subsystem and system level, are the basis for safety verification. A number of examples show the use of the approach and methods used.

  8. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
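
    The interplay of interval reasoning and sampling can be sketched as follows. The target constraint x² + y² ≤ 1 and the coarse box grid stand in for the paper's symbolic path conditions and constraint propagation, and are only illustrative:

      import random

      def box_may_contain(xlo, xhi, ylo, yhi):
          # interval reasoning: smallest possible x^2 + y^2 over the box
          mx = 0.0 if xlo <= 0.0 <= xhi else min(xlo**2, xhi**2)
          my = 0.0 if ylo <= 0.0 <= yhi else min(ylo**2, yhi**2)
          return mx + my <= 1.0

      GRID, N = 8, 20000
      kept = []
      for i in range(GRID):            # split [-2,2]^2 into GRID x GRID boxes
          for j in range(GRID):
              box = (-2 + 4 * i / GRID, -2 + 4 * (i + 1) / GRID,
                     -2 + 4 * j / GRID, -2 + 4 * (j + 1) / GRID)
              if box_may_contain(*box):
                  kept.append(box)     # only sample boxes that may hold solutions

      hits = 0
      for _ in range(N):
          xlo, xhi, ylo, yhi = random.choice(kept)
          x, y = random.uniform(xlo, xhi), random.uniform(ylo, yhi)
          hits += x * x + y * y <= 1.0

      kept_fraction = len(kept) / GRID**2          # all boxes have equal area
      print("solution-space fraction:", hits / N * kept_fraction)  # ~ pi/16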

  9. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  10. Rapid Development of Guidance, Navigation, and Control Core Flight System Software Applications Using Simulink Models

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposal is to demonstrate a new Guidance, Navigation, and Control (GNC) Flight Software (FSW) application development paradigm which takes...

  11. [Development and practice evaluation of blood acid-base imbalance analysis software].

    Science.gov (United States)

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop blood gas and acid-base imbalance analysis software that can systematically, rapidly, accurately and automatically determine the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. The arterial blood gas values [pH, HCO₃⁻, arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. The data were entered into the software for acid-base imbalance judgment; at the same time, the type of acid-base imbalance was determined manually from the same data using the Henderson-Hasselbalch (H-H) compensation formula. The consistency of the judgment results from the software and from manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of the types of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of simple type, 41 cases of mixed type, and 24 cases of triplex type. Compared with manual calculation, the accuracy of the software judgments was 100% for the normal and triplex types, 98.9% for the simple type and 78.0% for the mixed type, with a total accuracy of 95.5%. The Kappa value between software and manual judgment was 0.935, P=0.000, demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than for manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment, with the characteristics of being rapid, accurate and convenient; it can improve the work efficiency and quality of clinical doctors and has great...
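
    The core of such a judgment can be sketched with textbook thresholds: the Henderson-Hasselbalch relationship pH = 6.1 + log10(HCO₃⁻ / (0.03 × PaCO₂)) provides a consistency check, and pH together with HCO₃⁻ suggests the primary disturbance. Real compensation rules, and the software described above, go much further than this sketch:

      import math

      def classify(ph, hco3, paco2):
          # H-H consistency check: the expected pH for the measured values
          expected_ph = 6.1 + math.log10(hco3 / (0.03 * paco2))
          if ph < 7.35:
              primary = "metabolic acidosis" if hco3 < 22 else "respiratory acidosis"
          elif ph > 7.45:
              primary = "metabolic alkalosis" if hco3 > 26 else "respiratory alkalosis"
          else:
              primary = "no primary disturbance detected"
          return primary, expected_ph

      primary, expected = classify(ph=7.28, hco3=15.0, paco2=32.0)
      print(primary, f"(H-H check: expected pH {expected:.2f})")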

  12. An Analysis of Related Software Cycles Among Organizations, People and the Software Industry

    National Research Council Canada - National Science Library

    Moore, Robert; Adams, Brady

    2008-01-01

    This thesis intends to explore the moderating factors of these three distinct and disjointed cycles and propose courses of action towards mitigating various issues and problems inherent in the software upgrade process...

  13. User Guide for the Plotting Software for the Los Alamos National Laboratory Nuclear Weapons Analysis Tools Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Cleland, Timothy James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-02

    The Los Alamos National Laboratory Plotting Software for the Nuclear Weapons Analysis Tools is a Java™ application based upon the open-source library JFreeChart. The software provides a capability for plotting data on graphs with a rich variety of display options, while allowing viewer interaction via graph manipulation and scaling to best view the data. The graph types include XY plots, date XY plots, bar plots and histogram plots.

  14. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    Science.gov (United States)

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005 to 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  15. A directory of computer software applications: energy. Report for 1974--1976

    International Nuclear Information System (INIS)

    Grooms, D.W.

    1977-04-01

    The computer programs or the computer program documentation cited in this directory have been developed for a variety of applications in the field of energy. The cited computer software includes applications in solar energy, petroleum resources, batteries, electrohydrodynamic generators, magnetohydrodynamic generators, natural gas, nuclear fission, nuclear fusion, hydroelectric power production, and geothermal energy. The computer software cited has been used for simulation and modeling, calculations of future energy requirements, calculations of energy conservation measures, and computations of economic considerations of energy systems

  16. Handbook of software quality assurance techniques applicable to the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic.

  17. Handbook of software quality assurance techniques applicable to the nuclear industry

    International Nuclear Information System (INIS)

    Bryant, J.L.; Wilburn, N.P.

    1987-08-01

    Pacific Northwest Laboratory is conducting a research project to recommend good engineering practices in the application of 10 CFR 50, Appendix B requirements to assure quality in the development and use of computer software for the design and operation of nuclear power plants for NRC and industry. This handbook defines the content of a software quality assurance program by enumerating the techniques applicable. Definitions, descriptions, and references where further information may be obtained are provided for each topic

  18. Excel2Genie: A Microsoft Excel application to improve the flexibility of the Genie-2000 Spectroscopic software.

    Science.gov (United States)

    Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter

    2014-12-01

    Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data and visualize them. The combination of Genie-2000 with Excel2Genie results in remarkably increased flexibility and the possibility to carry out repetitive data acquisitions, even with changing parameters, and more sophisticated analysis. The developed software package comprises three worksheets, which display parameters and results of data acquisition, data analysis and mathematical operations carried out on the measured gamma spectra, while also allowing control of these processes. Excel2Genie is freely available to assist gamma spectrum measurements and data evaluation by interested Canberra users. With access to the Visual Basic for Applications (VBA) source code of this application, users are enabled to modify the developed interface according to their intentions.

  19. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    Science.gov (United States)

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  20. A Web Based Financial and Accounting Software Application

    Directory of Open Access Journals (Sweden)

    Doru E. TILIUTE

    2010-01-01

    Client-server applications have become more attractive in comparison with their desktop-type counterparts due to some incontestable advantages. Among client-server applications, some use the Web environment, providing full access from anywhere and at any time to all application features. The present work presents the first results in the achievement of a web-based financial and accounting application using open-source technologies and programming languages (Apache, MySQL, PHP and JavaScript).

  1. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  2. Applications of Cerius2, molecular simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez G, M E; Perez A, M; Gutierrez W, C E [ININ, 52750 La Marquesa, Estado de Mexico (Mexico)

    2007-07-01

    Most investigations have a theoretical underpinning based on molecular simulation. The area of application of molecular simulation is very wide; in the Materials Technology Department, assigned to the Applied Sciences Management, problems concerning metallic nanostructures, glasses, interfaces and molecules have been treated in order to support and explain some of the experimental results. Energy calculations are carried out to determine minimum-energy structures and subsequently to calculate some of their properties, as well as to simulate electron microscopy images and X-ray diffraction patterns. (Author)

  3. Unit Testing for the Application Control Language (ACL) Software

    Science.gov (United States)

    Heinich, Christina Marie

    2014-01-01

    In the software development process, code needs to be tested before it can be packaged for release in order to make sure the program actually does what it says is supposed to happen as well as to check how the program deals with errors and edge cases (such as negative or very large numbers). One of the major parts of the testing process is unit testing, where you test specific units of the code to make sure each individual part of the code works. This project is about unit testing many different components of the ACL software and fixing any errors encountered. To do this, mocks of other objects need to be created and every line of code needs to be exercised to make sure every case is accounted for. Mocks are important to make because it gives direct control of the environment the unit lives in instead of attempting to work with the entire program. This makes it easier to achieve the second goal of exercising every line of code.
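
    What mocking looks like in miniature, here using Python's unittest.mock rather than whatever framework the ACL project actually uses; the telemetry reader and the altitude_alarm function under test are invented:

      import unittest
      from unittest.mock import Mock

      def altitude_alarm(telemetry, threshold=-100.0):
          """Unit under test: alarms on implausibly negative altitude readings."""
          return telemetry.read("altitude") < threshold

      class AltitudeAlarmTest(unittest.TestCase):
          def test_alarm_on_extreme_negative_value(self):
              telemetry = Mock()                  # stands in for real hardware
              telemetry.read.return_value = -1e9  # edge case: huge negative number
              self.assertTrue(altitude_alarm(telemetry))
              telemetry.read.assert_called_once_with("altitude")

          def test_no_alarm_on_nominal_value(self):
              telemetry = Mock()
              telemetry.read.return_value = 120.0
              self.assertFalse(altitude_alarm(telemetry))

      if __name__ == "__main__":
          unittest.main()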

  4. Application of parallelized software architecture to an autonomous ground vehicle

    Science.gov (United States)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

    This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made it difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks - motor control, navigation, sensor data collection, etc. into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used last year, two frames can be acquired and processed in 70ms. With all these improvements, Q placed 2nd in the autonomous challenge.

  5. A Software Data Transport Framework for Trigger Applications on Clusters

    CERN Document Server

    Steinbeck, T M; Tilsner, H; Steinbeck, Timm M.; Lindenstruth, Volker; Tilsner, Heinz

    2003-01-01

    In the future ALICE heavy ion experiment at CERN's Large Hadron Collider input data rates of up to 25 GB/s have to be handled by the High Level Trigger (HLT) system, which has to scale them down to at most 1.25 GB/s before being written to permanent storage. The HLT system that is being designed to cope with these data rates consists of a large PC cluster, up to the order of a 1000 nodes, connected by a fast network. For the software that will run on these nodes a flexible data transport and distribution software framework has been developed. This framework consists of a set of separate components, that can be connected via a common interface, allowing to construct different configurations for the HLT, that are even changeable at runtime. To ensure a fault-tolerant operation of the HLT, the framework includes a basic fail-over mechanism that will be further expanded in the future, utilizing the runtime reconnection feature of the framework's component interface. First performance tests show very promising res...

  6. ATLAS tile calorimeter cesium calibration control and analysis software

    International Nuclear Information System (INIS)

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N

    2008-01-01

    An online control system to calibrate and monitor ATLAS Barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system an online software has been developed, using ATLAS TDAQ components like DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components like DDC (DCS to DAQ Connection), to connect to PVSS-based slow control systems of Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on Python language, is used to handle all the calibration and monitoring processes from hardware perspective to final data storage, including various abnormal situations. A QT based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. Performance of the system and first experience from the ATLAS pit are presented

  7. Phenomenology and Qualitative Data Analysis Software (QDAS: A Careful Reconciliation

    Directory of Open Access Journals (Sweden)

    Brian Kelleher Sohn

    2017-01-01

    An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform processes aided by QDAS: disaggregation and recontextualization of texts. Other phenomenologists exemplify DAVIDSON and DI GREGORIO's observation that arguments against QDAS often identify problems more closely related to the researchers than to QDAS. But the concerns about technology of McLUHAN (2003 [1964]), HEIDEGGER (2008 [1977]), and FLUSSER (2013) cannot be ignored. In this conceptual article I answer the questions of phenomenologists and the call of QDAS methodologists to describe how I used QDAS to carry out a phenomenological study, in order to guide others who choose to reconcile the use of software to assist their research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701142

  8. ATLAS tile calorimeter cesium calibration control and analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N [Institute for High Energy Physics, Protvino 142281 (Russian Federation)], E-mail: Oleg.Solovyanov@ihep.ru

    2008-07-01

    An online control system to calibrate and monitor ATLAS Barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system an online software has been developed, using ATLAS TDAQ components like DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and other components like DDC (DCS to DAQ Connection), to connect to PVSS-based slow control systems of Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on Python language, is used to handle all the calibration and monitoring processes from hardware perspective to final data storage, including various abnormal situations. A QT based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. Performance of the system and first experience from the ATLAS pit are presented.

  9. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of an advanced vessel analysis software for diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, the time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good and very good sensitivity values, respectively, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers; the time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.

  10. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    Energy Technology Data Exchange (ETDEWEB)

    Tsiflikas, Ilias, E-mail: ilias.tsiflikas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Biermann, Christina, E-mail: christina.biermann@siemens.com [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Siemens AG, Siemens Healthcare Consulting, Allee am Röthelheimpark 3A, 91052 Erlangen (Germany); Thomas, Christoph, E-mail: christoph.thomas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Ketelsen, Dominik, E-mail: dominik.ketelsen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Claussen, Claus D., E-mail: claus.claussen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Heuschmid, Martin, E-mail: martin.heuschmid@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2012-09-15

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of an advanced vessel analysis software for diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, the time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good and very good sensitivity values, respectively, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods, without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers; the time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.

  11. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Problems, tasks and processes of educational data mining are considered in this article. The objective is to create a fundamentally new information system for the university using the results of educational data analysis. One of the functions of such a system is knowledge extraction from the data accumulated during operation. The creation of a national system of this type is an iterative and time-consuming process requiring preliminary studies and incremental prototyping of modules. The novelty of such systems means that there is no established development methodology to draw on; for this purpose, a number of experiments were carried out in order to collect data, choose appropriate methods for the study and interpret them. As a result of the experiments, the authors had access to the data sources available in the information environment of the home university. The data were taken from semester performance records obtained from the information system of the training department of the Institute of IT MTU MIREA, from the results of the independent work of students, and from specially designed Google forms. To automate the collection of information and the analysis of educational data, an experimental software package was created. As the development methodology for the experimental software complex, a decision was made to use the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI). The program implementation of the complex is described in detail, conclusions are given about the availability of the data sources used, and conclusions are drawn about the prospects for further development.

  12. Transient thermal-hydraulic characteristics analysis software for PWR nuclear power systems

    International Nuclear Information System (INIS)

    Wu Yingwei; Zhuang Chengjun; Su Guanghui; Qiu Suizheng

    2010-01-01

    A point reactor neutron kinetics model, a two-phase drift-flux U-tube steam generator model, an advanced non-equilibrium three-region pressurizer model, and a passive emergency core decay heat removal system model are adopted in this paper to develop a computerized analysis code for PWR transient thermal-hydraulic characteristics, written in Compaq Visual Fortran 6.0. Visual input, real-time processing and dynamic visualization output are achieved with Microsoft Visual Studio .NET. The reliability of the software has been verified against RELAP5, and the verification results show that the software offers high calculation precision and speed, a modern interface, rich functionality and good operability. The software was applied to calculate transient accident conditions for QSNP, and the analysis results are significant for practical engineering applications. (authors)
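
    The point kinetics model named above has a standard mathematical form. The following is a minimal sketch of such a solver with six delayed-neutron groups, using typical U-235 thermal parameter values rather than anything from the cited code:

```python
# Minimal point-reactor kinetics sketch (six delayed-neutron groups).
# Parameters are typical U-235 thermal values, not those of the cited software.
import numpy as np
from scipy.integrate import solve_ivp

beta_i = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])
lam = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])  # decay constants (1/s)
beta = beta_i.sum()
Lambda = 1.0e-4                                              # neutron generation time (s)
rho = 0.1 * beta                                             # step reactivity insertion

def rhs(t, y):
    # dn/dt = ((rho - beta)/Lambda) n + sum(lam_i C_i);  dC_i/dt = (beta_i/Lambda) n - lam_i C_i
    n, c = y[0], y[1:]
    dn = (rho - beta) / Lambda * n + np.dot(lam, c)
    dc = beta_i / Lambda * n - lam * c
    return np.concatenate(([dn], dc))

# Start from equilibrium at relative power n = 1.
y0 = np.concatenate(([1.0], beta_i / (lam * Lambda)))
sol = solve_ivp(rhs, (0.0, 10.0), y0, method="LSODA", rtol=1e-8)
print(f"Relative power after 10 s: {sol.y[0, -1]:.3f}")
```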

  13. ICAS-PAT: A Software for Design, Analysis and Validation of PAT Systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    end product qualities. In an earlier article, Singh et al. [Singh, R., Gernaey, K. V., Gani, R. (2009). Model-based computer-aided framework for design of process monitoring and analysis systems. Computers & Chemical Engineering, 33, 22–42] proposed the use of a systematic model and data based...... methodology to design appropriate PAT systems. This methodology has now been implemented into a systematic computer-aided framework to develop a software (ICAS-PAT) for design, validation and analysis of PAT systems. Two supporting tools needed by ICAS-PAT have also been developed: a knowledge base...... (consisting of process knowledge as well as knowledge on measurement methods and tools) and a generic model library (consisting of process operational models). Through a tablet manufacturing process example, the application of ICAS-PAT is illustrated, highlighting as well, the main features of the software....

  14. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools have traditionally been developed in sequential mode, with codes optimized for single-core computing only. However, the increasing complexity of power grid models requires more intensive computation, and traditional simulation tools will soon be unable to meet grid operation requirements. Power system simulation tools therefore need to evolve to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large state estimation problems within one second and achieves a near-linear speedup of 9,800 with 10,000 cores for the contingency analysis application. A performance evaluation is presented to show its effectiveness.
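
    Contingency analysis parallelizes naturally because each outage case is an independent solve, which is what makes near-linear speedup plausible. A minimal sketch of that structure (the per-case solver below is a placeholder, not the paper's implementation):

```python
# Sketch of the embarrassingly parallel structure of contingency analysis:
# each outage case is an independent power-flow solve, so cases can be
# farmed out across cores. solve_case is a dummy stand-in for a real solver.
from multiprocessing import Pool

def solve_case(outage_id: int) -> tuple[int, bool]:
    # Placeholder: run a power flow with branch `outage_id` removed and
    # report whether any post-contingency limit is violated.
    violated = (outage_id % 97 == 0)   # dummy criterion for illustration
    return outage_id, violated

if __name__ == "__main__":
    contingencies = range(10_000)
    with Pool() as pool:
        results = pool.map(solve_case, contingencies, chunksize=100)
    flagged = [cid for cid, bad in results if bad]
    print(f"{len(flagged)} contingencies flagged for review")
```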

  15. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  16. AppNVM: Software-Defined, Application_Driven SSD

    DEFF Research Database (Denmark)

    Bjørling, Matias; Madsen, Jesper; Gonzalez, Javier

    2015-01-01

    -NVM SSDs by installing rules, which define (i) the logical address space exposed to the application, and (ii) how application requests are handled. A controller then transforms those rules and installs them onto the device, enforcing permissions and global policies such as wear-leveling and garbage...

  17. Development of a software application to evaluate the performance and energy losses of grid-connected photovoltaic systems

    International Nuclear Information System (INIS)

    Trillo-Montero, D.; Santiago, I.; Luna-Rodriguez, J.J.; Real-Calvo, R.

    2014-01-01

    Highlights: • Software application to perform an automated analysis of grid-connected PV systems. • It integrates data from all devices registering data on typical PV installations. • Flexible enough to analyze installations with different configurations and components. • An analysis of two grid-connected PV systems located in Andalusia was performed. • Temperature losses in summer months vary between 15% and 25% of energy production. - Abstract: The aim of this paper was to design and develop a software application that enables users to perform an automated analysis of data from the monitoring of grid-connected photovoltaic (PV) systems. This application integrates data from all devices already in operation, such as environmental sensors, inverters and meters, which record information on typical PV installations. This required the development of a Relational Database Management System (RDBMS), consisting of a series of linked databases, enabling all PV system information to be stored, and of software, called S·lar, which automatically migrates all monitoring information to the database and determines standard magnitudes related to the performance and losses of PV installation components at different time scales. A visualization tool, both graphical and numerical, makes access to all of the information a simple task. Moreover, the application enables relationships between parameters and/or magnitudes to be easily established. Furthermore, it can perform a preliminary analysis of the influence of PV installations on the distribution grids into which the produced electricity is injected. The operation of the software application was demonstrated by analyzing the monitoring data of two grid-connected PV installations located in Andalusia, Spain. The monitoring took place from January 2011 to May 2012
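
    One standard magnitude such monitoring software derives is the performance ratio (PR): the final yield divided by the reference yield. A minimal sketch with placeholder input values (the abstract does not state exactly which magnitudes S·lar computes):

```python
# Illustrative calculation (not from the S·lar package itself) of the
# performance ratio PR = (E_AC / P_STC) / (H_POA / G_STC).
def performance_ratio(e_ac_kwh: float, p_stc_kw: float,
                      h_poa_kwh_m2: float, g_stc_kw_m2: float = 1.0) -> float:
    final_yield = e_ac_kwh / p_stc_kw              # kWh produced per kWp installed
    reference_yield = h_poa_kwh_m2 / g_stc_kw_m2   # equivalent full-sun hours
    return final_yield / reference_yield

# Placeholder example: 125 kWh from a 1 kWp array receiving 160 kWh/m2 in a month.
print(f"PR = {performance_ratio(125.0, 1.0, 160.0):.2f}")   # ~0.78
```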

  18. Application Software for the Cabinet Operator Module of the Reactor Protection System

    International Nuclear Information System (INIS)

    Lee, Hyun-Chul; Jung, Hae-Won; Lee, Sung-Jin; Koo, Young-Ho; Kim, Seong-Tae; Kwak, Tae-Kil; Jin, Kyo-Hong

    2006-01-01

    A reactor protection system (RPS) plays the role of generating the reactor trip signal and the engineered safety features (ESF) actuation signal when the monitored plant processes reach predefined limits. A Korean project group, called KNICS (Korean Nuclear I and C System), is developing a new digitalized RPS and the Cabinet Operator Module (COM) of the RPS, which is used by equipment operators for RPS integrity testing and monitoring. A flat panel display (FPD) with touch screen capability is provided as the main user interface for the RPS. This paper presents the application software developed for the COM FPD. Equipment operators can monitor the status of the RPS and carry out various tests to verify system functions by means of the application software. A qualified hardware and software development environment was used to develop the application software

  19. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of the collected data could prove to be a great asset to inspectors, because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype of an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort

  20. I-Structure software cache for distributed applications

    Directory of Open Access Journals (Sweden)

    Alfredo Cristóbal Salas

    2004-01-01

    Full Text Available In this article, we describe the I-Structure software cache for distributed memory environments (D-ISSC), which takes advantage of data locality while maintaining the latency-tolerance capability of I-Structure memory systems. The programming facilities of MPI programs hide synchronization problems from the programmer. Our experimental evaluation using a benchmark suite indicates that PC clusters with I-Structure and its D-ISSC caching mechanism are more robust. The system can speed up both regular and irregular communication-intensive applications.

  1. Calculating earth dam seepage using HYDRUS software applications

    Directory of Open Access Journals (Sweden)

    Jakub Nieć

    2017-06-01

    Full Text Available This paper presents simulations of water seepage within and under the embankment dam of the Lake Kowalskie reservoir. The aim of the study was to compare seepage calculation results obtained using analytical and numerical methods. In April 1985, after the first filling of the reservoir to normal storage levels, water leakage was observed at the base of the escarpment on the air side of the dam. In order to control the seepage flow, drainage was installed and additional piezometers were placed. To explain the causes of the increased pressure in the aquifer under the dam, a simplified calculation of filtration was performed in May 1985. Now, on the basis of archived data from the Department of Hydraulic and Sanitary Engineering and using the 3D HYDRUS STANDARD software, the conditions of seepage under the dam have been recreated and recalculated. Piezometric pressure was investigated for three variants of drainage, including drainage before and after modernization.
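
    A classical analytical estimate of the kind such numerical results are compared against is the Dupuit approximation for unconfined seepage. A minimal sketch with placeholder values (the abstract does not state which analytical formula the authors used):

```python
# Dupuit approximation for seepage per unit length of an earth dam:
# q = k * (h1^2 - h2^2) / (2 * L). All input values are placeholders.
def dupuit_seepage(k: float, h1: float, h2: float, length: float) -> float:
    """Seepage rate q [m^2/s per metre of dam] for hydraulic conductivity k [m/s],
    upstream/downstream water depths h1, h2 [m] and seepage path length L [m]."""
    return k * (h1**2 - h2**2) / (2.0 * length)

q = dupuit_seepage(k=1e-6, h1=8.0, h2=1.5, length=40.0)
print(f"q = {q:.2e} m^2/s per metre of dam")
```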

  2. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  3. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Directory of Open Access Journals (Sweden)

    Marković Nemanja

    2014-01-01

    Full Text Available Reinforced concrete (RC) is characterized by considerable inhomogeneity, resulting from the material characteristics of the concrete, and by quasi-brittle behavior at failure. These and other phenomena require the introduction of material nonlinearity in the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is given of methods such as Concrete Damage Plasticity (CDP), Concrete Smeared Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame, applying the CDP method to model the material nonlinearity of concrete. We analyzed damage zones, crack propagation and the load-deflection ratio.

  4. The Database and Data Analysis Software of Radiation Monitoring System

    International Nuclear Information System (INIS)

    Wang Weizhen; Li Jianmin; Wang Xiaobing; Hua Zhengdong; Xu Xunjiang

    2009-01-01

    The Shanghai Synchrotron Radiation Facility (SSRF) is a third-generation light source being built in China, comprising a 150 MeV injector, a 3.5 GeV booster, a 3.5 GeV storage ring and a number of beamline stations. The data are fetched by the monitoring computer from collecting modules at the front end and saved in a MySQL database on the managing computer. The data analysis software is written in Python, a scripting language, to query, summarize and plot the data of a given monitoring channel during a given period and export the results to an external file. In addition, warning events can be queried separately. The website for historical and real-time data inquiry and plotting is written in PHP. (authors)
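
    A hypothetical sketch of the query, plot and export workflow described above; the table name, columns and credentials are assumptions for illustration, not the SSRF schema:

```python
# Sketch: fetch one monitoring channel for a time window from MySQL,
# plot it and export to a file. Schema and credentials are hypothetical.
import pymysql
import matplotlib.pyplot as plt

conn = pymysql.connect(host="localhost", user="monitor",
                       password="secret", database="radmon")
sql = ("SELECT ts, dose_rate FROM channel_data "
       "WHERE channel_id = %s AND ts BETWEEN %s AND %s ORDER BY ts")
with conn.cursor() as cur:
    cur.execute(sql, (7, "2009-01-01", "2009-01-31"))
    rows = cur.fetchall()
conn.close()

times, values = zip(*rows)
plt.plot(times, values)
plt.xlabel("time")
plt.ylabel("dose rate")
plt.savefig("channel7_jan2009.png")   # export to an external file
```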

  5. Image analysis software for following progression of peripheral neuropathy

    Science.gov (United States)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1-4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, including diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out on a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.

  6. The review of the modeling methods and numerical analysis software for nanotechnology in material science

    Directory of Open Access Journals (Sweden)

    SMIRNOV Vladimir Alexeevich

    2014-10-01

    Full Text Available Due to the high demand for building materials with a universal set of properties which extend their application area, research efforts are focusing on nanotechnology in materials science. A rational combination of theoretical studies, mathematical modeling and simulation can reduce resource and time consumption when nanomodified materials are being developed. The development of a composite material is based on the principles of system analysis, which requires the determination of criteria and a subsequent classification of modeling methods. In this work the criteria of spatial scale, dominant type of interaction and heterogeneity are used for such a classification. The presented classification became a framework for the analysis of methods and software which can be applied to the development of building materials. For each of the selected spatial levels, from the atomistic level to the macrostructural level of a constructional coarse-grained composite, existing theories, modeling algorithms and tools have been considered. At the level of the macrostructure, which is formed under the influence of gravity and exterior forces, probabilistic and geometrical methods can be applied to study the resulting structure. The existing models are suitable for packing density analysis and the solution of percolation problems at the macroscopic level, but there are still no software tools which could be applied in nanotechnology to carry out systematic investigations. At the microstructure level it is possible to use the particle method along with probabilistic and statistical methods to explore structure formation, but available software tools are only partially suitable for numerical analysis of microstructure models. Therefore, modeling of the microstructure is rather complicated; the model has to include a potential of pairwise interaction. After the model has been constructed and the parameters of the pairwise potential have been determined, many software packages for solution of ordinary

  7. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers using the Windows, Macintosh, Linux, or UNIX operating systems. Many user interface features were specifically designed so that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and the graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  8. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation must be performed for the development of PLC-based safety-critical systems, software safety analysis is also considered throughout the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems. Fault tree analysis also has the most intuitive notation and makes both qualitative and quantitative analyses possible. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, we can analyze the safety of software on the basis of fault tree synthesis. (authors)
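
    For illustration, a fault tree reduces to basic events combined through AND/OR gates. A toy representation and qualitative evaluation follows (this is a generic sketch, not the authors' template):

```python
# Toy fault tree: gates combine basic events (strings) or sub-gates.
# Qualitative evaluation: does the top event occur for a given failure set?
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Gate:
    kind: str                                    # "AND" or "OR"
    children: List[Union["Gate", str]] = field(default_factory=list)

def occurs(node: Union[Gate, str], failed: set) -> bool:
    """Recursively evaluate whether the event at `node` occurs."""
    if isinstance(node, str):
        return node in failed
    results = (occurs(c, failed) for c in node.children)
    return all(results) if node.kind == "AND" else any(results)

# Top event: output fails if the CPU fails OR both redundant buses fail.
top = Gate("OR", ["cpu_fault", Gate("AND", ["bus_a_fault", "bus_b_fault"])])
print(occurs(top, {"bus_a_fault"}))                  # False
print(occurs(top, {"bus_a_fault", "bus_b_fault"}))   # True
```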

  9. Exploring Dynamic User–Interface in Achieving Software Application ...

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-09-01

    Sep 1, 2013 ... rudiments of user interface in application development may be a difficult and time-consuming task, but there ... screen. Such an interface is described as menu-driven. (3) Graphical user interface (GUI): user .... using green colour.

  10. Software Applications on the Peregrine System | High-Performance Computing

    Science.gov (United States)

    General Algebraic Modeling System (GAMS), statistics and analysis: high-level modeling system for mathematical programming. Gurobi Optimizer, statistics and analysis: solver for mathematical programming. LAMMPS, chemistry and ... reactivities, and vibrational, electronic and NMR spectra. R Statistical Computing Environment, statistics and ...

  11. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    Science.gov (United States)

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of the complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships and microsatellite-based MHC typing data, as well as MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and implements an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA- and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo.

  12. CASSys: an integrated software-system for the interactive analysis of ChIP-seq data

    Directory of Open Access Journals (Sweden)

    Alawi Malik

    2011-06-01

    Full Text Available The mapping of DNA-protein interactions is crucial for a full understanding of transcriptional regulation. Chromatin immunoprecipitation followed by massively parallel sequencing (ChIP-seq) has become the standard technique for analyzing these interactions on a genome-wide scale. We have developed a software system called CASSys (ChIP-seq data Analysis Software System) spanning all steps of ChIP-seq data analysis. It supersedes the laborious application of several single command line tools. CASSys provides functionality ranging from quality assessment and control of short reads, over the mapping of reads against a reference genome (read mapping) and the detection of enriched regions (peak detection), to various follow-up analyses. The latter are accessible via a state-of-the-art web interface and can be performed interactively by the user. The follow-up analyses allow for flexible user-defined association of putative interaction sites with genes, visualization of their genomic context with an integrated genome browser, the detection of putative binding motifs, the identification of over-represented Gene Ontology terms, pathway analysis and the visualization of interaction networks. The system is client-server based, accessible via a web browser and does not require any software installation on the client side. To demonstrate CASSys's functionality we used the system for the complete data analysis of a publicly available ChIP-seq study that investigated the role of the transcription factor estrogen receptor-α in breast cancer cells.
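
    One of the follow-up analyses named above, associating peaks with genes, amounts to interval overlap with a flanking window. A naive illustrative sketch (the coordinates and window size are invented, and this is not CASSys's implementation):

```python
# Naive peak-to-gene association by interval overlap with a flanking window.
def associate(peaks, genes, window=5000):
    """Pair each peak (chrom, start, end) with genes (chrom, start, end, name)
    whose span, extended by `window` bases on each side, overlaps the peak."""
    hits = []
    for pc, ps, pe in peaks:
        for gc, gs, ge, name in genes:
            if pc == gc and ps <= ge + window and pe >= gs - window:
                hits.append((pc, ps, pe, name))
    return hits

peaks = [("chr1", 10_200, 10_550), ("chr2", 44_000, 44_300)]
genes = [("chr1", 9_000, 12_000, "GENE_A"), ("chr2", 80_000, 95_000, "GENE_B")]
print(associate(peaks, genes))   # only the chr1 peak is associated
```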

  13. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    Science.gov (United States)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
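
    Ladder logic itself is graphical, so a textual sketch can only gesture at the translation idea; the toy translator below emits IEC 61131-3 instruction-list mnemonics (ladder logic's textual sibling) from a simplified boolean spec format invented for this example, not KSC's actual notation:

```python
# Toy spec-to-PLC-code translator: one rung per output, conditions ANDed.
# The spec format and this mapping are illustrative assumptions only.
SPEC = {
    "valve_open": ["start_cmd", "NOT interlock_trip"],   # ANDed input conditions
    "alarm":      ["interlock_trip"],
}

def instr(term: str, op: str, negated_op: str) -> str:
    """Render one condition as an IL instruction, honouring a 'NOT ' prefix."""
    if term.startswith("NOT "):
        return f"{negated_op} {term[4:]}"
    return f"{op} {term}"

def to_instruction_list(spec: dict) -> str:
    lines = []
    for output, terms in spec.items():
        lines.append(instr(terms[0], "LD", "LDN"))       # load first condition
        for term in terms[1:]:
            lines.append(instr(term, "AND", "ANDN"))     # AND in the rest
        lines.append(f"ST {output}")                     # store result to output
    return "\n".join(lines)

print(to_instruction_list(SPEC))
```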

  14. Wavelet analysis and its applications an introduction

    CERN Document Server

    Yajnik, Archit

    2013-01-01

    "Wavelet analysis and its applications: an introduction" demonstrates the consequences of Fourier analysis and introduces the concept of wavelet followed by applications lucidly. While dealing with one dimension signals, sometimes they are required to be oversampled. A novel technique of oversampling the digital signal is introduced in this book alongwith necessary illustrations. The technique of feature extraction in the development of optical character recognition software for any natural language alongwith wavelet based feature extraction technique is demonstrated using multiresolution analysis of wavelet in the book.

  15. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and also generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices, as other commercial programs with similar functionality do. Results PyElph is entirely implemented in Python, which is a very popular programming language in the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weight computation based on a molecular weight marker, band matching and, finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD

  16. Software use cases to elicit the software requirements analysis within the ASTRI project

    Science.gov (United States)

    Conforti, Vito; Antolini, Elisa; Bonnoli, Giacomo; Bruno, Pietro; Bulgarelli, Andrea; Capalbi, Milvia; Fioretti, Valentina; Fugazza, Dino; Gardiol, Daniele; Grillo, Alessandro; Leto, Giuseppe; Lombardi, Saverio; Lucarelli, Fabrizio; Maccarone, Maria Concetta; Malaguti, Giuseppe; Pareschi, Giovanni; Russo, Federico; Sangiorgi, Pierluca; Schwarz, Joseph; Scuderi, Salvatore; Tanci, Claudio; Tosti, Gino; Trifoglio, Massimo; Vercellone, Stefano; Zanmar Sanchez, Ricardo

    2016-07-01

    The Italian National Institute for Astrophysics (INAF) is leading the Astrofisica con Specchi a Tecnologia Replicante Italiana (ASTRI) project, whose main purpose is the realization of small-sized telescopes (SSTs) for the Cherenkov Telescope Array (CTA). The first goal of the ASTRI project has been the development and operation of an innovative end-to-end telescope prototype using a dual-mirror optical configuration (SST-2M), equipped with a camera based on silicon photo-multipliers and very fast read-out electronics. The ASTRI SST-2M prototype has been installed in Italy at the INAF "M.G. Fracastoro" Astronomical Station located at Serra La Nave, on Mount Etna, Sicily. This prototype will be used to test several mechanical, optical, control hardware and software solutions which will be used in the ASTRI mini-array, comprising nine telescopes proposed to be placed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort led by INAF and carried out by Italy, Brazil and South Africa. We present here the use cases, through UML (Unified Modeling Language) diagrams and text details, that describe the functional requirements of the software that will manage the ASTRI SST-2M prototype, and the lessons learned thanks to these activities. We intend to adopt the same approach for the Mini Array Software System that will manage the ASTRI mini-array operations. Use cases are of importance for the whole software life cycle; in particular they provide valuable support to the validation and verification activities. Following the iterative development approach, which breaks down the software development into smaller chunks, we have analysed the requirements, developed, and then tested the code in repeated cycles. The use case technique allowed us to formalize the problem through user stories that describe how the user procedurally interacts with the software system. Through the use cases we improved the communication among team members, fostered

  17. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in design practice in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  18. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in design practice in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  19. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

    Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with a significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, is yet to create its impact and still remains largely in the domain of a few expert users. One of the key reasons why PI has not taken off is that PI tools have not become integral components of standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other, more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  20. A Software Application for Managing Graduates and Graduation Diploma in the University

    Directory of Open Access Journals (Sweden)

    Mîzgaciu C.

    2009-12-01

    Full Text Available This paper presents the structure, mode of organization and storage of the data contained in a graduation diploma. The graduation diploma is of three types, based on the three main cycles of study (bachelor, master, and doctoral degree). We analyze the information that is included in the graduation diploma and how it can be managed from the quality point of view. The graduation diploma is printed once, on the form elaborated by our Ministry of Education, Research and Innovation (MECI); a duplicate can be issued in certain cases. We propose an online application based on a software solution using Apache, PHP and MySQL.

  1. Excel2Genie: A Microsoft Excel application to improve the flexibility of the Genie-2000 Spectroscopic software

    International Nuclear Information System (INIS)

    Forgács, Attila; Balkay, László; Trón, Lajos; Raics, Péter

    2014-01-01

    Excel2Genie, a simple and user-friendly Microsoft Excel interface, has been developed for the Genie-2000 Spectroscopic Software of Canberra Industries. This Excel application can directly control a Canberra Multichannel Analyzer (MCA), process the acquired data and visualize them. The combination of Genie-2000 with Excel2Genie results in remarkably increased flexibility and the possibility to carry out repetitive data acquisitions, even with changing parameters, and more sophisticated analysis. The developed software package comprises three worksheets that display the parameters and results of data acquisition, data analysis and mathematical operations carried out on the measured gamma spectra, and at the same time allow control of these processes. Excel2Genie is freely available to assist gamma spectrum measurements and data evaluation by interested Canberra users. With access to the Visual Basic for Applications (VBA) source code of this application, users are able to modify the developed interface according to their intentions. - Highlights: • User-friendly Microsoft Excel interface to the Genie-2000 Spectroscopic software. • Data acquisitions with changing parameters and sophisticated data analysis. • Three worksheets: data acquisition, data analysis and mathematical operations on spectra. • The source code is freely available

  2. Application of an image processing software for quantitative autoradiography

    International Nuclear Information System (INIS)

    Sobeslavsky, E.; Bergmann, R.; Kretzschmar, M.; Wenzel, U.

    1993-01-01

    The present communication deals with the utilization of an image processing device for quantitative whole-body autoradiography, cell counting and also for interpretation of chromatograms. It is shown that the system parameters allow an adequate and precise determination of optical density values. Also shown are the main error sources limiting the applicability of the system. (orig.)

  3. Beehive: A Software Application for Synchronous Collaborative Learning

    Science.gov (United States)

    Turani, Aiman; Calvo, Rafael A.

    2006-01-01

    Purpose: The purpose of this paper is to describe Beehive, a new web application framework for designing and supporting synchronous collaborative learning. Design/methodology/approach: Our web engineering approach integrates educational design expertise into a technology for building tools for collaborative learning activities. Beehive simplifies…

  4. Development of analysis software for radiation time-series data with the use of visual studio 2005

    International Nuclear Information System (INIS)

    Hohara, Sin-ya; Horiguchi, Tetsuo; Ito, Shin

    2008-01-01

    Time-series analysis offers a perspective that conventional analysis methods, such as energy spectroscopy, have not achieved. However, applying time-series analysis to radiation measurements requires considerable software and hardware development effort. By taking advantage of Visual Studio 2005, we developed analysis software, 'ListFileConverter', for the time-series radiation measurement system known as 'MPA-3'. The software is based on a graphical user interface (GUI) architecture that enables us to save a large amount of operation time in the analysis and, moreover, gives easy access to the special file structure of MPA-3 data. In this paper, the structure of ListFileConverter is fully explained, and experimental results for the counting capability of the MPA-3 hardware system and for neutron measurements with our UTR-KINKI reactor are also given. (author)
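
    The core of such list-file conversion is binning event timestamps into a count-rate series. A minimal sketch with synthetic timestamps (the MPA-3 binary layout is proprietary and not parsed here):

```python
# Convert a list-mode event stream (timestamps) into a count-rate time series.
import numpy as np

def count_rate(timestamps_s: np.ndarray, bin_width_s: float = 1.0):
    """Return (bin_centres, counts_per_second) for a sorted event list."""
    edges = np.arange(0.0, timestamps_s.max() + bin_width_s, bin_width_s)
    counts, _ = np.histogram(timestamps_s, bins=edges)
    centres = 0.5 * (edges[:-1] + edges[1:])
    return centres, counts / bin_width_s

rng = np.random.default_rng(0)
events = np.sort(rng.uniform(0.0, 60.0, size=30_000))   # synthetic 60 s run
t, rate = count_rate(events, bin_width_s=0.5)
print(f"mean rate ~ {rate.mean():.0f} counts/s")
```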

  5. The Software Therapist: Usability Problem Diagnosis Through Latent Semantic Analysis

    National Research Council Canada - National Science Library

    Sparks, Randall; Hartson, Rex

    2006-01-01

    The work we report on here addresses the problem of low return on investment in software usability engineering and offers support for usability practitioners in identifying, understanding, documenting...
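
    Latent semantic analysis itself is well defined: embed documents in a low-rank space derived from a term-document matrix and compare them there. A compact sketch applied to invented usability reports (not this tool's implementation):

```python
# LSA sketch: TF-IDF term-document matrix, truncated SVD for the semantic
# space, cosine similarity for retrieval. Reports below are placeholders.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reports = [
    "users cannot find the save button on the toolbar",
    "dialog text is truncated at small window sizes",
    "save command is hidden in a nested menu",
]
query = ["people have trouble locating how to save their work"]

tfidf = TfidfVectorizer().fit(reports + query)
svd = TruncatedSVD(n_components=2, random_state=0)
space = svd.fit_transform(tfidf.transform(reports))   # low-rank document vectors
q = svd.transform(tfidf.transform(query))
best = cosine_similarity(q, space).argmax()
print(reports[best])   # closest previously documented problem
```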

  6. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: II. Algorithms.

    Science.gov (United States)

    Appel, R D; Vargas, J R; Palagi, P M; Walther, D; Hochstrasser, D F

    1997-12-01

    After two generations of software systems for the analysis of two-dimensional electrophoresis (2-DE) images, a third generation of such software packages has recently emerged that combines state-of-the-art graphical user interfaces with comprehensive spot data analysis capabilities. A key characteristic common to most of these software packages is that many of their tools are implementations of algorithms that resulted from research areas such as image processing, vision, artificial intelligence or machine learning. This article presents the main algorithms implemented in the Melanie II 2-D PAGE software package. The applications of these algorithms, embodied as the features of the program, are explained in an accompanying article (R. D. Appel et al.; Electrophoresis 1997, 18, 2724-2734).

  7. User Manual for the Data-Series Interface of the Gr Application Software

    Science.gov (United States)

    Donovan, John M.

    2009-01-01

    This manual describes the data-series interface for the Gr Application software. Basic tasks such as plotting, editing, manipulating, and printing data series are presented. The properties of the various types of data objects and graphical objects used within the application, and the relationships between them also are presented. Descriptions of compatible data-series file formats are provided.

  8. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  9. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Full Text Available Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language, for interoperability at the source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interfaces and mathematical function libraries. Both development environments are multiplatform oriented and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language which supports designing and developing performant and interactive frameworks for general-purpose software solutions through the Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  10. Engineering a large application software project: the controls of the CERN PS accelerator complex

    International Nuclear Information System (INIS)

    Benincasa, G.P.; Daneels, A.; Heymans, P.; Serre, Ch.

    1985-01-01

    The CERN PS accelerator complex has been progressively converted to full computer controls without interrupting its full-time operation (more than 6000 hours per year, with on average not more than 1% of the total down-time due to controls). The application software amounts to 120 man-years and 450,000 instructions; it compares with other large software projects, also outside the accelerator world, e.g. Skylab's ground support software. This paper outlines the application software structure, which takes into account technical requirements and constraints (resulting from the complexity of the process and its operation) as well as economic and managerial ones. It presents the engineering and management techniques used to promote implementation, testing and commissioning within budget, manpower and time constraints, and concludes with the experience gained

  11. Design and Application of the Reconstruction Software for the BaBar Calorimeter

    International Nuclear Information System (INIS)

    Strother, Philip David; Imperial Coll., London

    2006-01-01

    The BaBar high energy physics experiment will be in operation at the PEP-II asymmetric e⁺e⁻ collider in Spring 1999. The primary purpose of the experiment is the investigation of CP violation in the neutral B meson system. The electromagnetic calorimeter forms a central part of the experiment, and new techniques are employed in data acquisition and reconstruction software to maximize the capability of this device. The use of a matched digital filter in the feature extraction in the front end electronics is presented. The performance of the filter in the presence of the expected high levels of soft photon background from the machine is evaluated. The high luminosity of the PEP-II machine and the demands on the precision of the calorimeter require reliable software that allows for increased physics capability. BaBar has selected C++ as its primary programming language and object oriented analysis and design as its coding paradigm. The application of this technology to the reconstruction software for the calorimeter is presented. The design of the systems for clustering, cluster division, track matching, particle identification and global calibration is discussed, with emphasis on the provisions in the design for increased physics capability as levels of understanding of the detector increase. The CP violating channel B⁰ → J/ψ K⁰_S has been studied in the two-lepton, two-π⁰ final state. The contribution of this channel to the evaluation of the angle sin 2β of the unitarity triangle is compared to that from the charged pion final state. An error of 0.34 on this quantity is expected after 1 year of running at design luminosity

  12. Design and Application of the Reconstruction Software for the BaBar Calorimeter

    Energy Technology Data Exchange (ETDEWEB)

    Strother, Philip David; /Imperial Coll., London

    2006-07-07

    The BaBar high energy physics experiment will be in operation at the PEP-II asymmetric e⁺e⁻ collider in Spring 1999. The primary purpose of the experiment is the investigation of CP violation in the neutral B meson system. The electromagnetic calorimeter forms a central part of the experiment, and new techniques are employed in data acquisition and reconstruction software to maximize the capability of this device. The use of a matched digital filter in the feature extraction in the front end electronics is presented. The performance of the filter in the presence of the expected high levels of soft photon background from the machine is evaluated. The high luminosity of the PEP-II machine and the demands on the precision of the calorimeter require reliable software that allows for increased physics capability. BaBar has selected C++ as its primary programming language and object oriented analysis and design as its coding paradigm. The application of this technology to the reconstruction software for the calorimeter is presented. The design of the systems for clustering, cluster division, track matching, particle identification and global calibration is discussed, with emphasis on the provisions in the design for increased physics capability as levels of understanding of the detector increase. The CP violating channel B⁰ → J/ψ K⁰_S has been studied in the two-lepton, two-π⁰ final state. The contribution of this channel to the evaluation of the angle sin 2β of the unitarity triangle is compared to that from the charged pion final state. An error of 0.34 on this quantity is expected after 1 year of running at design luminosity.

  13. The R software fundamentals of programming and statistical analysis

    CERN Document Server

    Lafaye de Micheaux, Pierre; Liquet, Benoit

    2013-01-01

    The contents of The R Software are presented so as to be both comprehensive and easy for the reader to use. Besides its application as a self-learning text, this book can support lectures on R at any level from beginner to advanced. It can serve as a textbook on R for beginners as well as more advanced users, working on Windows, MacOS or Linux. The first part of the book deals with the heart of the R language and its fundamental concepts, including data organization, import and export, various manipulations, documentation, plots, programming and maintenance. The last chapter in this part deals with object-oriented programming as well as interfacing R with C/C++ or Fortran, and contains a section on debugging techniques. This is followed by the second part of the book, which provides detailed explanations on how to perform many standard statistical analyses, mainly in the field of biostatistics. Topics from mathematical and statistical settings that are included are matrix operations, integration, o...

  14. The Program for Climate Model Diagnosis and Intercomparison (PCMDI) Software Development: Applications, Infrastructure, and Middleware/Networks

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-06-30

    The status of and future plans for the Program for Climate Model Diagnosis and Intercomparison (PCMDI) hinge on software that PCMDI is either currently distributing or plans to distribute to the climate community in the near future. These software products include standard conventions, national and international federated infrastructures, and community analysis and visualization tools. This report also mentions other secondary software, not necessarily led by or developed at PCMDI, to provide a complete picture of the overarching applications, infrastructures, and middleware/networks. Much of the software described anticipates the use of future technologies envisioned over the next one to ten years. These technologies, together with the software, will be the catalyst required to address extreme-scale data warehousing, scalability issues, and service-level requirements for a diverse set of well-known projects essential for predicting climate change. These tools, unlike the static analysis tools of the past, will support the co-existence of many users in a productive, shared virtual environment. This advanced technological world, driven by extreme-scale computing and the data it generates, will increase scientists' productivity, exploit national and international relationships, and push research to new levels of understanding.

  15. HistFitter software framework for statistical data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Baak, M. [CERN, Geneva (Switzerland); Besjes, G.J. [Radboud University Nijmegen, Nijmegen (Netherlands); Nikhef, Amsterdam (Netherlands); Cote, D. [University of Texas, Arlington (United States); Koutsman, A. [TRIUMF, Vancouver (Canada); Lorenz, J. [Ludwig-Maximilians-Universitaet Muenchen, Munich (Germany); Excellence Cluster Universe, Garching (Germany); Short, D. [University of Oxford, Oxford (United Kingdom)

    2015-04-15

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  16. HistFitter software framework for statistical data analysis

    International Nuclear Information System (INIS)

    Baak, M.; Besjes, G.J.; Cote, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fit to data and interpreted with statistical tests. Internally HistFitter uses the statistics packages RooStats and HistFactory. A key innovation of HistFitter is its design, which is rooted in analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with multiple models at once that describe the data, HistFitter introduces an additional level of abstraction that allows for easy bookkeeping, manipulation and testing of large collections of signal hypotheses. Finally, HistFitter provides a collection of tools to present results with publication quality style through a simple command-line interface. (orig.)

  17. Conceptions of Software Development by Project Managers: A Study of Managing the Outsourced Development of Software Applications for United States Federal Government Agencies

    Science.gov (United States)

    Eisen, Daniel

    2013-01-01

    This study explores how project managers, working for private federal IT contractors, experience and understand managing the development of software applications for U.S. federal government agencies. Very little is known about how they manage their projects in this challenging environment. Software development is a complex task and only grows in…

  18. Newly Developed Software Application for Multiple Access Process Planning

    Directory of Open Access Journals (Sweden)

    Katarina Monkova

    2014-11-01

    The purchase of a complex system for computer aided process planning (CAPP) can be expensive for small and some medium-sized plants, sometimes an inaccessible investment with a long payback period. Based on this fact and on their experience with Eastern European plants, the authors decided to design a new database application suitable for holding production, stock, and economic data, as well as for processing and exploiting these data within the manufacturing process. The application can also be used to create a process plan according to selected criteria, technological documentation, and NC programs. It is based on the theory of a multivariant approach to computer aided plan generation. Its fundamental features, the internal mathematical structure, and the new code system of processed objects were prepared by the authors. Verification of the designed information system in real practice has shown that it enables a cost and production time reduction of about 30% and decreases input material assortment variability.

  19. Software Attribution for Geoscience Applications in the Computational Infrastructure for Geodynamics

    Science.gov (United States)

    Hwang, L.; Dumit, J.; Fish, A.; Soito, L.; Kellogg, L. H.; Smith, M.

    2015-12-01

    Scientific software is largely developed by individual scientists and represents a significant intellectual contribution to the field. As the scientific culture and funding agencies move towards an expectation that software be open-source, there is a corresponding need for mechanisms to cite software, both to provide credit and recognition to developers and to aid in the discoverability of software and in scientific reproducibility. We assess the geodynamic modeling community's current citation practices by examining more than 300 predominantly self-reported publications from the past 5 years that utilize scientific software available through the Computational Infrastructure for Geodynamics (CIG). Preliminary results indicate that authors cite and attribute software by citing (in rank order) peer-reviewed scientific publications, a user's manual, and/or a paper describing the software code. Attributions may be found directly in the text, in acknowledgements, in figure captions, or in footnotes. What is considered citable varies widely. Citations predominantly lack software version numbers or persistent identifiers for finding the software package. Versioning may be implied through reference to a versioned user manual. Authors sometimes report the code features used and whether they have modified the code. As an open-source community, CIG requests that researchers contribute their modifications to the repository. However, such modifications may not be contributed back to a repository code branch, decreasing the chances of discoverability and reproducibility. Survey results from CIG's Software Attribution for Geoscience Applications (SAGA) project suggest that a lack of knowledge, tools, and workflows for citing codes is a barrier to effectively implementing the emerging citation norms. Generated on-demand attributions on software landing pages and a prototype extensible plug-in to automatically generate attributions in codes are the first steps towards reproducibility.
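
    The gap identified here, citations lacking version numbers and persistent identifiers, is straightforward to close once software ships machine-readable metadata. Below is a minimal sketch that renders a citable software entry; all metadata values (name, version, DOI) are hypothetical examples.

```python
# Sketch: render a citable software entry from machine-readable metadata.
# The metadata values (name, authors, version, DOI) are hypothetical.
def software_bibtex(meta: dict) -> str:
    """Format a @software BibTeX entry including version and DOI."""
    return (
        f"@software{{{meta['key']},\n"
        f"  author  = {{{meta['authors']}}},\n"
        f"  title   = {{{meta['title']}}},\n"
        f"  version = {{{meta['version']}}},\n"
        f"  doi     = {{{meta['doi']}}},\n"
        f"  year    = {{{meta['year']}}}\n"
        f"}}"
    )

meta = {
    "key": "examplecode2015",
    "authors": "Doe, J. and Roe, R.",
    "title": "ExampleCode: a geodynamic modeling package",
    "version": "2.1.0",
    "doi": "10.5281/zenodo.0000000",
    "year": "2015",
}
print(software_bibtex(meta))
```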

  20. IMAGE information monitoring and applied graphics software environment. Volume 4. Applications description

    International Nuclear Information System (INIS)

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

    The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast prototyping' of advanced concepts for computer-aided plant operations tools. It is a flexible software system which can be used for rapidly creating, dynamically driving, and evaluating advanced operator aid displays. The software is written to be both host-computer and graphic-device independent. This four-volume report includes an Executive Overview of the IMAGE package (Volume I), followed by a Software Description (Volume II), a User's Guide (Volume III), and a Description of Example Applications (Volume IV)

  1. A proposed acceptance process for commercial off-the-shelf (COTS) software in reactor applications

    International Nuclear Information System (INIS)

    Preckshot, G.G.; Scott, J.A.

    1996-03-01

    This paper proposes a process for acceptance of commercial off-the-shelf (COTS) software products for use in reactor systems important to safety. An initial set of four criteria establishes COTS software product identification and its safety category. Based on safety category, three sets of additional criteria, graded in rigor, are applied to approve/disapprove the product. These criteria fall roughly into three areas: product assurance, verification of safety function and safety impact, and examination of usage experience of the COTS product in circumstances similar to the proposed application. A report addressing the testing of existing software is included as an appendix
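
    The graded structure of the proposed process can be sketched as a simple lookup: the safety category established by the initial screening selects which additional criteria set must be satisfied. In the sketch below, the category names and criteria are illustrative placeholders, not the paper's actual lists.

```python
# Schematic sketch of a graded COTS acceptance check. The categories and
# criteria named here are illustrative placeholders, not the actual lists
# from the proposed process.
CRITERIA_BY_CATEGORY = {
    "high": ["product assurance", "safety function verification",
             "safety impact analysis", "usage experience review"],
    "medium": ["product assurance", "usage experience review"],
    "low": ["product assurance"],
}

def accept_cots(safety_category: str, evidence: set) -> bool:
    """Approve the COTS product only if every criterion for its
    safety category is backed by evidence."""
    required = CRITERIA_BY_CATEGORY[safety_category]
    return all(criterion in evidence for criterion in required)

# Example: a medium-category product with partial evidence is rejected.
print(accept_cots("medium", {"product assurance"}))  # False
```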

  2. A computationally efficient software application for calculating vibration from underground railways

    International Nuclear Information System (INIS)

    Hussein, M F M; Hunt, H E M

    2009-01-01

    The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and on plans for future developments. The software calculates the power spectral density (PSD) of vibration due to a moving train on floating-slab track, with track irregularity described by typical spectra for tracks in good, average, and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox, developed at K.U. Leuven, which calculates Green's functions for a multi-layered half-space.
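
    The quantity the software reports, a power spectral density, can be illustrated with a generic numerical estimate: Welch's method applied to a synthetic vibration record. The signal below is an arbitrary stand-in, not PiP output.

```python
# Sketch: estimate the power spectral density of a synthetic vibration
# signal with Welch's method. The signal is an arbitrary stand-in and is
# unrelated to the PiP model's semi-analytical calculation.
import numpy as np
from scipy.signal import welch

fs = 200.0                       # sampling frequency [Hz]
t = np.arange(0, 60, 1 / fs)     # 60 s record
rng = np.random.default_rng(0)
# Tonal component (e.g., a sleeper-passing frequency) plus broadband noise.
signal = 1e-3 * np.sin(2 * np.pi * 25 * t) + 1e-4 * rng.standard_normal(t.size)

f, psd = welch(signal, fs=fs, nperseg=4096)
peak = f[np.argmax(psd)]
print(f"PSD peak at {peak:.1f} Hz")  # ~25 Hz
```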

  3. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data; to analyze the data automatically, statistically, and graphically; and to study and share the data. Methods: Based on previously obtained data, the analysis software was written using VC++.NET as the development tool. The software first transfers data from EXCEL into a database. It also supports data appending, so operators can easily add new monitoring data. Results: Once the monitoring data, saved as EXCEL files by the original researchers, are turned into a database, they can be accessed easily. The software provides a tool for analyzing the distribution of tritium. Conclusion: This software is a first attempt at analyzing data on tritium levels in food and environmental water in China. With it, data retrieval, searching, and analysis become easy and direct. (authors)
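
    The transfer-and-append workflow described above (implemented in the original software with VC++.NET) can be sketched in a few lines of Python with pandas and sqlite3; the file, table, and column names here are hypothetical.

```python
# Sketch of the Excel-to-database transfer with append support described
# above. The original software used VC++.NET; the file, table, and column
# names here are hypothetical.
import sqlite3
import pandas as pd

def append_monitoring_data(xlsx_path: str, db_path: str = "tritium.db") -> int:
    """Load a monitoring spreadsheet and append its rows to the database."""
    df = pd.read_excel(xlsx_path)  # e.g. columns: site, date, tritium_bq_per_l
    with sqlite3.connect(db_path) as conn:
        df.to_sql("tritium_monitoring", conn, if_exists="append", index=False)
    return len(df)

# Example usage: operators add a new batch of monitoring results.
# n = append_monitoring_data("monitoring_2006_q1.xlsx")
```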

  4. A Component-based Software Development and Execution Framework for CAx Applications

    Directory of Open Access Journals (Sweden)

    N. Matsuki

    2004-01-01

    Digitalization of the manufacturing process and technologies is regarded as the key to increased competitive ability. The MZ-Platform infrastructure is a component-based software development framework, designed to support enterprises in enhancing digitalized technologies using software tools and CAx components in a self-innovative way. In this paper we show the algorithm, the system architecture, and an example CAx application on MZ-Platform. We also propose a new parametric data structure based on MZ-Platform.

  5. Software selection based on analysis and forecasting methods, practised in 1C

    Science.gov (United States)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software in order to develop effective strategies for customer relationship management in terms of sales, as well as for the implementation and further maintenance of software. The research data allow the creation of new forecast models for scheduling further software distribution.

  6. Spacelab user implementation assessment study. (Software requirements analysis). Volume 2: Technical report

    Science.gov (United States)

    1976-01-01

    The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.

  7. Security Vulnerability Profiles of Mission Critical Software: Empirical Analysis of Security Related Bug Reports

    Science.gov (United States)

    Goseva-Popstojanova, Katerina; Tyo, Jacob

    2017-01-01

    While some prior research exists on the characteristics of software faults (i.e., bugs) and failures, very little work has been published on the analysis of software application vulnerabilities. This paper aims to contribute towards filling that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from the issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them into specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what the dominating vulnerability classes were. Our main findings include: (1) In the IVV issues datasets, the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed from 80% to 90% of the vulnerabilities in each dataset.
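
    A keyword-based classifier is one simple way to build the kind of vulnerability profile described above. In the sketch below, the class names follow the dominant classes reported in the paper, while the keyword lists and sample reports are invented for illustration.

```python
# Sketch: classify bug reports into vulnerability classes by keyword and
# profile the class distribution. Class names follow the dominant classes
# reported above; the keyword lists and sample reports are illustrative.
from collections import Counter

KEYWORDS = {
    "Exception Management": ["unhandled exception", "uncaught", "no catch"],
    "Memory Access": ["buffer overflow", "null pointer", "out of bounds"],
    "Risky Values": ["unvalidated input", "overflow", "precision loss"],
    "Unused Entities": ["unused variable", "dead code"],
}

def classify(report: str) -> str:
    text = report.lower()
    for vuln_class, words in KEYWORDS.items():
        if any(w in text for w in words):
            return vuln_class
    return "Other"

reports = [
    "Crash: unhandled exception in telemetry parser",
    "Buffer overflow when packet length exceeds 1024",
    "Unused variable in command dispatcher",
]
profile = Counter(classify(r) for r in reports)
print(profile.most_common())
```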

  8. A software application for energy flow simulation of a grid connected photovoltaic system

    International Nuclear Information System (INIS)

    Hamad, Ayman A.; Alsaad, Mohammad A.

    2010-01-01

    A computer software application was developed to simulate the hourly energy flow of a grid-connected photovoltaic system. This application enables an operational evaluation of the studied photovoltaic system in terms of energy exchange with the electrical grid. The system model consists of a photovoltaic array, a converter, and an optional generic energy storage component that supports scheduled charging/discharging. In addition to the system design parameters, the software uses hourly solar data and hourly load data to determine the amount of energy exchanged with the electrical grid for each hour of the simulated year. The resulting information is useful in assessing the impact of the system on the electrical energy demand of the building that uses it. The software also aggregates these hourly results into daily, monthly, and full-year sums. The software finds the financial benefit of the system as the difference in grid electrical energy cost between two simultaneously considered cases: one with the load supplied only by the electrical grid, and the other with the photovoltaic system present and contributing energy. The software supports the energy pricing scheme used in Jordan for domestic consumers, which is based on tiers of monthly consumption. By projecting the yearly financial results over the system lifetime, the application weighs the financial benefit of using the system against its cost, thus facilitating an economic evaluation.
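
    The hourly bookkeeping at the core of such a simulation is straightforward: net PV generation against load hour by hour, then price the month's grid imports by consumption tier. The sketch below uses invented profiles and a two-tier tariff for illustration; the rates are not Jordan's actual tariff.

```python
# Sketch of the hourly grid-exchange bookkeeping described above. The PV
# and load profiles and the two-tier tariff are invented illustrative
# numbers, not Jordan's actual tariff.
def monthly_grid_cost(pv_kwh, load_kwh, tier_limit=160.0,
                      low_rate=0.05, high_rate=0.12):
    """Net PV against load each hour; price the month's grid imports
    with a two-tier (sliced) tariff."""
    imported = 0.0
    exported = 0.0
    for pv, load in zip(pv_kwh, load_kwh):
        net = load - pv
        if net > 0:
            imported += net          # grid supplies the shortfall
        else:
            exported += -net         # surplus fed back to the grid
    low = min(imported, tier_limit)
    high = max(imported - tier_limit, 0.0)
    cost = low * low_rate + high * high_rate
    return imported, exported, cost

# One month of hourly data (730 h); flat profiles keep the example short.
pv = [0.4] * 730     # kWh generated each hour
load = [0.6] * 730   # kWh consumed each hour
print(monthly_grid_cost(pv, load))
```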

  9. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. Just as hardware systems are presently constructed from kits of parts, software systems may also be assembled from components. It is more reliable to reuse software than to create it anew. It is the glue code and the reliability of the individual components that contribute to the reliability of the overall system. Every component contributes to overall system reliability in proportion to the number of times it is used, known as the usage frequency of the component; some components are of critical usage. The usage frequency decides the weight of each component, and according to their weights, the components contribute to the overall reliability of the system. Therefore, a ranking of components may be obtained by analyzing their reliability impacts on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis (Fuzzy-MOORA). The method helps us find the most suitable alternative (software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision-making problems that have imprecise and vague evaluation data. Using ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and its dimensionless measurements rank the components for estimating CBSS reliability in a non-subjective way. Finally, three case studies illustrate the use of the proposed technique.
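
    The ratio-analysis core of MOORA is compact: each criterion column is normalized by its Euclidean norm, and each alternative is scored as the weighted sum of its benefit criteria minus the weighted sum of its cost criteria. The sketch below implements the crisp (non-fuzzy) variant on an invented component matrix; Fuzzy-MOORA additionally represents entries as triangular fuzzy numbers.

```python
# Sketch of crisp MOORA ranking for software components. The decision
# matrix, weights, and criterion types are invented for illustration;
# Fuzzy-MOORA additionally represents entries as triangular fuzzy numbers.
import numpy as np

# Rows: components C1..C4. Columns: reliability, usage frequency, cost.
X = np.array([
    [0.92, 120, 4.0],
    [0.85, 300, 2.5],
    [0.97,  60, 6.0],
    [0.88, 210, 3.0],
])
weights = np.array([0.5, 0.3, 0.2])
benefit = np.array([True, True, False])   # cost is a non-beneficial criterion

# Ratio analysis: normalize each column by its Euclidean norm.
X_norm = X / np.sqrt((X ** 2).sum(axis=0))

# Score = weighted benefit criteria minus weighted cost criteria.
signs = np.where(benefit, 1.0, -1.0)
scores = (X_norm * weights * signs).sum(axis=1)

ranking = np.argsort(-scores)             # best component first
print("scores:", np.round(scores, 4))
print("ranking (best first):", [f"C{i+1}" for i in ranking])
```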

  10. Empirical analysis of change metrics for software fault prediction

    NARCIS (Netherlands)

    Choudhary, Garvit Rajesh; Kumar, Sandeep; Kumar, Kuldeep; Mishra, Alok; Catal, Cagatay

    2018-01-01

    A quality assurance activity, known as software fault prediction, can reduce development costs and improve software quality. The objective of this study is to investigate change metrics in conjunction with code metrics to improve the performance of fault prediction models. Experimental studies are
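
    A typical experimental setup for this kind of study combines static code metrics and change metrics as features for a defect classifier. The sketch below uses scikit-learn with synthetic data as a stand-in for a real defect dataset.

```python
# Sketch of a fault-prediction experiment combining code metrics and
# change metrics. The data are synthetic stand-ins for a real defect
# dataset such as those used in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400
loc = rng.integers(20, 2000, n)          # code metric: lines of code
complexity = rng.integers(1, 40, n)      # code metric: cyclomatic complexity
revisions = rng.integers(1, 60, n)       # change metric: number of commits
churn = rng.integers(0, 5000, n)         # change metric: lines added+deleted

X = np.column_stack([loc, complexity, revisions, churn])
# Synthetic ground truth: heavily changed, complex modules are fault-prone.
y = ((churn > 2000) & (complexity > 15)).astype(int)

model = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
```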

  11. An Analysis of Open Source Security Software Products Downloads

    Science.gov (United States)

    Barta, Brian J.

    2014-01-01

    Despite the continued demand for open source security software, a gap in the identification of success factors related to the success of open source security software persists. There are no studies that accurately assess the extent of this persistent gap, particularly with respect to the strength of the relationships of open source software…

  12. Multi-criteria decision analysis methods and software

    CERN Document Server

    Ishizaka, Alessio

    2013-01-01

    This book presents an introduction to MCDA, followed by more detailed chapters about each of the leading methods used in this field. A comparison of methods and software is also featured, enabling readers to choose the method most appropriate for their research. Worked examples, as well as the software featured in the book, are available on an accompanying website.

  13. A pattern framework for software quality assessment and tradeoff analysis

    NARCIS (Netherlands)

    Folmer, Eelke; Bosch, Jan

    The earliest design decisions often have a significant impact on software quality and are the most costly to revoke. One of the challenges in architecture design is to reduce the frequency of retrofit problems in software designs; not being able to improve the quality of a system cost effectively, a

  14. Gamma-ray spectral analysis software designed for extreme ease of use or unattended operation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Romine, W.A.

    1993-07-01

    We are developing isotopic analysis software in the Safeguards Technology Program that advances usability in two complementary directions. The first direction is towards graphical user interfaces (GUIs) for very easy-to-use applications. The second is toward a minimal user interface, but with additional features for unattended or fully automatic applications. We are developing a GUI-based spectral viewing engine that is currently running in the MS-Windows environment. We intend to use this core application to provide the common user interface for our data analysis and, subsequently, data acquisition and instrument control applications. We are also investigating sets of cases where the MGA methodology produces reduced-accuracy results, incorrect error estimates, or incorrect results. We try to determine the root cause of each problem and extend or replace portions of the methodology so that MGA will function over a wider domain of analysis without requiring intervention and analysis by a spectroscopist. This effort is necessary for applications where such intervention is inconvenient or impractical

  15. Using Software Applications to Facilitate and Enhance Strategic Planning

    Science.gov (United States)

    1993-09-01

    …a new approach to doing business was invented (Ansoff 1984). Two examples of business applications of strategy are generic and competitive strategies (Mintzberg 1991). … (Ansoff 1984, p. 188). Keller (1990) states the purpose of the PPBS was "to produce a plan, a program, and, finally a budget for the Department of Defense."

  16. Applications of Cerius2, software of molecular simulation

    International Nuclear Information System (INIS)

    Fernandez G, M.E.; Perez A, M.; Gutierrez W, C.E.

    2007-01-01

    Most of the investigations have a theoretical foundation based on molecular simulation. The area of application of molecular simulation is very wide; in the Materials Technology Department, assigned to the Applied Sciences Management, problems concerning metallic nanostructures, glasses, interfaces, and molecules have been treated in order to support and explain some of the experimental results. Energy calculations are carried out to determine minimum-energy structures and, later on, to calculate some of their properties, as well as to simulate electron microscopy images and X-ray diffraction patterns. (Author)

  17. Software development for the RF measurement and analysis of RFQ accelerator

    International Nuclear Information System (INIS)

    Fu Shinian

    2002-01-01

    In a high-current RFQ accelerator, the beam losses and beam emittance growth must be tightly controlled. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components and, eventually, to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The author presents the code development on this platform for the RFQ measurement and analysis, including some applications of the codes
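
    One analysis such measurement software might perform is comparing the measured longitudinal field profile with the design profile and extracting low-order components of the error as tuning diagnostics. The sketch below does this with NumPy on synthetic data; the profiles and numbers are invented, and this is not the code described in the paper.

```python
# Sketch: compare a measured RFQ longitudinal field profile with the
# design profile and extract low-order cosine components of the error.
# The profiles here are synthetic stand-ins for bead-pull measurements.
import numpy as np

z = np.linspace(0.0, 1.0, 200)               # normalized position along RFQ
design = np.ones_like(z)                     # flat design field
measured = 1.0 + 0.03 * np.cos(np.pi * z)    # synthetic tilted field

error = (measured - design) / design         # relative field error
print(f"peak-to-peak field error: {np.ptp(error) * 100:.1f}%")

# Low-order cosine components of the error (rough tuning diagnostics).
for k in (1, 2, 3):
    a_k = 2.0 * np.trapz(error * np.cos(k * np.pi * z), z)
    print(f"cos({k} pi z) component: {a_k:+.4f}")
```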

  18. Software development for the RF measurement and analysis of RFQ accelerator

    CERN Document Server

    Fu Shinian

    2002-01-01

    In a high-current RFQ accelerator, the beam losses and beam emittance growth must be tightly controlled. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components and, eventually, to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The author presents the code development on this platform for the RFQ measurement and analysis, including some applications of the codes

  19. Software development for the RF measurement and analysis of RFQ accelerator

    International Nuclear Information System (INIS)

    Fu Shinian

    2002-01-01

    In a high-current RFQ accelerator, the beam losses and beam emittance growth must be tightly controlled. For this reason, it is necessary to accurately measure and correctly analyze the field distribution and mode components and, eventually, to tune the RF field to reach its design values. LabVIEW is a widely used software platform for automatic measurement and data processing. The authors present their code development on this platform for the RFQ measurement and analysis, including some applications of the codes

  20. Application software packages for library operations and services in ...

    African Journals Online (AJOL)

    Purposive sampling was used to select 218 subjects from five Federal Universities where library automation has begun. The analysis and discussion reveal that all the universities studied make use of KOHA, Virtua, E-LIB, Dspace, and Greenstone to manage their digital information resources.